Merge pull request #591 from EpistasisLab/AliroGPT
Aliro GPT:
Merging this PR. Unit tests for scikit-learn are failing because SVDRecommender is missing from the AliroGPT branch; it is present in the master branch, so the tests will run there.
jay-m-dev authored Aug 1, 2023
2 parents 3514df8 + da9857d commit a1dfa57
Showing 158 changed files with 19,197 additions and 3,635 deletions.
2 changes: 1 addition & 1 deletion .env
@@ -7,5 +7,5 @@
# Leave this set to 0 on the GitHub repo so the unit and
# integration tests do not need to have wheels (until we
# find a convenient way to use wheels on GitHub)
TAG=1.0a0
TAG=1.0.a0
USE_WHEELS=1
3 changes: 2 additions & 1 deletion .gitignore
@@ -65,4 +65,5 @@ MANIFEST
*.mp4
package-lock.json
package.json
package-copy.json
package-copy.json
machine/code_runs/
36 changes: 36 additions & 0 deletions chatExamples.md
@@ -0,0 +1,36 @@
## Examples of Chat for the Iris_outlier Dataset

The following table shows some example chat prompts for the iris_outlier dataset. The dataset is located at 'pmlb_small/dataset/data/AliroGPT'. A sketch of the kind of code prompt 1 could produce follows the table.

| | Chat examples for the iris dataset (Classification) |
| --- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| 1 | "Remove any outliers present in each column, except for the column named 'class', using the z-score method. Then, please fill in any missing values within the columns, except for the column named 'class', by using the average method. Finally, save the modified dataset as a new file named 'new_df.tsv'. Please refrain from using the 'scipy' library to import 'stats'." |
| 2 | "Generate a scatter plot with sepal width on the x-axis and petal length on the y-axis. Use color differentiation to show the class column and save the resulting plot as a PNG image file." |
| 3 | "Create a 3D scatter plot with sepal width, petal length, and petal width values assigned to the x, y, and z axes, respectively. Use color to differentiate data points based on the "class" column. Please make sure to save the plot as a PNG file." |
| 4 | "Attempt to read the file 'nonexist.csv' and handle any resulting errors appropriately." |
| 5 | "Please show me box plots for each column in the df dataset, grouped by the "class" variable, excluding the class column. Additionally, please include only the column name on top of each figure without any other text information." |
| 6 | "Please drop the outliers in each column using the z-score method and save the modified dataset as a new file named 'new_df.tsv'." |

---

## Examples of Chat for the Breast Dataset

The following table shows some example chat prompts for the breast dataset. [Click here](https://github.com/EpistasisLab/pmlb/tree/master/datasets/breast) to access the dataset on GitHub. A sketch of the kind of code prompt 4 could produce follows the table.

| | Chat examples for the breast dataset (Classification) |
| --- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| 1 | "Please use seaborn's kdeplot to show a distribution for each column in the dataframe, excluding the target column, with respect to the target column. Save each distribution plot as a PNG file." |
| 2 | "Please show me a 3D PCA plot for the df dataset with respect to the target column, and save it as a PNG file. Please note that StandardScaler should not be used." |
| 3 | "Please show me box plots for each column in the df dataset, grouped by the target variable, excluding the target column. Additionally, please include only the column name on top of each figure without any other text information." |
| 4 | "Please show me the correlation matrix between each column (excluding the target column) and save it as a PNG. Please use shortened column names, derived from the full column names, to label the axes on the correlation matrix figure." |

## Examples of Chat for the 1193_BNG_lowbwt_small.tsv Dataset

The following table shows some example chat prompts for the 1193_BNG_lowbwt_small dataset. The dataset is located at 'pmlb_small/dataset/data/AliroGPT'. A sketch of the kind of code prompt 3 could produce follows the table.

| | Chat examples for the 1193_BNG_lowbwt_small (Regression) |
| --- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| 1 | "Please show me boxplot for the inv-nodes column and save it as a png file." |
| 2 | "Please drop the rows in the DataFrame which contain outliers in the 'inv-nodes' column, and generate a box plot for the 'inv-nodes' column. The box plot image will be saved as 'inv-nodes-boxplot.png'." |
| 3 | "Please apply PCA to the 'df' dataframe and generate a 3D plot based on the 'target' output. Each scatter point in the 3D plot should be colored according to the relevant 'target' value. After generating the plot, save it as a PNG image and include a color bar. Please let me know if you have any questions." |
| 4 | "Please show me the correlation matrix between each column (excluding the target column) and save it as a PNG. Please use shortened column names, derived from the full column names, to label the axes on the correlation matrix figure." |
60 changes: 60 additions & 0 deletions chatapi.rest
@@ -0,0 +1,60 @@
POST http://localhost:5080/chatapi/v1/chats
Content-Type: application/json

{
"title" : "Chat with experiment id 2",
"_experiment_id": "63f6e4987c5f93004a3e3ca8",
"_dataset_id": "63f6e4947c5f93004a3e3ca7"
}

###

GET http://localhost:5080/chatapi/v1/chats

###

GET http://localhost:5080/chatapi/v1/chats/6411113943d572a8f55e5208

###

PATCH http://localhost:5080/chatapi/v1/chats/640bd7290674aa751483658b
Content-Type: application/json

{
"title" : "Chat with experiment id",
"_experiment_id": "63f6e4987c5f93004a3e3ca8",
"_dataset_id": "63f6e4947c5f93004a3e3ca7"
}

###

DELETE http://localhost:5080/chatapi/v1/chats/640bb89cf6a279429cf4ad7c

###
POST http://localhost:5080/chatapi/v1/chatlogs
Content-Type: application/json

{
"_chat_id" : "6411113943d572a8f55e5208",
"message" : "Hello there from my desk!",
"message_type" : "text",
"who" : "user"
}

###

PATCH http://localhost:5080/chatapi/v1/chatlogs/63f6e4947c5f93004a3e3ca7
Content-Type: application/json

{
"message" : "Hello from cyberspace!",
"message_type" : "text",
"who" : "openai"
}

###
GET http://localhost:5080/chatapi/v1/chats/experiment/63f6e4987c5f93004a3e3ca8

###

GET http://localhost:5080/chatapi/v1/chats/dataset/63f6e4947c5f93004a3e3ca7
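
The requests in chatapi.rest can also be exercised programmatically. A small sketch using Python's requests library, reusing the same endpoints and placeholder IDs from the file (not live identifiers), might look like this:

```python
import requests

BASE = "http://localhost:5080/chatapi/v1"

# Create a chat tied to an experiment and dataset; the IDs are the placeholder
# values from chatapi.rest, not live identifiers.
response = requests.post(f"{BASE}/chats", json={
    "title": "Chat with experiment id 2",
    "_experiment_id": "63f6e4987c5f93004a3e3ca8",
    "_dataset_id": "63f6e4947c5f93004a3e3ca7",
})
print(response.status_code, response.text)

# Append a user message to an existing chat (id also taken from chatapi.rest).
requests.post(f"{BASE}/chatlogs", json={
    "_chat_id": "6411113943d572a8f55e5208",
    "message": "Hello there from my desk!",
    "message_type": "text",
    "who": "user",
})

# List the chats linked to a given experiment.
print(requests.get(f"{BASE}/chats/experiment/63f6e4987c5f93004a3e3ca8").json())
```
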
4 changes: 4 additions & 0 deletions config/common.env
@@ -23,4 +23,8 @@ DT_MAX_DEPTH=6
DOCKER_CLIENT_TIMEOUT=120
COMPOSE_HTTP_TIMEOUT=120

OPENAI_API_KEY=your_openai_api_key
OPENAI_ORG_ID=Personal

STARTUP_DATASET_PATH=/appsrc/data/datasets/user
CODE_RUN_PATH=/appsrc/machine/code_runs
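
These new entries are plain environment settings. As an illustration only (not Aliro's actual startup code), a Python service might pick them up like this; the fallback values simply repeat the defaults shown above in common.env.

```python
import os

# Illustration only: reading the settings added to config/common.env.
openai_api_key = os.environ["OPENAI_API_KEY"]      # required for OpenAI calls
openai_org_id = os.environ.get("OPENAI_ORG_ID")    # optional organization id

startup_dataset_path = os.environ.get(
    "STARTUP_DATASET_PATH", "/appsrc/data/datasets/user")
code_run_path = os.environ.get(
    "CODE_RUN_PATH", "/appsrc/machine/code_runs")

# Generated code runs need somewhere to be written
# (the same path .gitignore now excludes as machine/code_runs/).
os.makedirs(code_run_path, exist_ok=True)
```
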
151 changes: 151 additions & 0 deletions data/datasets/pmlb_small/iris/iris_outlier.csv
@@ -0,0 +1,151 @@
sepal-length,sepal-width,petal-length,petal-width,class
6.7,3.0,5.2,2.3,2
6.0,2.2,5.0,1.5,2
6.2,2.8,4.8,1.8,2
7.7,3.8,6.7,2.2,2
7.2,3.0,5.8,1.6,2
5.5,2.4,3.8,1.1,1
6.0,2.7,5.1,1.6,1
5.5,2.5,4.0,1.3,1
5.6,2.9,3.6,1.3,1
5.7,2.9,4.2,1.3,1
5.0,3.2,1.2,0.2,0
4.9,3.1,1.5,0.1,0
5.3,3.7,1.5,0.2,0
4.8,3.1,1.6,0.2,0
5.0,3.3,1.4,0.2,0
6.3,3.4,5.6,2.4,2
7.1,3.0,5.9,2.1,2
6.3,2.8,5.1,1.5,2
6.3,2.9,5.6,1.8,2
5.8,2.7,5.1,1.9,2
5.2,2.7,3.9,1.4,1
5.6,3.0,4.1,1.3,1
6.9,3.1,4.9,1.5,1
6.2,2.9,4.3,1.3,1
6.5,2.8,4.6,1.5,1
5.0,3.0,1.6,0.2,0
5.5,3.5,1.3,0.2,0
5.1,3.5,1.4,0.3,0
5.7,3.8,1.7,0.3,0
5.5,4.2,1.4,0.2,0
6.7,3.1,5.6,2.4,2
5.8,2.8,5.1,2.4,2
6.4,3.1,5.5,1.8,2
7.9,3.8,6.4,2.0,2
6.8,3.0,5.5,2.1,2
6.0,3.4,4.5,1.6,1
6.7,3.1,4.7,1.5,1
5.7,2.8,4.1,1.3,1
6.7,3.1,4.4,1.4,1
5.9,3.0,4.2,1.5,1
5.1,3.8,1.9,0.4,0
4.9,3.1,1.5,0.1,0
5.4,3.9,1.3,0.4,0
5.1,3.5,1.4,0.2,0
4.8,3.4,1.9,0.2,0
6.3,3.3,6.0,2.5,2
6.7,3.3,5.7,2.5,2
6.3,2.7,4.9,1.8,2
6.9,3.2,5.7,2.3,2
4.9,2.5,4.5,1.7,2
7.0,3.2,4.7,1.4,1
6.6,2.9,4.6,1.3,1
6.4,2.9,4.3,1.3,1
6.3,2.5,4.9,1.5,1
5.7,2.6,3.5,1.0,1
5.4,3.4,1.5,0.4,0
5.0,3.5,1.3,0.3,0
4.5,2.3,1.3,0.3,0
5.1,3.8,1.5,0.3,0
4.4,3.0,1.3,0.2,0
6.9,3.1,5.1,2.3,2
7.3,2.9,6.3,1.8,2
6.1,2.6,5.6,1.4,2
7.4,2.8,6.1,1.9,2
7.2,3.6,6.1,2.5,2
5.9,3.2,4.8,1.8,1
6.1,2.8,4.7,1.2,1
6.3,3.3,4.7,1.6,1
5.8,2.6,4.0,1.2,1
6.0,2.2,4.0,1.0,1
4.6,3.6,1.0,0.2,0
4.7,3.2,1.6,0.2,0
5.1,3.8,1.6,0.2,0
4.4,3.2,1.3,0.2,0
4.8,3.0,1.4,0.3,0
5.6,2.8,4.9,2.0,2
7.6,3.0,6.6,2.1,2
6.5,3.0,5.5,1.8,2
5.9,3.0,5.1,1.8,2
6.3,2.5,5.0,1.9,2
6.1,2.8,4.0,1.3,1
6.6,3.0,4.4,1.4,1
5.8,2.7,4.1,1.0,1
5.0,2.0,3.5,1.0,1
5.5,2.6,4.4,1.2,1
4.6,3.4,1.4,0.3,0
5.4,3.4,1.7,0.2,0
4.7,3.2,1.3,0.2,0
5.2,4.1,1.5,0.1,0
5.0,3.4,1.5,0.2,0
6.5,3.2,5.1,2.0,2
6.8,3.2,5.9,2.3,2
5.7,2.5,5.0,2.0,2
6.4,2.8,5.6,2.2,2
7.7,150.0,6.9,2.3,2
6.3,2.3,4.4,1.3,1
6.1,3.0,4.6,1.4,1
5.8,2.7,3.9,1.2,1
6.2,2.2,4.5,1.5,1
5.0,2.3,3.3,1.0,1
5.2,3.4,1.4,0.2,0
5.2,3.5,1.5,0.2,0
4.9,3.1,1.5,0.1,0
5.4,3.7,1.5,0.2,0
4.4,2.9,1.4,0.2,0
6.4,3.2,5.3,2.3,2
6.4,2.8,5.6,2.1,2
6.5,3.0,5.8,2.2,2
6.1,3.0,4.9,1.8,2
6.7,2.5,5.8,1.8,2
5.7,2.8,4.5,1.3,1
5.6,2.7,4.2,1.3,1
6.0,2.9,4.5,1.5,1
5.5,2.3,4.0,1.3,1
5.6,3.0,4.5,1.5,1
5.1,3.3,1.7,0.5,0
5.8,4.0,1.2,0.2,0
4.6,3.1,1.5,0.2,0
5.7,4.4,1.5,0.4,0
4.9,3.0,1.4,0.2,0
6.7,3.3,5.7,2.1,2
6.4,2.7,5.3,1.9,2
5.8,2.7,5.1,1.9,2
6.2,3.4,5.4,2.3,2
6.0,3.0,4.8,1.8,2
5.6,2.5,3.9,1.1,1
4.9,2.4,3.3,1.0,1
5.1,2.5,3.0,1.1,1
6.4,3.2,4.5,1.5,1
5.7,3.0,4.2,1.2,1
4.3,3.0,1.1,0.1,0
5.1,3.7,1.5,0.4,0
4.8,3.4,1.6,0.2,0
4.6,3.2,1.4,0.2,0
4.8,3.0,1.4,0.1,0
6.9,3.1,5.4,2.1,2
7.2,3.2,6.0,1.8,2
7.7,3.0,6.1,2.3,2
7.7,2.8,6.7,2.0,2
6.5,3.0,5.2,2.0,2
5.5,2.4,3.7,1.0,1
5.4,3.0,4.5,1.5,1
6.8,2.8,4.8,1.4,1
6.1,2.9,4.7,1.4,1
6.7,3.0,5.0,1.7,1
5.0,3.5,1.6,0.6,0
5.4,3.9,1.7,0.4,0
5.1,3.4,1.5,0.2,0
5.0,3.6,1.4,0.2,0
5.0,3.4,1.6,0.4,0

This file was deleted.

This file was deleted.

7 changes: 5 additions & 2 deletions docker-compose-doc-builder.yml
@@ -17,7 +17,10 @@ services:

doc_api_builder:
image: "node:18.13.0-slim"
command: bash -c "npm i -g raml2html && raml2html /appsrc/lab/api.raml >
/appsrc/target/ai_docs/html/lab_api_source.html"
command: bash -c "npm i -g raml2html &&
raml2html /appsrc/lab/api.raml > /appsrc/target/ai_docs/html/lab_api_source.html &&
raml2html /appsrc/docs/APIs/openai.raml > /appsrc/target/ai_docs/html/openai_source.html &&
raml2html /appsrc/docs/APIs/chatapi.raml > /appsrc/target/ai_docs/html/chatapi_source.html &&
raml2html /appsrc/docs/APIs/execapi.raml > /appsrc/target/ai_docs/html/execapi_source.html"
volumes:
- "./:/appsrc"
2 changes: 2 additions & 0 deletions docker-compose-int-test.yml
@@ -22,6 +22,7 @@ services:
build:
context: .
dockerfile: docker/lab/Dockerfile
target: dev
args:
# - USE_WHEELS=${USE_WHEELS}
- USE_WHEELS=0
@@ -47,6 +48,7 @@
build:
context: .
dockerfile: docker/machine/Dockerfile
target: dev
args:
# - USE_WHEELS=${USE_WHEELS}
- USE_WHEELS=0
21 changes: 21 additions & 0 deletions docker-compose-multi-machine.yml
@@ -5,6 +5,9 @@ services:
build:
context: .
dockerfile: docker/lab/Dockerfile
target: ${DOCKER_BUILD_ENV}
args:
- USE_WHEELS=${USE_WHEELS}
tty: true
stdin_open: true
volumes:
@@ -24,6 +27,9 @@
build:
context: .
dockerfile: docker/machine/Dockerfile
target: ${DOCKER_BUILD_ENV}
args:
- USE_WHEELS=${USE_WHEELS}
tty: true
stdin_open: true
volumes:
@@ -43,6 +49,9 @@
build:
context: .
dockerfile: docker/machine/Dockerfile
target: ${DOCKER_BUILD_ENV}
args:
- USE_WHEELS=${USE_WHEELS}
tty: true
stdin_open: true
volumes:
@@ -62,6 +71,9 @@
build:
context: .
dockerfile: docker/machine/Dockerfile
target: ${DOCKER_BUILD_ENV}
args:
- USE_WHEELS=${USE_WHEELS}
tty: true
stdin_open: true
volumes:
@@ -81,6 +93,9 @@
build:
context: .
dockerfile: docker/machine/Dockerfile
target: ${DOCKER_BUILD_ENV}
args:
- USE_WHEELS=${USE_WHEELS}
tty: true
stdin_open: true
volumes:
@@ -99,6 +114,9 @@
build:
context: .
dockerfile: docker/machine/Dockerfile
target: ${DOCKER_BUILD_ENV}
args:
- USE_WHEELS=${USE_WHEELS}
tty: true
stdin_open: true
volumes:
@@ -118,6 +136,9 @@
build:
context: .
dockerfile: docker/machine/Dockerfile
target: ${DOCKER_BUILD_ENV}
args:
- USE_WHEELS=${USE_WHEELS}
tty: true
stdin_open: true
volumes:
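
The new `target: ${DOCKER_BUILD_ENV}` and `USE_WHEELS` build arguments expect values from the environment or an .env file. A hypothetical .env fragment consistent with this diff is sketched below; docker-compose-int-test.yml pins `target: dev`, which suggests the Dockerfiles expose a `dev` build stage, but any other stage names are not shown in this diff.

```ini
# Hypothetical .env values for the new build settings (illustration only).
DOCKER_BUILD_ENV=dev   # multi-stage build target; docker-compose-int-test.yml pins "dev" explicitly
USE_WHEELS=0           # matches the hard-coded value in docker-compose-int-test.yml
```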