[WIP] Update v3 #150
Conversation
Optimize and refactor mle/cli.py, mle/model.py, and mle/agents/coder.py

mle/cli.py:
- Move check_config function to the top of the file for better organization
- Use CONFIG_FILE constant in check_config for consistency
- Set SEARCH_API_KEY environment variable immediately after user input in the 'new' command
- Use a variable for the config file path in the 'new' command for better readability

mle/model.py:
- Add type hints for better code readability and maintainability
- Move dependency import logic to separate methods in each model class
- Simplify the load_model function
- Improve error messages for missing dependencies

mle/agents/coder.py:
- Add type hints for better code readability and maintainability
- Move system prompt generation to a separate method for better organization
- Simplify the process_summary function using f-strings
- Use more consistent naming for variables and methods
- Improve error handling and type checking

These changes maintain the current structure and functionality while improving code readability, maintainability, and slightly enhancing performance.
Looks good to me! @U-C4N Thank you for the contribution!

@U-C4N Hi, could you add a [MRG] or [WIP] tag to your title, so we can know whether this PR is ready to be reviewed? Thanks a lot for the contribution.
Hi, @U-C4N. Thank you very much for all your contributions! This pull request significantly improves our code quality. Before we merge it, please ensure the following two steps are completed:
Thanks again!
```python
import os

MODEL_OLLAMA = 'Ollama'

class Model(ABC):
    ...

class OllamaModel(Model):
    ...

class OpenAIModel(Model):
    ...

def load_model(project_dir: str, model_name: str) -> Optional[Model]:
    ...
```
Should work.

Is this PR now [MRG]?

No, leeizhang hasn't tested the latest model.py yet.
Hi, @U-C4N. Many thanks for your updates! Here are some suggestions for the current code:
Thanks again!
|
```python
from mle.function import get_function, process_function_name
from typing import List, Dict, Any, Optional
from mle.utils import process_function_name  # Adjust the import path as needed
```
you may remove the comments here :)
`from mle.utils import process_function_name` -> `from mle.function import process_function_name`
```python
from mle.function import get_function, process_function_name
from typing import List, Dict, Any, Optional
from mle.utils import process_function_name  # Adjust the import path as needed
import anthropic
```
It is suggested to import the `anthropic` module at runtime instead. You may refer to `OpenAIModel`, which loads the `openai` package by `importlib`:
https://github.com/MLSysOps/MLE-agent/blob/main/mle/model.py#L90
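Following that pointer, here is a minimal sketch of loading `anthropic` lazily via `importlib`. The `AnthropicModel` body below is illustrative only (the default model string and attribute names are assumptions, not the PR's exact code):

```python
import importlib
import importlib.util


class AnthropicModel:
    """Claude-backed model that loads the `anthropic` package at runtime."""

    def __init__(self, api_key: str, model: str = 'claude-3-5-sonnet-20240620'):
        # Keep anthropic an optional dependency: probe for it first, then
        # import it dynamically, mirroring how OpenAIModel loads openai.
        if importlib.util.find_spec("anthropic") is None:
            raise ImportError(
                "anthropic is required for AnthropicModel. "
                "Install it with: pip install anthropic"
            )
        anthropic = importlib.import_module("anthropic")
        self.model = model
        self.client = anthropic.Anthropic(api_key=api_key)
```

This way, users who only run OpenAI or Ollama never need the `anthropic` package installed; a top-level `import anthropic` would instead fail at import time for everyone.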
```diff
@@ -155,18 +112,39 @@ def stream(self, chat_history, **kwargs):
         else:
             yield delta.content


 class AnthropicModel(Model):
```
You may also add this model into `cli.py`:
https://github.com/MLSysOps/MLE-agent/blob/main/mle/cli.py#L113
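Registering the new platform in `cli.py` could look roughly like this. This is a hypothetical sketch: `SUPPORTED_PLATFORMS` and `build_platform_config` are illustrative names, not functions from the repo.

```python
# Hypothetical sketch of the platform-selection step in `mle new`.
SUPPORTED_PLATFORMS = ['OpenAI', 'Ollama', 'Anthropic']


def build_platform_config(platform: str, api_key: str = '') -> dict:
    """Build the project.yml entry for the chosen platform."""
    if platform not in SUPPORTED_PLATFORMS:
        raise ValueError(f"Unsupported platform: {platform}")
    config = {'platform': platform}
    if platform in ('OpenAI', 'Anthropic'):
        # Both hosted platforms need an API key; Ollama runs locally.
        config['api_key'] = api_key
    return config
```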
```python
with open(os.path.join(project_dir, 'project.yml'), 'r') as file:
    data = yaml.safe_load(file)
    if data['platform'] == MODEL_OPENAI:
        return OpenAIModel(api_key=data['api_key'], model=model_name)
    if data['platform'] == MODEL_OLLAMA:
        return OllamaModel(model=model_name)
    if data['platform'] == 'Anthropic':
```
```python
MODEL_CLAUDE = 'Anthropic'

if data['platform'] == MODEL_CLAUDE:
    ...
```
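Folding that constant into `load_model`, the dispatch would read roughly as below. The model classes are stubbed here only to keep the sketch self-contained and runnable; the constant-based dispatch is the point.

```python
import os
from typing import Optional

import yaml  # PyYAML, already used for reading project.yml

MODEL_OPENAI = 'OpenAI'
MODEL_OLLAMA = 'Ollama'
MODEL_CLAUDE = 'Anthropic'  # named constant instead of the bare 'Anthropic' string


class Model:
    """Stub standing in for mle.model.Model."""


class OpenAIModel(Model):
    def __init__(self, api_key: str, model: str):
        self.model = model


class OllamaModel(Model):
    def __init__(self, model: str):
        self.model = model


class AnthropicModel(Model):
    def __init__(self, api_key: str, model: str):
        self.model = model


def load_model(project_dir: str, model_name: str) -> Optional[Model]:
    """Instantiate the configured model; return None for unknown platforms."""
    with open(os.path.join(project_dir, 'project.yml'), 'r') as file:
        data = yaml.safe_load(file)
    if data['platform'] == MODEL_OPENAI:
        return OpenAIModel(api_key=data['api_key'], model=model_name)
    if data['platform'] == MODEL_OLLAMA:
        return OllamaModel(model=model_name)
    if data['platform'] == MODEL_CLAUDE:
        return AnthropicModel(api_key=data['api_key'], model=model_name)
    return None
```

Using one constant per platform keeps the string literal in a single place, so a typo in `'Anthropic'` becomes a `NameError` at import time instead of a silently unmatched branch.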
```python
        model (str): The model with version.
        temperature (float): The temperature value.
    """
    def __init__(self, api_key: str, model: str = 'gpt-3.5-turbo', temperature: float = 0.7):
```
```
Traceback (most recent call last):
  File "/root/miniconda3/envs/mle/bin/mle", line 33, in <module>
    sys.exit(load_entry_point('mle-agent', 'console_scripts', 'mle')())
  File "/root/miniconda3/envs/mle/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/root/miniconda3/envs/mle/lib/python3.10/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/root/miniconda3/envs/mle/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/root/miniconda3/envs/mle/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/root/miniconda3/envs/mle/lib/python3.10/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/root/workspace/MLE-agent/mle/cli.py", line 68, in start
    return baseline(os.getcwd(), model)
  File "/root/workspace/MLE-agent/mle/workflow/baseline.py", line 67, in baseline
    advisor_report = advisor.interact("[green]User Requirement:[/green] " + ml_requirement + "\n" + ask_data(dataset))
  File "/root/workspace/MLE-agent/mle/agents/advisor.py", line 131, in interact
    self.report = self.suggest(requirement)
  File "/root/workspace/MLE-agent/mle/agents/advisor.py", line 113, in suggest
    text = self.model.query(
  File "/root/workspace/MLE-agent/mle/model.py", line 85, in query
    result = get_function(function_name)(**arguments)
NameError: name 'get_function' is not defined
```
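The `NameError` suggests `mle/model.py` calls `get_function` without importing it after the import reshuffle in this PR; importing it from `mle.function` would resolve it. A simplified, runnable illustration of the dispatch that fails at `model.py` line 85 (the registry and the `greet` tool are stand-ins, not repo code):

```python
import json


def get_function(name):
    """Stand-in for mle.function.get_function: map a tool name to a callable."""
    registry = {'greet': lambda who: f"hello, {who}"}
    return registry[name]


def query(function_name: str, arguments_json: str):
    # Mirrors mle/model.py line 85: resolve the tool by name, then call it
    # with the JSON-decoded arguments. If get_function is not imported into
    # the module namespace, this line raises the NameError seen above.
    arguments = json.loads(arguments_json)
    return get_function(function_name)(**arguments)
```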
Closes #
What has been done to verify that this works as intended?
Why is this the best possible solution? Were any other approaches considered?
How does this change affect users? Describe intentional changes to behavior and behavior that could have accidentally been affected by code changes. In other words, what are the regression risks?
Do we need any specific form for testing your changes? If so, please attach one.
Does this change require updates to documentation? If so, please file an issue here and include the link below.
Before submitting this PR, please make sure you have updated the credit file.