FileNotFoundError #635
Comments
same, can't figure it out
It's due to this.
same error
I have the same issue.
Same!
Is there any alternative? What changes need to be made in the source code (interpreter/code_interpreters/languages/python.py)?
I have found a solution. Change the above code in \code_interpreters\languages\python.py as follows:
Update: resolution PR: #643. It's due to quoting. By replacing this line in code_interpreters/languages/python.py: With: the issue is resolved on Windows systems. Error reproduced output:
Output after implementing above solution:
Regression test of changes on linux:
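The exact before/after lines are not quoted in this thread, but the merged PR description below mentions "Put quotes around sys.executable (bug fix)". A minimal sketch of that kind of quoting fix, assuming the start command is later split into arguments:

```python
import sys

# Before: breaks on Windows when the Python path contains spaces
# (e.g. "C:\Program Files\Python311\python.exe"), because the later
# .split() cuts the path into pieces.
start_cmd = sys.executable + " -i -q -u"

# After: quoting keeps the interpreter path intact as a single token.
start_cmd = '"' + sys.executable + '"' + " -i -q -u"
```

The flags shown here are illustrative; the point is only that an unquoted `sys.executable` cannot survive naive whitespace splitting.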
Resolves issue: OpenInterpreter#635
I referenced this in a couple of other bug reports that appear to be related (same error dump). This fix appears to work for my use case as well.
* Fixed a bug in setup_text_llm
* chore: update test suite. This adds a system_message prepending test, a .reset() test, and makes the math testing a little more robust, while also trying to prevent some edge cases where the LLM would respond with explanations or an affirmative "Sure I can do that. Here's the result..." instead of just the expected result.
* New documentation site: https://docs.openinterpreter.com/
* feat: add %tokens magic command that counts tokens via tiktoken
* feat: add estimated cost from litellm to token counter
* fix: add note about only including current messages
* chore: add %tokens to README
* fix: include generated code in token count; round to 6 decimals
* Put quotes around sys.executable (bug fix)
* Added PowerShell language
* Adding Mistral support
* Removed /archive, adding Mistral support
* First version of ooba-powered setup_local_text_llm
* Second version of ooba-powered setup_local_text_llm
* Testing tests
* More flexible tests
* Paused math test (let's look into this soon; failing a lot)
* Improved tests
* feat: add support for loading different config.yaml files. This adds a --config_file option that allows users to specify a path to a config file, or the name of a config file in their Open Interpreter config directory, and use that config file when invoking interpreter. It also adds similar functionality to the --config parameter, allowing users to open and edit different config files. To simplify finding and loading files, it also adds a utility to return the path to a directory in the Open Interpreter config directory, and moves some other code from manually constructed paths to the same utility method for consistency and simplicity.
* feat: add optional prompt token/cost estimate to %tokens. This adds an optional argument that will estimate the tokens and cost of any provided prompt, allowing users to consider the implications of what they are going to send before it has an impact on their token usage.
* Paused math test
* Switched tests to turbo
* More Ooba
* Using Eric's tests
* The Local Update
* Alignment
* Fixed shell blocks not ending on error bug
* Added useful flags to generator
* Fixed Mistral HTML entities + backticks problem
* OpenAI messages -> text LLMs are now non-function-calling
* Better messaging
* Incremented version, updated litellm
* Skipping nested test
* Exposed Procedures
* Exposed get_relevant_procedures_string
* Better procedures exposure
* Exits properly in colab
* More powerful reset function, incremented version
* WELCOME HACKERS! The Open Interpreter Hackathon is on.
* Fix typo in setup_text_llm.py: recieve -> receive
* The OI hackathon has wrapped! Thank you everyone!
* The Open Interpreter Hackathon has been extended!
* Join the hackathon! https://lablab.ai/event/open-interpreter-hackathon
* Thank you hackathon participants!
* Fix "depracated" typo
* Update python.py. Resolves issue: OpenInterpreter#635
* Update python.py. More robust handling.
* Fix indentation in language_map.py
* Made semgrep optional, updated packages, pinned LiteLLM
* Fixed end_of_message and end_of_code flags
* Add container timeout for easier server integration of OI, controllable via env var 'OI_CONTAINER_TIMEOUT'; defaults to no timeout. Also adds type safety to core/core.py
* Update things, resolve merge conflicts
* Fixed the tests, since they imported and assumed it was an instance, but it wasn't; now uses interpreter.create_interpreter()

Co-authored-by: Kyle Huang <kyle@anyscale.com>
Co-authored-by: Eric allen <eric@ericrallen.dev>
Co-authored-by: killian <63927363+KillianLucas@users.noreply.github.com>
Co-authored-by: DaveChini <Dave@ctc.com>
Co-authored-by: Ikko Eltociear Ashimine <eltociear@gmail.com>
Co-authored-by: Jamie Dubs <jamie@jamiedubs.com>
Co-authored-by: Leif Taylor <leiftaylor@gmail.com>
Co-authored-by: chenpeng08 <chenpeng08@baidu.com>
Late to the party, but I'm running into the same issue and the solution above did not work :( Here are my details:
Python version: 3.11.5
OI version:
Lines 24-28 of C:\Users\Pieter\AppData\Roaming\Python\Python311\site-packages\interpreter\core\code_interpreters\languages\python.py:
My setup: Open Interpreter connected to LM Studio v0.2.9 using model config:
GPU offload: 25 layers
@pieterhouwen Can you share the error message you're getting? I need a bit more error output/stacktrace in order to diagnose the root cause.
@leiftaylorcb Here is my complete stacktrace:
Is running interpreter with Python 3.10 as easy as moving the python3.10 executable path to the top of the PATH variable?
We'll need to figure out the start_cmd that is being passed in. Can you modify interpreter/core/code_interpreters/subprocess_code_interpreter.py line 48 to add a print statement? Here:
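The exact snippet the commenter pasted is not preserved in this thread. A simplified sketch of the idea, with a stand-in class whose names are illustrative rather than OI's actual implementation:

```python
import subprocess
import sys

class SubprocessCodeInterpreter:
    """Simplified stand-in for OI's subprocess interpreter (hypothetical)."""

    def __init__(self, start_cmd):
        self.start_cmd = start_cmd
        self.process = None

    def start_process(self):
        # The suggested debug line: reveal exactly what OI tries to execute.
        print("start_cmd:", self.start_cmd)
        self.process = subprocess.Popen(
            self.start_cmd.split(),
            stdin=subprocess.PIPE,
            stdout=subprocess.PIPE,
            stderr=subprocess.PIPE,
            text=True,
        )

# Example run: launch the current Python and print a value.
interp = SubprocessCodeInterpreter(sys.executable + " -c print(40+2)")
interp.start_process()
out, _ = interp.process.communicate()
print(out.strip())
```

Seeing the printed start_cmd makes it obvious whether a wrong or missing executable (or an unquoted path) is what CreateProcess fails to find.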
Thank you for your response. Using the print statement you provided, I saw the command OI is trying to run. Is this something that OI should pick up on, rather than blaming Python?
The issue appears to be coming from interpreter/core/respond.py. In respond.py, we determine which interpreter to use (e.g. Python, JavaScript, or HTML) from the interpreter.messages[-1]["language"] key. For whatever reason, when it's trying to give you a response, it seems to think that the language you want is JavaScript, so it's instantiating the JavaScript SubprocessCodeInterpreter (core/code_interpreters/languages/javascript.py, Javascript()). @pieterhouwen Are you getting this error after you make your first prompt, or is this error happening the moment you start open-interpreter? What question are you asking it? One thing that could potentially get you further is to install Node.js, e.g. node 20.10.x.
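The dispatch described above can be sketched roughly as follows. This is a hypothetical simplification, not OI's actual code in respond.py/language_map.py; the key idea is only that the last message's "language" field selects the interpreter class:

```python
# Hypothetical language -> interpreter mapping; real OI uses classes
# like Python() and Javascript() from code_interpreters/languages/.
language_map = {
    "python": "Python()",
    "javascript": "Javascript()",
    "html": "Html()",
}

def pick_interpreter(messages):
    """Select an interpreter name from the last message's language key."""
    language = messages[-1]["language"].lower()
    if language not in language_map:
        raise ValueError(f"No interpreter registered for {language!r}")
    return language_map[language]

# If the model labels its code block "javascript", the JavaScript
# subprocess interpreter is what gets launched -- and it fails with
# FileNotFoundError when node is not installed.
print(pick_interpreter([{"language": "javascript", "code": "console.log(1)"}]))
```

So the "Python" error message can be misleading: the executable that cannot be found may belong to a different language runtime entirely.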
@leiftaylorcb Thanks for the explanation! Yes, I am indeed trying to make it create JavaScript. For a website I am building, I am asking it to write a function that displays a form and, based on that input, shows a loader icon and then some text. The HTML part (which it does first) works fine, but indeed as soon as I want to check the JS code it crashes. I installed Node and it seems to be working now.
Excellent! Thanks for your prompt responses; they made it much easier to help solve your problem!
@KillianLucas We should probably add exception handling for when the executable in self.start_cmd can't be found, e.g.
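A sketch of what that exception handling might look like. The function name and error message here are hypothetical, not OI's actual code; it just wraps the Popen call so a missing runtime produces an actionable message instead of a bare WinError 2:

```python
import subprocess

def start_process_safely(start_cmd):
    """Launch start_cmd, converting a missing executable into a clear error.

    Hypothetical helper illustrating the exception handling proposed above.
    """
    try:
        return subprocess.Popen(
            start_cmd.split(),
            stdin=subprocess.PIPE,
            stdout=subprocess.PIPE,
            stderr=subprocess.PIPE,
        )
    except FileNotFoundError:
        # Popen raises FileNotFoundError when the executable itself is
        # absent -- e.g. node is not installed for a JavaScript block.
        raise RuntimeError(
            f"Could not find the executable {start_cmd.split()[0]!r}. "
            "Is the required language runtime (e.g. node for JavaScript) "
            "installed and on your PATH?"
        ) from None
```

This would have turned the cryptic "[WinError 2] The system cannot find the file specified" in this issue into a hint to install Node.js.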
Describe the bug
Traceback (most recent call last):
File "D:\Python311\Lib\site-packages\interpreter\code_interpreters\subprocess_code_interpreter.py", line 65, in run
self.start_process()
File "D:\Python311\Lib\site-packages\interpreter\code_interpreters\subprocess_code_interpreter.py", line 43, in start_process
self.process = subprocess.Popen(self.start_cmd.split(),
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Python311\Lib\subprocess.py", line 1026, in __init__
self._execute_child(args, executable, preexec_fn, close_fds,
File "D:\Python311\Lib\subprocess.py", line 1538, in _execute_child
hp, ht, pid, tid = _winapi.CreateProcess(executable, args,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [WinError 2] The system cannot find the file specified
Reproduce
After updating the interpreter this happened, and no Python code can be executed.
Expected behavior
Python code should execute normally while attempting to run a Python command.
Screenshots
No response
Open Interpreter version
0.1.9
Python version
3.11.5
Operating System name and version
Microsoft Windows 11 Home
Additional context
No response