
[Bug]: Inconsistency between successful pip installation and inability to locate the installed package #1616

Closed
2 tasks done
CmetankaJDD opened this issue May 7, 2024 · 10 comments
Labels
bug Something isn't working severity:medium Affecting multiple users

Comments

@CmetankaJDD

Is there an existing issue for the same bug?

Describe the bug

The pip installation completes successfully, indicating that the library should be installed, but subsequent attempts to use or import the library fail because it is not recognized.

Current Version

ghcr.io/opendevin/opendevin:0.5

Installation and Configuration

docker run \
    --pull=always \
    -e SANDBOX_USER_ID=$(id -u) \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    ghcr.io/opendevin/opendevin:0.5

Model and Agent

- groq/llama3-8b-8192
- CodeActAgent

Reproduction Steps

I asked OpenDevin for an example of a Discord bot. The installation process was successful. However, after the installation, it cannot use the installed library.
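A likely explanation for this symptom (an assumption here, not confirmed by the logs) is an interpreter mismatch: the `pip` on the sandbox's PATH may belong to a different Python than the kernel that later runs the import, so the package lands in a site-packages directory the kernel never searches. A quick diagnostic:

```python
import shutil
import subprocess
import sys

# The interpreter the kernel uses for `import ...`:
print("kernel python:", sys.executable)

# The pip a bare `pip install ...` would invoke (may differ):
print("pip on PATH:  ", shutil.which("pip"))

# pip bound to the kernel's own interpreter; installing through this
# one guarantees the package lands where the kernel can import it:
proc = subprocess.run(
    [sys.executable, "-m", "pip", "--version"],
    capture_output=True,
    text=True,
)
print(proc.stdout.strip())
```

If the two paths point at different environments, every `pip install` will "succeed" while every subsequent import fails.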

Logs, Errors, Screenshots, and Additional Context

Screenshot 2024-05-07 at 08 17 23 Screenshot 2024-05-07 at 08 18 42 Screenshot 2024-05-07 at 08 18 55
@CmetankaJDD CmetankaJDD added the bug Something isn't working label May 7, 2024
@SmartManoj
Contributor

Could you check the latest version?

@CmetankaJDD
Author

> Could you check the latest version?

Screenshot 2024-05-12 at 09 13 21

==============
STEP 6

06:10:41 - opendevin:ERROR: agent_controller.py:147 - Error in loop
Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 427, in completion
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 386, in completion
response = openai_client.chat.completions.create(**data, timeout=timeout) # type: ignore
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/openai/_utils/_utils.py", line 277, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/openai/resources/chat/completions.py", line 590, in create
return self._post(
^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1240, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 921, in request
return self._request(
^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1005, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1053, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1020, in _request
raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': 'Please reduce the length of the messages or completion.', 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 1053, in completion
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 1026, in completion
response = openai_chat_completions.completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 433, in completion
raise OpenAIError(status_code=e.status_code, message=str(e))
litellm.llms.openai.OpenAIError: Error code: 400 - {'error': {'message': 'Please reduce the length of the messages or completion.', 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/app/opendevin/controller/agent_controller.py", line 142, in _run
finished = await self.step(i)
^^^^^^^^^^^^^^^^^^
File "/app/opendevin/controller/agent_controller.py", line 256, in step
action = self.agent.step(self.state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/agenthub/codeact_agent/codeact_agent.py", line 223, in step
response = self.llm.completion(
^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 330, in wrapped_f
return self(f, *args, **kw)
^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 467, in __call__
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 368, in iter
result = action(retry_state)
^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 390, in <lambda>
self._add_action_func(lambda rs: rs.outcome.result())
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 470, in __call__
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/app/opendevin/llm/llm.py", line 188, in wrapper
resp = completion_unwrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 3222, in wrapper
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 3116, in wrapper
result = original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 2226, in completion
raise exception_type(
^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 9233, in exception_type
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 8000, in exception_type
raise BadRequestError(
litellm.exceptions.BadRequestError: GroqException - Error code: 400 - {'error': {'message': 'Please reduce the length of the messages or completion.', 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
Model: llama3-8b-8192
Messages: [{'role': 'system', 'content': 'A chat between a curious user and an artificial intelligence assista
06:10:41 - opendevin:INFO: agent_controller.py:190 - Setting agent(CodeActAgent) state from AgentState.RUNNING to AgentState.ERROR

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True`.

Provider List: https://docs.litellm.ai/docs/providers

INFO: 192.168.65.1:50167 - "GET /api/select-file?file=main.py HTTP/1.1" 200 OK
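The failure in this log is distinct from the original pip issue: the agent's message history has grown past llama3-8b-8192's 8192-token context window, so Groq rejects the request with `context_length_exceeded`. A minimal sketch of the general mitigation, trimming the oldest non-system messages to fit a budget (hypothetical helper name and a crude character budget standing in for real token counting, roughly 4 characters per token; this is not OpenDevin's actual code):

```python
def trim_messages(messages, max_chars=24000):
    """Drop the oldest non-system messages until the history fits a
    rough size budget (24000 chars is about 6000 tokens, leaving
    headroom in an 8192-token context window)."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]

    def size(msgs):
        return sum(len(m["content"]) for m in msgs)

    while rest and size(system + rest) > max_chars:
        rest.pop(0)  # discard the oldest turn first
    return system + rest
```

Production agents typically summarize or condense older turns rather than drop them outright, but the budget check is the same.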

@neubig neubig added the severity:medium Affecting multiple users label May 13, 2024
@SmartManoj
Contributor

SmartManoj commented May 15, 2024

The LLM should restart the kernel, or the agent should tell the LLM to use the %pip magic.

image
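For context, `%pip` is IPython's built-in magic that installs into the environment of the running kernel, whereas `!pip` shells out to whichever `pip` is first on PATH. Outside IPython, the equivalent is invoking pip through `sys.executable`; a sketch with a hypothetical helper name:

```python
import subprocess
import sys

def pip_install_into_kernel(package: str) -> bool:
    """Hypothetical helper: install `package` with the running
    interpreter's own pip, mirroring what IPython's %pip magic does."""
    result = subprocess.run(
        [sys.executable, "-m", "pip", "install", package],
        capture_output=True,
        text=True,
    )
    return result.returncode == 0
```

Even with `%pip`, a kernel restart can still be needed before importing a package that replaces a module already loaded in the kernel.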


@steventangbc

Matplotlib installation still fails after restarting the kernel and using %pip magic.

image

@SmartManoj
Contributor

SmartManoj commented May 15, 2024

What is the output of the kernel restart?

@steventangbc

"Matplotlib is not installed."

image

@SmartManoj
Contributor

The kernel was not restarted. If it had been, it would show like this.
image

@steventangbc

Sometimes it succeeds, but other times it fails. I can't reproduce the bug consistently.

image

@SmartManoj
Contributor

Could you test that PR?

@mamoodi
Collaborator

mamoodi commented May 29, 2024

Closing this issue as the PR has been merged. Please reopen if this did not resolve the issue.

@mamoodi mamoodi closed this as completed May 29, 2024