
Failed to install agency-swarm and fastapi because these package versions have conflicting dependencies #21

Open
codermrrob opened this issue May 18, 2024 · 7 comments

Comments


codermrrob commented May 18, 2024

In a new Python env, installing the required packages fails with the following error:

2024-05-18 08:43:59.326 [info] ERROR: Cannot install agency-swarm and fastapi because these package versions have conflicting dependencies.
2024-05-18 08:43:59.326 [info] 
The conflict is caused by:
    instructor 0.6.7 depends on typer<0.10.0 and >=0.9.0
    fastapi-cli 0.0.3 depends on typer>=0.12.3
    instructor 0.6.7 depends on typer<0.10.0 and >=0.9.0
    fastapi-cli 0.0.2 depends on typer>=0.12.3

I receive this during dependency install with pip. Possibly my inexperience, but any pointers would be greatly appreciated.
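For what it's worth, the resolver output above can be sanity-checked: the two typer ranges simply have no overlap, so no amount of retrying will help. A small sketch using the `packaging` library (an assumption: it is present in most pip-based environments, otherwise `pip install packaging`) shows that no typer version can satisfy both instructor 0.6.7 and fastapi-cli at once:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# instructor 0.6.7 requires typer>=0.9.0,<0.10.0 (from the log above)
instructor_req = SpecifierSet(">=0.9.0,<0.10.0")
# fastapi-cli requires typer>=0.12.3 (from the log above)
fastapi_cli_req = SpecifierSet(">=0.12.3")

# Intersect the two requirement sets
combined = instructor_req & fastapi_cli_req

# Try versions from either side of the gap; none satisfies both ranges
candidates = ["0.9.0", "0.9.4", "0.12.3", "0.12.5"]
satisfiable = [v for v in candidates if Version(v) in combined]
print(satisfiable)  # -> []
```

Since the intersection is empty, the only fixes are the ones pip suggests: loosen or remove one of the conflicting pins so the resolver can pick compatible versions.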


THOSSYN commented May 20, 2024

I am facing the same issue reported above.

ERROR: Cannot install agency-swarm and fastapi because these package versions have conflicting dependencies.

The conflict is caused by:
instructor 0.6.7 depends on typer<0.10.0 and >=0.9.0
fastapi-cli 0.0.4 depends on typer>=0.12.3
instructor 0.6.7 depends on typer<0.10.0 and >=0.9.0
fastapi-cli 0.0.3 depends on typer>=0.12.3
instructor 0.6.7 depends on typer<0.10.0 and >=0.9.0
fastapi-cli 0.0.2 depends on typer>=0.12.3

To fix this you could try to:

  1. loosen the range of package versions you've specified
  2. remove package versions to allow pip to attempt to solve the dependency conflict

ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts

Please help me find a way to resolve this.

@javadrip

Also facing the same issue, not sure what to do about it.


egulatee commented May 24, 2024

I was able to move past the issue by changing requirements.txt to remove the pin to version 0.1.7 of agency-swarm. However, I can't speak to whether it actually works, since I'm having issues a bit further on (the Run API step).

requirements.txt

agency-swarm
gradio
litellm[proxy]

VRSEN's docs do indicate it's probably not going to work.

To integrate open-source models with this framework, install the previous version of agency-swarm as most projects are not yet compatible with streaming and Assistants V2.

@codermrrob (Author)

To integrate open-source models with this framework, install the previous version of agency-swarm as most projects are not yet compatible with streaming and Assistants V2.

Yes, this has been what has stopped me.

@josedandrade

Ready to try. What could go wrong? I removed the agency-swarm version pin.

From

agency-swarm==0.1.7
gradio
litellm[proxy]

To

agency-swarm
gradio
litellm[proxy]

No errors after that.

@codermrrob (Author)

@josedandrade the only mention was that streaming would not work if you try to use it. I'd like to hear whether you've continued to have success running the latest agency-swarm lib with open-source models.

codermrrob (Author) commented Jun 10, 2024

I have managed to get this installed and working with groq/llama3-70b-8192 but could not get it to work with ollama locally.

To get around the conflict, in the OpenSourceSwarm directory I separately ran pip install instructor and then ran pip install -r requirements.txt. There were some errors, which I just ignored. There is a thread in Discord on the issues-and-discussions channel called Conflict on install where I found this advice.
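Since that two-step install prints errors that get ignored, it can help to confirm afterwards what actually landed in the environment. A minimal sketch using the standard-library importlib.metadata (the package names are the ones mentioned in this thread; adjust to your own requirements.txt):

```python
from importlib.metadata import version, PackageNotFoundError

# Packages discussed in this thread; adjust to your requirements.txt.
for pkg in ("agency-swarm", "instructor", "typer", "fastapi"):
    try:
        # Prints the installed version, e.g. "typer==0.9.4"
        print(f"{pkg}=={version(pkg)}")
    except PackageNotFoundError:
        # The resolver errors may have left this package out entirely
        print(f"{pkg}: not installed")
```

If typer (or anything else) shows as not installed or at an unexpected version, the ignored resolver errors did bite and the environment needs another pass.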

Then I followed the instructions in the video & the readme for this repo. I tried several times, unsuccessfully, to get it running with ollama/llama3-8b. I could see my prompts going from gradio to the model, and the model's response in my Python terminal, but it would fail when trying to deliver the response back to gradio.

When I configured it to work with groq/llama3-70b-8192, it worked the first time.
