[Feature Request]: Can we add support for an OpenAI-compatible API #1771

Closed
1 task done
y100143239 opened this issue Aug 1, 2024 · 0 comments

@y100143239

Is there an existing issue for the same feature request?

  • I have checked the existing issues.

Is your feature request related to a problem?

No response

Describe the feature you'd like

Can we add support for an OpenAI-compatible API for locally deployed LLM or VLM services, similar to the function Dify provides? I think this would be a commonly needed basic capability for many enterprise users running on internal networks.
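For context, "OpenAI-compatible" means the local service (e.g. vLLM, Ollama, or LocalAI) exposes the same `/v1` chat-completions routes as OpenAI, so a client only needs a configurable base URL, API key, and model name. A minimal sketch using the official `openai` Python client; the base URL, key, and model name here are assumptions for illustration, not part of any specific deployment:

```python
# Sketch: calling a locally deployed, OpenAI-compatible LLM service.
# base_url, api_key, and model below are hypothetical; substitute the values
# of your own deployment (e.g. vLLM, Ollama, LocalAI).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # local service exposing OpenAI-style /v1 routes
    api_key="not-needed-for-local",       # many local servers accept any placeholder key
)

response = client.chat.completions.create(
    model="qwen2-7b-instruct",            # whatever model the local service hosts
    messages=[{"role": "user", "content": "Hello from an internal-network deployment."}],
)
print(response.choices[0].message.content)
```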

Describe implementation you've considered

No response

Documentation, adoption, use case

No response

Additional information

No response

@KevinHuSh mentioned this issue Aug 6, 2024
KevinHuSh pushed a commit that referenced this issue Aug 6, 2024
### What problem does this PR solve?

#1771 add support for OpenAI-API-Compatible

### Type of change

- [x] New Feature (non-breaking change which adds functionality)

---------

Co-authored-by: Zhedong Cen <cenzhedong2@126.com>
Halfknow pushed a commit to Halfknow/ragflow that referenced this issue Nov 11, 2024
### What problem does this PR solve?

infiniflow#1771 add support for OpenAI-API-Compatible

### Type of change

- [x] New Feature (non-breaking change which adds functionality)

---------

Co-authored-by: Zhedong Cen <cenzhedong2@126.com>