Support gpustack #142


Open: graysonchen wants to merge 3 commits into main
Conversation

@graysonchen commented Apr 29, 2025

Add GPUStack Provider Support

Description

This PR adds support for the GPUStack API as a new provider in RubyLLM. GPUStack is integrated as an OpenAI-compatible provider with specific adaptations for its API requirements.

Changes

  • Added GPUStack provider module with necessary configuration settings
  • Implemented model listing capability with appropriate mappings for GPUStack's model format
  • Added provider-specific chat functionality
  • Registered GPUStack as an official provider in the RubyLLM ecosystem
  • Updated configuration to include GPUStack API base URL and API key settings
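The model-listing mapping mentioned above can be illustrated with a small sketch. This is a hypothetical illustration, not the PR's actual code: the module and struct names are assumptions, and it only assumes that GPUStack exposes an OpenAI-compatible `GET /v1/models` endpoint whose `data` entries carry an `id` and a `created` timestamp.

```ruby
require "json"

# Hypothetical sketch of mapping a GPUStack /v1/models response (OpenAI-
# compatible shape) into simple model records tagged with the provider name.
module GPUStackSketch
  Model = Struct.new(:id, :provider, :created_at, keyword_init: true)

  # Convert a parsed /v1/models payload into an array of Model records.
  def self.parse_models(payload)
    payload.fetch("data", []).map do |m|
      Model.new(
        id: m["id"],
        provider: "gpustack",
        created_at: m["created"] && Time.at(m["created"])
      )
    end
  end
end

# Example response in the OpenAI-compatible list format:
raw = '{"object":"list","data":[{"id":"qwen2.5","object":"model","created":1745900000}]}'
models = GPUStackSketch.parse_models(JSON.parse(raw))
```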

How to use

Configure your API credentials in your application:

```ruby
RubyLLM.configure do |config|
  config.gpustack_api_base = 'https://your-gpustack-endpoint/v1'
  config.gpustack_api_key = 'your-gpustack-api-key'
  config.default_model = 'your-gpustack-model-name'
end
```

Then use it like any other provider:

```ruby
RubyLLM.models.refresh!

chat = RubyLLM.chat
chat.ask("What can you do?")
```
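Since GPUStack is wired in as an OpenAI-compatible provider, `chat.ask` ultimately posts a standard chat completion payload to the configured base URL. The sketch below is a hypothetical illustration of that request body (the helper name and payload construction are assumptions based on the OpenAI-compatible API, not the PR's code):

```ruby
require "json"

# Hypothetical helper: build the OpenAI-compatible chat completion body
# that a provider like GPUStack would receive at POST /v1/chat/completions.
def build_chat_request(model:, messages:)
  {
    "model"    => model,
    "messages" => messages.map { |m| { "role" => m[:role].to_s, "content" => m[:content] } },
    "stream"   => false
  }
end

body = build_chat_request(
  model: "your-gpustack-model-name",
  messages: [{ role: :user, content: "What can you do?" }]
)
json = JSON.generate(body)
```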

Testing

The implementation has been tested against a live GPUStack API endpoint to confirm compatibility and correct behavior.


@graysonchen changed the title from "add gpustack" to "Support gpustack" on Apr 29, 2025