
Update ruby bindings #2154

Merged · 3 commits · May 22, 2024
Conversation

taf2 (Contributor) commented May 15, 2024

I want to update the Ruby bindings and push a new gem. I've added a Rakefile, and I'd also like to think about making this more maintainable going forward, but for today, having a new gem is a good thing, I think.
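The PR mentions adding a Rakefile for building and testing the bindings. As a rough sketch only (the gemspec name `whispercpp.gemspec` and the test glob `tests/test_*.rb` are assumptions, not taken from the PR), such a Rakefile might look like:

```ruby
# Hypothetical minimal Rakefile for the Ruby bindings.
# Assumes a whispercpp.gemspec next to this file and tests under tests/.
require "rake/testtask"

task default: :test

# Run the binding's unit tests with `rake test`.
Rake::TestTask.new do |t|
  t.test_files = FileList["tests/test_*.rb"]
end

desc "Build the whispercpp gem"
task :build do
  sh "gem build whispercpp.gemspec"
end
```

With this in place, `rake` runs the tests and `rake build` produces the `.gem` file that can then be pushed to RubyGems.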

ggerganov (Owner) commented May 15, 2024

> but for today, having a new gem is a good thing, I think.

Would you like this to be merged as it is, or do you plan to make more updates in this PR?

taf2 (Contributor, Author) commented May 15, 2024

As is; I'm hoping to get the new gem pushed to RubyGems. Thanks!

ggerganov (Owner) commented

The CI does not pass at the moment.

@ggerganov ggerganov merged commit 22d46b7 into ggerganov:master May 22, 2024
50 checks passed
jiahansu pushed a commit to WiseSync/whisper.cpp that referenced this pull request May 28, 2024
* update library files

* update whispercpp

* not needed for gem
bygreencn added a commit to bygreencn/whisper.cpp that referenced this pull request Aug 9, 2024
* tag 'v1.6.2':
  release : v1.6.2
  Revert "whisper : remove extra backend instance (huh?)" (ggerganov#2182)
  server : fix typo (ggerganov#2181)
  ruby : update bindings (ggerganov#2154)
  release : v1.6.1
  examples : add support for decoding input with ffmpeg (Linux) (ggerganov#2133)
  node : add flash_attn param (ggerganov#2170)
  ci: Update build.yml to suppress warnings about node.js versions (ggerganov#2166)
  release : v1.6.0
  whisper : use flash attention (ggerganov#2152)
  talk-llama : reject runs without required arguments (ggerganov#2153)
  sync : ggml
  metal : support FA without mask + add asserts (llama/7278)
  ggml : add RPC backend (llama/6829)
  rm wait() (llama/7233)
  CUDA: add FP32 FlashAttention vector kernel (llama/7188)
  scripts : sync ggml-rpc
iThalay pushed a commit to iThalay/whisper.cpp that referenced this pull request Sep 23, 2024
* update library files

* update whispercpp

* not needed for gem