So I recently found your project and really like your approach. I have actually been working on a similar project of my own, also based on TypeScript. I started my own project because I wanted something lightweight and focused, with minimal to zero dependencies. The concept is a toolbox of lightweight AI libraries that can be used independently or together to build a complete application. You have several of the pieces I had started building, so I am considering contributing to your project instead.
I was thinking about building a ModelFusion package that offers llama.cpp bindings for Node.js. It would allow people to use ModelFusion with local models without having to run a separate server.
Thoughts?
@nnance I love the idea! Ideally it would be a new model provider in a separate node module that people can install, so they don't have to go through any compile steps when using the modelfusion core. Over time I want to move all providers to standalone modules, but that's for after v1. The internal API for TextGenerationModels and TextStreamingModels is still somewhat in flux, but you can take a look here and build on that:
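To make the "separate provider module" idea concrete, here is a minimal sketch of what such a package could look like. This is purely illustrative: the `TextGenerationModel` interface below is a simplified stand-in, not modelfusion's actual internal API, and the llama.cpp call is stubbed out where a real package would invoke a native addon.

```typescript
// Hypothetical sketch of a standalone llama.cpp provider package.
// The interface is a simplified stand-in for modelfusion's internal
// TextGenerationModel API; names and signatures are assumptions.

interface TextGenerationModel {
  generateText(prompt: string): Promise<string>;
}

// A real implementation would call into llama.cpp through a native
// Node.js addon (e.g. via Node-API), loading the model from modelPath.
class LlamaCppTextGenerationModel implements TextGenerationModel {
  constructor(private readonly modelPath: string) {}

  async generateText(prompt: string): Promise<string> {
    // Stub standing in for the native llama.cpp inference call.
    return `[${this.modelPath}] completion for: ${prompt}`;
  }
}

async function main() {
  // Hypothetical model file path for illustration only.
  const model = new LlamaCppTextGenerationModel("./llama-7b.gguf");
  const text = await model.generateText("Hello");
  console.log(text);
}

main();
```

Shipping this as its own npm package keeps the native compile step (node-gyp or prebuilt binaries) out of the core install, which matches the goal of keeping modelfusion core dependency-free.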
@lgrammel Totally aligned on a separate node module, but it would be great if it was part of ModelFusion. Have you thought of creating a GitHub and npm organization? Moving this to Discord.