[Good First Issue]: Verify tiny-llama-1b-chat with GenAI text_generation #260
Comments
.take
Hello @Vishwa44, the ticket will be manually assigned to you shortly - the take feature hasn't been introduced to the GenAI repo yet, but it will be today. :) Thanks for taking a look at the issue!
@Wovchena I was able to convert and run tiny-llama-1b-chat with greedy_causal_lm.cpp and beam_search_causal_lm.cpp, and it generated reasonable outputs.
Yes, it's already there. Thank you.
.take
Thanks for being interested in this issue. It looks like this ticket is already assigned to a contributor. Please communicate with the assigned contributor to confirm the status of the issue.
Context
This task covers enabling tests for tiny-llama-1b-chat. You can find more details in the openvino_notebooks LLM chatbot README.md.
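For reference, below is a minimal sketch of how the model might be exported to OpenVINO IR with optimum-intel before pointing the GenAI text_generation samples (greedy_causal_lm, beam_search_causal_lm) at it. The Hugging Face model id and output directory are assumptions, not taken from this issue, and the samples may require an additional tokenizer conversion step described in their README.

```python
# Sketch: export tiny-llama-1b-chat to OpenVINO IR and sanity-check generation.
# Model id and output directory are assumptions for illustration only.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumed Hugging Face model id
out_dir = "tiny-llama-1b-chat-ov"                # assumed output directory

# export=True converts the original checkpoint to OpenVINO IR during loading
model = OVModelForCausalLM.from_pretrained(model_id, export=True)
model.save_pretrained(out_dir)

# Save the tokenizer next to the IR so downstream tooling can find it
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.save_pretrained(out_dir)

# Quick check that the exported model produces reasonable text
inputs = tokenizer("Why is the sky blue?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```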
Please ask general questions in the main issue at #259
What needs to be done?
Described in the main Discussion issue at: #259
Example Pull Requests
Described in the main Discussion issue at: #259
Resources
Contact points
Described in the main Discussion issue at: #259
Ticket