Real-life use #4

Open
StrangeTcy opened this issue Jul 14, 2023 · 1 comment

Comments

@StrangeTcy

I've tested this approach on a single-language (English) LLaMA, and it worked great, except:

  1. it didn't get the LinkedIn layoff answer right
  2. it didn't output any spaces between words

But the thing I wonder about is real-life use: when you ask an LLM a question, you don't normally provide the context as well.
Is there a way to provide it anyway?
Also, is there a specific finetuning procedure that would make the model better at using this approach?

@bojone
Owner

bojone commented Aug 10, 2023

You can use the nearby text as the query and split the distant text into multiple shorter contexts with a sliding window.
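The splitting step above can be sketched as follows. This is a minimal illustration, not code from this repository: the function name, the character-based `window_size`/`stride` (in practice you would count tokens), and the example values are all assumptions.

```python
def sliding_window_contexts(text, window_size=512, stride=256):
    """Split a long text into overlapping shorter contexts.

    window_size and stride are measured in characters here for
    simplicity; a real pipeline would measure them in tokens.
    """
    contexts = []
    start = 0
    while start < len(text):
        contexts.append(text[start:start + window_size])
        if start + window_size >= len(text):
            break  # last window already reaches the end of the text
        start += stride
    return contexts

# Hypothetical usage: treat the nearby text (e.g. the tail of the
# document, closest to the question) as the query, and feed each
# window of the distant text to the model as a separate short context.
long_text = "some very long document ..."
query = long_text[-200:]
contexts = sliding_window_contexts(long_text[:-200])
```

An overlap (`stride < window_size`) keeps sentences that straddle a window boundary intact in at least one context.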
