
enhance: split response into chunks smaller than 2000 chars #49

Merged

Conversation

rodonguyen
Contributor

@rodonguyen rodonguyen commented Mar 16, 2024

View issue #35

This PR:

  • adds split() in llm.py to break a string into chunks shorter than 2000 characters
  • iterates through the resulting message list in bot.py, sending each element as its own message
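The chunking described above can be sketched roughly as follows. This is an illustrative sketch only, not the actual split() from llm.py: the sentence-splitting regex, the fallback hard-wrap for oversized sentences, and the 2000-character Discord limit constant are assumptions.

```python
import re

DISCORD_LIMIT = 2000  # Discord rejects messages longer than 2000 characters


def split(text: str, limit: int = DISCORD_LIMIT) -> list[str]:
    """Split text into chunks of at most `limit` characters,
    preferring to break at sentence boundaries so each chunk
    ends on a full sentence where possible."""
    # Split after sentence-ending punctuation followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", text)
    chunks: list[str] = []
    current = ""
    for sentence in sentences:
        # Fallback: hard-wrap any single sentence that itself exceeds the limit.
        while len(sentence) > limit:
            if current:
                chunks.append(current)
                current = ""
            chunks.append(sentence[:limit])
            sentence = sentence[limit:]
        # Start a new chunk when adding this sentence would overflow.
        if len(current) + len(sentence) + 1 > limit:
            chunks.append(current)
            current = sentence
        else:
            current = f"{current} {sentence}".strip()
    if current:
        chunks.append(current)
    return chunks
```

In bot.py, the caller would then loop over the returned list and send each chunk as a separate Discord message.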

Result:

  • Example 1: [screenshot]
  • Example 2: [screenshots]

@rodonguyen rodonguyen changed the title feat: split response into chunks smaller than 2000 chars enhance: split response into chunks smaller than 2000 chars Mar 16, 2024
…d utility function to split message. Keep each chunk to stop at meaningful sentence
…ers' of github.com:bifrostlab/llm-assistant into fix-error-when-llm-response-is-longer-than-2000-characters

Add utility function to crop a message into chunks of less than 2000 chars; full sentences are retained
@rodonguyen
Contributor Author

rodonguyen commented Mar 17, 2024

@AndrewsTrinh why don't you replace my function with yours?
The project is pretty simple, so I don't feel the need to split it out into a separate utils file. But it's up to you.

@AndrewsTrinh
Contributor

@AndrewsTrinh why don't you replace my function with yours? the project is pretty simple so I don't feel the need to split into another utils file. But up to you tho.
@rodonguyen: I just want to keep it separate, so that you can review it and approve it if it can replace your function. I'm not really familiar with the git workflow, so I just want to keep it there to make sure I'm not messing something up lol. I agree that, given the simplicity, a separate utilities package is not necessary :)

@AndrewsTrinh
Contributor

@rodonguyen ready for review!

@rodonguyen rodonguyen merged commit 3a1610b into main Mar 21, 2024
3 checks passed
@rodonguyen rodonguyen deleted the fix-error-when-llm-response-is-longer-than-2000-characters branch March 21, 2024 11:48