Add CFG integration to VLLM #788

Open
miftahmoha wants to merge 4 commits into main from add_cfg_integration_to_vllm

Conversation

@miftahmoha (Contributor) commented Apr 4, 2024

This test failed, but it's related to the llama.cpp integration.


Fixes #780.

@miftahmoha marked this pull request as draft April 4, 2024 21:36
@miftahmoha marked this pull request as ready for review April 4, 2024 22:00
@rlouf force-pushed the add_cfg_integration_to_vllm branch from 17aec4d to 1ef5c73 on April 8, 2024 15:52
@rlouf (Member) commented Apr 8, 2024

Fixed the failing test! Could you add a simple test here?

@rlouf added the structured generation, vLLM, and grammar labels on Apr 8, 2024
@rlouf (Member) commented Apr 11, 2024

Would you mind removing the pytest.mark.xfail decorator? The test is currently not being run.
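For reference, a minimal sketch of the requested change (the test name below is hypothetical):

```python
# Before: the decorator marks the test as an expected failure, so its
# outcome does not gate CI, e.g.
#
#     @pytest.mark.xfail(reason="...")
#     def test_cfg_generation_vllm():
#         ...
#
# After removing the decorator, it is an ordinary test that must pass:
def test_cfg_generation_vllm():
    ...
```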

@miftahmoha (Contributor, Author)

@rlouf Done.

@rlouf (Member) commented Apr 12, 2024

OK, I'll update the vLLM install so we can run these tests in CI. I'll edit your PR directly, so don't touch anything for now!

@rlouf (Member) commented Apr 12, 2024

I get an error on the CFG test: it generates tokens that are not allowed. We need to determine whether it's a problem with the CFGGuide or with the integration itself. Can you reproduce this locally?

E               lark.exceptions.UnexpectedCharacters: No terminal matches '.' in the current parser context, at line 1 col 12
E               
E               --2-2-111.9.
E                          ^
E               Expected one of: 
E                       * RPAR
E                       * STAR
E                       * MINUS
E                       * SLASH
E                       * PLUS
E               
E               Previous tokens: Token('NUMBER', '111.9')

../../../.conda/envs/outlines-dev/lib/python3.10/site-packages/lark/lexer.py:598: UnexpectedCharacters

@miftahmoha (Contributor, Author)

@rlouf Using the debugger, it seems to be something related to the parser. I'm going to have a look at it.

@rlouf (Member) commented Apr 13, 2024

That's what I feared

@miftahmoha (Contributor, Author)

@rlouf After inspecting a bit, generate.cfg should be fixed first since it has a bug.

The copy(self) -> "CFGGuide" method should just return self instead of CFGGuide(self.cfg_string, self.tokenizer); the current version loses the regex_fsm when copying.
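For illustration, a minimal sketch of the two behaviours being discussed (attribute and constructor names follow the comment above; this is not the actual outlines source):

```python
class CFGGuide:
    """Sketch only; the real class holds more state."""

    def __init__(self, cfg_string, tokenizer):
        self.cfg_string = cfg_string
        self.tokenizer = tokenizer
        self.regex_fsm = None  # built up while generation proceeds

    # Current behaviour being described: rebuilding from scratch drops
    # the regex_fsm accumulated so far.
    def copy(self) -> "CFGGuide":
        return CFGGuide(self.cfg_string, self.tokenizer)

    # Change suggested above: reuse the same instance so regex_fsm survives.
    # def copy(self) -> "CFGGuide":
    #     return self
```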

Concerning the error, the problem comes from the regex built from interactive.accepts(): it allows some tokens that interactive.accepts() doesn't ('.' in the error above).

I could reproduce the same type of error using generate.cfg without the vLLM integration.
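A reproduction along these lines (a sketch; the model choice and the simplified arithmetic grammar are illustrative, not the exact script used) exercises the same path without vLLM:

```python
import outlines

# Simplified arithmetic grammar in lark syntax (illustrative).
arithmetic_grammar = r"""
    ?start: expression
    ?expression: term (("+" | "-") term)*
    ?term: factor (("*" | "/") factor)*
    ?factor: NUMBER | "-" factor | "(" expression ")"
    %import common.NUMBER
"""

model = outlines.models.transformers("gpt2")
generator = outlines.generate.cfg(model, arithmetic_grammar)

# With the bug, generation can emit a character (e.g. a second '.')
# that the parser never allowed, raising lark's UnexpectedCharacters.
print(generator("Write an arithmetic expression:", max_tokens=30))
```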

@miftahmoha (Contributor, Author)

> That's what I feared

I'm going to fix it; I'll open a draft PR when the time comes.

@rlouf (Member) commented Apr 27, 2024

Thank you! I'm sorry for the lack of responsiveness here; I'm really busy with other things but will have more time soon :)

@miftahmoha (Contributor, Author)

No problem!

@leloykun (Contributor) commented May 8, 2024

> @rlouf After inspecting a bit, generate.cfg should be fixed first since it has a bug.
>
> The copy(self) -> "CFGGuide" method should just return self instead of CFGGuide(self.cfg_string, self.tokenizer); the current version loses the regex_fsm when copying.
>
> Concerning the error, the problem comes from the regex built from interactive.accepts(): it allows some tokens that interactive.accepts() doesn't ('.' in the error above).
>
> I could reproduce the same type of error using generate.cfg without the vLLM integration.

I have partially debugged the bug in generate.cfg with this PR: #865

@leloykun (Contributor) commented May 8, 2024

@miftahmoha @rlouf

It's not a good idea to return self in copy, because we'd then end up using the same FSM for all prompts in the batch. See: https://github.com/outlines-dev/outlines/blob/4f8433d8d6633b0780c3a6c27981f9adffbe49f5/outlines/generate/api.py#L189
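A toy sketch of the failure mode (the class and method names are made up, not the outlines API): if the "copies" are all the same object, advancing the guide for one sequence also advances it for every other sequence in the batch.

```python
class ToyGuide:
    """Stand-in for a guide that accumulates per-sequence parser state."""

    def __init__(self) -> None:
        self.state = 0

    def advance(self, token: int) -> None:
        self.state += token  # mutates in place


shared = ToyGuide()
guides_shared = [shared, shared]            # copy() returning self
guides_fresh = [ToyGuide(), ToyGuide()]     # genuinely independent copies

guides_shared[0].advance(1)
guides_fresh[0].advance(1)

print(guides_shared[1].state)  # 1 -- sequence 1 inherited sequence 0's state
print(guides_fresh[1].state)   # 0 -- each sequence keeps its own state
```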

@leloykun (Contributor) commented May 8, 2024

IMO it's better to add another method, say deep_copy, which returns CFGGuide(self.cfg_string, self.tokenizer, self.regex_fsm, ...), and use that instead in reorder_fsms (and just in reorder_fsms for now).

Alternatively, we can somehow make CFGGuide pickle-able and just use copy.deepcopy in reorder_fsms
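For illustration, a sketch of the deep_copy option (the extra constructor parameter and the method itself are assumptions based on this suggestion, not the current outlines API):

```python
import copy


class CFGGuide:
    """Sketch only; the real class holds more state."""

    def __init__(self, cfg_string, tokenizer, regex_fsm=None):
        self.cfg_string = cfg_string
        self.tokenizer = tokenizer
        self.regex_fsm = regex_fsm

    def copy(self) -> "CFGGuide":
        # Cheap copy, as today: starts from a fresh state.
        return CFGGuide(self.cfg_string, self.tokenizer)

    def deep_copy(self) -> "CFGGuide":
        # Carries the accumulated state over, but as an independent object,
        # so reorder_fsms can hand each sequence its own guide.
        return CFGGuide(
            self.cfg_string,
            self.tokenizer,
            copy.deepcopy(self.regex_fsm),
        )
```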

@ekagra-ranjan

> Concerning the error, the problem comes from the regex built from interactive.accepts(): it allows some tokens that interactive.accepts() doesn't ('.' in the error above).

Hi @miftahmoha, I didn't quite follow. Could you please expand on the above? Thanks!

@miftahmoha (Contributor, Author)

@ekagra-ranjan Sorry for the late answer. There are two mechanisms at work here.

The first gives the next possible tokens allowed by the CFG (tokens in the lexer sense, as in compilers), which is interactive.accepts(). The second takes those tokens and builds a regex out of them for guided generation.

There is a problem in the second mechanism.
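A conceptual sketch of the two stages, using lark directly (simplified; the regex assembly shown here is illustrative and not the outlines implementation):

```python
from lark import Lark

# Tiny arithmetic-like grammar, just to keep the example self-contained.
grammar = r"""
    ?start: expr
    ?expr: NUMBER (OP NUMBER)*
    OP: "+" | "-" | "*" | "/"
    %import common.NUMBER
"""
parser = Lark(grammar, parser="lalr")

# Stage 1: ask the parser which terminals may come next after the text
# generated so far (here, after the NUMBER '111.9' only an operator or
# the end of input is acceptable).
interactive = parser.parse_interactive("111.9")
interactive.exhaust_lexer()
allowed = interactive.accepts()  # e.g. {"OP", "$END"}

# Stage 2 (where the problem lies): build a regex from those terminals'
# patterns and use it to constrain generation.  If this regex is too
# permissive, the model can emit a character such as '.' that stage 1
# never allowed, and the parser then raises UnexpectedCharacters.
patterns = [
    parser.get_terminal(name).pattern.to_regexp()
    for name in allowed
    if not name.startswith("$")  # skip the synthetic $END token
]
allowed_regex = "|".join(f"(?:{p})" for p in patterns)
print(allowed_regex)
```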

Successfully merging this pull request may close these issues: Add CFGLogitsProcessor for the vLLM integration (#780)

4 participants