
Conversation

@simon-mo (Collaborator) commented Mar 5, 2024

Closes #3148
Closes #1229

@simon-mo changed the title from "Support jarbitrary json_object in OpenAI" to "Support arbitrary json_object in OpenAI" on Mar 5, 2024

def __init__(self, regex_string: str, tokenizer):
"""Compile the FSM that drives the regex-structured generation.
def adapt_tokenizer(self, tokenizer):
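For context on the excerpt above: a vLLM logits processor is a callable that receives the token ids generated so far plus the logits for the next token and returns adjusted logits, and the FSM mentioned in the docstring decides which tokens are allowed at each step. Below is a minimal, illustrative sketch of the masking idea only; it is not the PR's actual implementation, which delegates the FSM work to outlines.

import torch

def mask_to_allowed(allowed_token_ids: list[int], logits: torch.Tensor) -> torch.Tensor:
    # Tokens the FSM permits in the current state keep their scores;
    # every other token gets -inf so it can never be sampled.
    mask = torch.full_like(logits, float("-inf"))
    mask[allowed_token_ids] = 0
    return logits + mask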

@simon-mo changed the title from "Support arbitrary json_object in OpenAI" to "Support arbitrary json_object in OpenAI and Context Free Grammar" on Mar 5, 2024
@esmeetu (Member) left a comment:


Thanks for this nice feature! Left some comments.

JSON = "json"
REGEX = "regex"
CHOICE = "choice"
CFG = "cfg"
@esmeetu (Member):

A dumb question: does CFG mean config? Is it only for the grammar config?

@simon-mo (Collaborator, Author):

Context-free grammar. Good point, I should probably rename it to grammar to disambiguate.
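For illustration, a request using the grammar-constrained mode against the OpenAI-compatible server might look like the sketch below. The guided_grammar extra parameter, the server address, and the model name are assumptions for the example rather than anything specified in this thread.

from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

# A tiny Lark-style grammar restricting the reply to "yes" or "no" (illustrative only).
yes_no_grammar = 'start: "yes" | "no"'

completion = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",    # assumed model name
    messages=[{"role": "user", "content": "Is the sky blue?"}],
    extra_body={"guided_grammar": yes_no_grammar},   # assumed vLLM-specific extra parameter
)
print(completion.choices[0].message.content)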


class CFGLogitsProcessor(BaseLogitsProcessor):

def __init__(self, cfg: str, tokenizer):
@esmeetu (Member):

It would be better to add a type annotation for tokenizer throughout this file.
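For example, the annotation could look like the sketch below, assuming the Hugging Face base class is the intended type; the exact type the PR settles on is not shown in this thread.

from transformers import PreTrainedTokenizerBase

class CFGLogitsProcessor:  # simplified stand-in for the PR's BaseLogitsProcessor subclass
    def __init__(self, cfg: str, tokenizer: PreTrainedTokenizerBase):
        self.cfg = cfg
        self.tokenizer = tokenizer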

@simon-mo requested a review from esmeetu on March 16, 2024 at 05:19
tokenizer.decode = change_decoder(tokenizer.decode)
setattr(tokenizer, "_outlines_adapted", True) # noqa: B010

return tokenizer
@esmeetu (Member):

Do we need to return this?
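For context, a toy sketch of the two call styles the question is weighing: because adapt_tokenizer mutates the tokenizer in place, the return value is only needed for the second, fluent style. The names below are illustrative, not the PR's code.

class _Adapter:
    def adapt_tokenizer(self, tokenizer):
        tokenizer.adapted = True   # in-place mutation
        return tokenizer           # returned only as a convenience

class _Tokenizer:
    pass

tok = _Tokenizer()
_Adapter().adapt_tokenizer(tok)         # style 1: rely on the in-place mutation
tok = _Adapter().adapt_tokenizer(tok)   # style 2: rely on the returned object
assert tok.adapted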

@viktor-ferenczi (Contributor):

Excellent new feature! Thanks everyone!

I will definitely test it when the next vLLM release comes out.

@ksjadeja:

Is this available in the new vLLM library? How do I use this feature with Mixtral8x7b?

@simon-mo (Collaborator, Author):

This is available in the latest release: https://docs.vllm.ai/en/latest/serving/openai_compatible_server.html
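For anyone landing here with the same question as above: once a model such as Mixtral 8x7B is being served by the OpenAI-compatible server, the new mode is requested through the standard response_format field. A sketch follows; the server address and model name are assumptions for the example.

from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

completion = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",   # whatever model the server was started with
    messages=[{"role": "user", "content": "Describe a cat as a JSON object."}],
    # Constrain decoding to syntactically valid JSON (the feature added by this PR).
    response_format={"type": "json_object"},
)
print(completion.choices[0].message.content)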

@baughmann:

Please keep this issue closed, but I just had to say: oh man, I love vLLM so much, you guys are all awesome. Thank you!



Development

Successfully merging this pull request may close these issues:
- Support response_format: json_object in OpenAI server
- Support for grammar

5 participants