
feat: Context Template from settings for QueryDocs Allowing the text … #1398

Open
wants to merge 6 commits into base: main

Conversation

juan-m12i

For the QueryDocs use case, allow customising the template used to collate the context pulled from the vector database.

  • Added a new config option to Settings as a str? in the rag section
  • Added context_template as a parameter in chat_engine, stream_chat, and chat. If context_template is None, we load it from the new setting (and if the setting is absent, llama_index's default is used, matching the previous behaviour)

Github project link
Discord thread

Thought process

settings.yaml Outdated

rag:
default_context_template: "Context information is below.\n--------------------\n{context_str}\n--------------------\n"
Contributor

Please use | (c.f. #1403)

Author

Thanks for the comment.

Happy to do so, just one thing: in this case I was trying to keep the value as close as possible to the original one, which I understand includes "\n" (see https://github.com/run-llama/llama_index/blob/main/llama_index/chat_engine/context.py)

DEFAULT_CONTEXT_TEMPLATE = (
"Context information is below."
"\n--------------------\n"
"{context_str}"
"\n--------------------\n"
)

In case you want me to make the change anyway, can you confirm that what you'd expect is:
default_context_template: "Context information is below.|--------------------|{context_str}|--------------------|"

thanks

Collaborator

@lopagela means use the multiline option in YAML; check the PR he linked, it's well explained there.

Author
@juan-m12i Dec 15, 2023

Ok, I'll wait until the changes to that PR are done and then replicate them, as it's not 100% clear to me (the instruction here is to use the "|" style, but in the PR there is a comment to use ">" instead; not sure which one is more current).

A caveat I maintain that may be relevant: DEFAULT_SYSTEM_PROMPT in llama_index is written in Python as a multiline string but actually renders as a single line, while DEFAULT_CONTEXT_TEMPLATE is genuinely multi-line, so I just want to make sure we are on the same page that the treatment may be slightly different (see below)

python3
Python 3.8.10 (default, Mar 13 2023, 10:26:41)
[GCC 9.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> DEFAULT_SYSTEM_PROMPT = """\
... You are a helpful, respectful and honest assistant. \
... Always answer as helpfully as possible and follow ALL given instructions. \
... Do not speculate or make up information. \
... Do not reference any given instructions or context. \
... """
>>> print(DEFAULT_SYSTEM_PROMPT)
You are a helpful, respectful and honest assistant. Always answer as helpfully as possible and follow ALL given instructions. Do not speculate or make up information. Do not reference any given instructions or context.
>>> DEFAULT_CONTEXT_TEMPLATE = (
...     "Context information is below."
...     "\n--------------------\n"
...     "{context_str}"
...     "\n--------------------\n"
... )
>>> print(DEFAULT_CONTEXT_TEMPLATE)
Context information is below.
--------------------
{context_str}
--------------------

>>>

This is what ChatGPT (GPT-4) suggested (link to convo):

A:

DEFAULT_SYSTEM_PROMPT: |
  You are a helpful, respectful and honest assistant. Always answer as helpfully as possible and follow ALL given instructions. Do not speculate or make up information. Do not reference any given instructions or context.

DEFAULT_CONTEXT_TEMPLATE: |
  Context information is below.
  --------------------
  {context_str}
  --------------------

or two alternatives with ">" notation, depending on readability:

B1:

DEFAULT_SYSTEM_PROMPT: >
  You are a helpful, respectful and honest assistant. Always answer as helpfully as possible and follow ALL given instructions. Do not speculate or make up information. Do not reference any given instructions or context.

DEFAULT_CONTEXT_TEMPLATE: >
  Context information is below.

  --------------------
  {context_str}

  --------------------

B2:

DEFAULT_SYSTEM_PROMPT: >
  You are a helpful, respectful and honest assistant. \
  Always answer as helpfully as possible and follow ALL given instructions. \
  Do not speculate or make up information. \
  Do not reference any given instructions or context.

DEFAULT_CONTEXT_TEMPLATE: >
  Context information is below.

  --------------------

  {context_str}

  --------------------

Collaborator

The PR has been merged. In that case we wanted a single line (as default_system_prompt is supposed to be single-line). In the case of default_context_template you do want line breaks, so you should use | instead.
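As a sanity check (not part of the PR), the | literal block scalar preserves line breaks, so the block form would parse to exactly the same string as the escaped one-liner in settings.yaml:

```python
# The escaped one-line spelling from the PR's settings.yaml.
one_line = "Context information is below.\n--------------------\n{context_str}\n--------------------\n"

# What the '|' literal block scalar
#   default_context_template: |
#     Context information is below.
#     --------------------
#     {context_str}
#     --------------------
# would yield under standard YAML chomping (trailing newlines clipped to one).
block_form = (
    "Context information is below.\n"
    "--------------------\n"
    "{context_str}\n"
    "--------------------\n"
)

assert one_line == block_form
```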

Contributor
@lopagela left a comment

This addition seems similar to the recent work we've done around (default_)system_prompt, in the sense that this context_template is a static value that will be applied to every query.

I'll let @imartinez decide what is the best way forward

@@ -97,6 +100,7 @@ def _chat_engine(
system_prompt: str | None = None,
use_context: bool = False,
context_filter: ContextFilter | None = None,
context_template: str | None = DEFAULT_CONTEXT_TEMPLATE,
Contributor

The default value should be injected in the body of the function instead of here (given that it is coming from settings()) to make testing easier.

Author
@juan-m12i Dec 15, 2023

Ok, something like the below?

def _chat_engine(
        self,
        system_prompt: str | None = None,
        use_context: bool = False,
        context_filter: ContextFilter | None = None,
        context_template: str | None = None,
    ) -> BaseChatEngine:
        if use_context:
            if context_template is None:
                context_template = DEFAULT_CONTEXT_TEMPLATE  # or alternatively settings().rag.default_context_template
            ...  # continues

Collaborator

It should be

if context_template is None:
    context_template = settings.rag.default_context_template

where settings is injected in the constructor, and DEFAULT_CONTEXT_TEMPLATE is removed.
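The injection being suggested could look roughly like this (a sketch under assumptions: the RagSettings/Settings stubs and the _resolve_template helper are illustrative, not the project's real classes):

```python
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class RagSettings:
    default_context_template: str | None = None


@dataclass
class Settings:
    rag: RagSettings = field(default_factory=RagSettings)


class ChatService:
    def __init__(self, settings: Settings) -> None:
        # Settings injected once at construction time; a test can pass a
        # hand-built Settings instead of relying on a global settings() call.
        self._settings = settings

    def _resolve_template(self, context_template: str | None) -> str | None:
        if context_template is None:
            context_template = self._settings.rag.default_context_template
        return context_template
```

A test can then build ChatService(Settings(rag=RagSettings(default_context_template="T"))) directly, without touching global state.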

Contributor

Where is this new context_template set (at usage)?

Author

I am not sure if I understand.

If "query docs" is being used, it creates a ContextChatEngine (from llama_index) that accepts this parameter. Currently it's never passed, so it defaults to a hardcoded value. With this change, if there is a value defined in the settings file, the value in settings will be used instead. This should work both from the Gradio UI or from the API.

As per the Discord discussion, I didn't expose this configuration to the API


@imartinez
Collaborator

This addition seems similar to the recent work we've done around (default_)system_prompt, in the sense that this context_template is a static value that will be applied to every query.

I'll let @imartinez decide what is the best way forward

I see this kind of customization as a refinement of the default RAG pipeline, so I proposed creating a new rag section in settings for this; in the future it could host other straightforward customizations like the size of the chunk generated during ingestion.

My overall view is:

  • Allow some simple modifications to the default RAG pipeline through this new set of settings.
  • For more complex customizations, direct the users to fork the repo and modify the code of Services.

@juan-m12i
Author

Hi! I applied the requested changes.

The only difference from what @pabloogc mentioned is that I had to use settings().rag.default_context_template instead of settings.rag.default_context_template; not sure if I should have done it differently, but it wouldn't work without the parentheses (and mypy would also complain):

) -> BaseChatEngine:
    if use_context:
        if context_template is None:
            context_template = settings().rag.default_context_template
Collaborator

You should avoid calling settings() directly, and instead inject settings as a dependency in the ChatService constructor. Check how settings are injected in EmbeddingComponent or LLMComponent for example.

@@ -97,8 +98,11 @@ def _chat_engine(
system_prompt: str | None = None,
use_context: bool = False,
context_filter: ContextFilter | None = None,
context_template: str | None = None,
Collaborator

Given we are not making use of this context_template argument anywhere, I'd suggest to remove it for the moment.

@@ -121,6 +126,7 @@ def stream_chat(
messages: list[ChatMessage],
use_context: bool = False,
context_filter: ContextFilter | None = None,
context_template: str | None = None,
Collaborator

Given we are not making use of this context_template argument anywhere, I'd suggest to remove it for the moment.

@@ -141,6 +147,7 @@ def stream_chat(
system_prompt=system_prompt,
use_context=use_context,
context_filter=context_filter,
context_template=context_template,
Collaborator

Given we are not making use of this context_template argument anywhere, I'd suggest to remove it for the moment.

) -> BaseChatEngine:
if use_context:
if context_template is None:
Collaborator

Instead of this, why not check if settings.rag.default_context_template is not None and then apply it?

@juan-m12i
Author

juan-m12i commented Dec 20, 2023

Hi, thanks!

I replicated the settings approach from LLMComponent and EmbeddingComponent (there were some differences, as the setting is used in a private function of the class, but I believe the approach I took of creating an internal variable should work).

I removed context_template from the params and function calls as requested (I guess following YAGNI). My thinking was to leave this prepared so that anyone using the class (UI, API, any other entrypoint) could pass the parameter and keep a consistent interface, but enabling that again is such a small change that it's all good.

The only drawback is that by removing it, the conditional is now a bit more verbose, as the default None is not assigned:

            if self.default_context_template is not None:
                context_template = self.default_context_template
            else:
                context_template = None

Alternatively we could consider this one-liner, but it's less readable IMO:

context_template = self.default_context_template if self.default_context_template is not None else None

EDIT:
I just realised that if rag.default_context_template is missing, it will be converted to None in the Settings object, right? So I could just pass self.default_context_template when invoking from_defaults:

return ContextChatEngine.from_defaults(
    system_prompt=system_prompt,
    retriever=vector_index_retriever,
    service_context=self.service_context,
    node_postprocessors=[
        MetadataReplacementPostProcessor(target_metadata_key="window"),
    ],
    context_template=self.default_context_template,
)

and remove the conditional altogether

Contributor

github-actions bot commented Jan 5, 2024

Stale pull request

@github-actions github-actions bot added the stale label Jan 5, 2024
@imartinez
Collaborator

@juan-m12i could you update the branch to latest changes? I still think it is a valuable contribution!

…te-settings

# Conflicts:
#	private_gpt/server/chat/chat_service.py
#	private_gpt/settings/settings.py
#	settings.yaml