Fix: Add instance check before casting ChatGeneration object #1607

Merged

merged 1 commit into explodinggradients:main on Nov 1, 2024

Conversation

Youngrok123
Contributor

@Youngrok123 Youngrok123 commented Oct 31, 2024

My case

  • Responses from the headline and summary extractors (among others) are delivered as Generation objects, not ChatGeneration objects:
{
  "generations": [
    [
      {
        "text": "{\"text\": \"ABCD\"}",
        "generation_info": null,
        "type": "Generation"
      }
    ]
  ],
  "llm_output": null,
  "run": null,
  "type": "LLMResult"
}

Error message

  • unable to apply transformation: 'Generation' object has no attribute 'message'

How to fix

  • Before casting to ChatGeneration, first check whether the object is actually an instance of that class (see the sketch below).
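
A minimal sketch of the kind of guard this adds, assuming the langchain_core.outputs classes; the helper name extract_text and its placement are illustrative, not the actual ragas code:

from langchain_core.outputs import ChatGeneration, Generation

def extract_text(generation: Generation) -> str:
    # A ChatGeneration carries its text on .message; a plain Generation only
    # has .text, so check the concrete type before touching .message.
    if isinstance(generation, ChatGeneration):
        return generation.message.content
    return generation.text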

@dosubot dosubot bot added the size:XS This PR changes 0-9 lines, ignoring generated files. label Oct 31, 2024
@jjmachan
Member

jjmachan commented Nov 1, 2024

thanks again @Youngrok123 🙂 ❤️
btw which model are you using?

@jjmachan jjmachan merged commit 1abf050 into explodinggradients:main Nov 1, 2024
16 checks passed
@Youngrok123
Contributor Author

thanks, @jjmachan
I am using 'neuralmagic/Meta-Llama-3.1-70B-Instruct-FP8' model on vLLM.
As shown in the blog post below, my wrapper inherits from the LangChain LLM class and implements it to communicate with the model.
LangChain: How to Set a Custom LLM Wrapper

If an exception occurs during testset generation, the process either stops or fails to produce normal results, so handling this case is necessary.
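
For reference, a custom wrapper of the kind described usually looks roughly like this; this is a minimal sketch in which the class name, endpoint URL, and response parsing are assumptions about a vLLM OpenAI-compatible server, not the author's actual code:

from typing import Any, List, Optional

import requests
from langchain_core.callbacks import CallbackManagerForLLMRun
from langchain_core.language_models.llms import LLM


class VLLMCompletionLLM(LLM):
    """Custom LangChain LLM that forwards prompts to a vLLM server (sketch)."""

    endpoint: str = "http://localhost:8000/v1/completions"  # assumed server URL
    model: str = "neuralmagic/Meta-Llama-3.1-70B-Instruct-FP8"

    @property
    def _llm_type(self) -> str:
        return "custom-vllm"

    def _call(
        self,
        prompt: str,
        stop: Optional[List[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> str:
        # Forward the prompt and return the generated text as a plain string.
        response = requests.post(
            self.endpoint,
            json={"model": self.model, "prompt": prompt, "stop": stop},
            timeout=60,
        )
        response.raise_for_status()
        return response.json()["choices"][0]["text"]

Because _call returns a plain string, LangChain wraps the output in Generation objects rather than ChatGeneration objects, which is exactly the case the isinstance check above handles.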

@bay-mini

bay-mini commented Nov 5, 2024

same problem

@jjmachan
Member

jjmachan commented Nov 7, 2024

@bay-mini @Youngrok123 released with v0.2.4 🙂
