
[BUG] evals['qa_evaluator'].label is None is not the opposite of evals['qa_evaluator'].label is not None #4017

Closed
firetix opened this issue Jul 25, 2024 · 1 comment · Fixed by #4066
Assignees
Labels
bug Something isn't working c/evals

Comments

@firetix

firetix commented Jul 25, 2024

Describe the bug
I'm unable to pull the spans that have never been evaluated from Phoenix. In the UI or in Python, filtering by evals['qa_evaluator'].label is not None returns the spans that have been evaluated, but evals['qa_evaluator'].label is None does not return the spans that have not been evaluated. How can I retrieve the spans that have never been evaluated? I can currently only retrieve the ones that have been evaluated; I could filter with a script, but that would not be optimal.

Thanks for looking at this

To Reproduce
Steps to reproduce the behavior:

  1. Add a few examples of QAConversation chain
  2. Log evaluation for some of them
  3. Try to filter by evals['qa_evaluator'].label is None
    px.Client().get_spans_dataframe("evals['qa_evaluator'].label is None")

Expected behavior
The spans that have never been evaluated are returned.
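
In the meantime, a client-side workaround is to pull the evaluated spans with the working is not None filter, pull all spans, and anti-join on the span id. The sketch below uses small stand-in DataFrames in place of the real Phoenix results; the context.span_id column name and the two get_spans_dataframe calls shown in the comments are assumptions about what the client returns, not confirmed behavior.

```python
import pandas as pd

# Hypothetical stand-ins for what Phoenix would return, e.g.:
#   all_spans = px.Client().get_spans_dataframe()
#   evaluated = px.Client().get_spans_dataframe("evals['qa_evaluator'].label is not None")
all_spans = pd.DataFrame({"context.span_id": ["a", "b", "c"], "name": ["q1", "q2", "q3"]})
evaluated = pd.DataFrame({"context.span_id": ["a"], "name": ["q1"]})

# Anti-join: keep only spans whose id does not appear in the evaluated set.
never_evaluated = all_spans[~all_spans["context.span_id"].isin(evaluated["context.span_id"])]
print(never_evaluated["context.span_id"].tolist())  # ['b', 'c']
```

This does two round trips and filters in pandas, so it is not optimal for large traces, but it avoids depending on the broken is None filter.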

Screenshots
[Screenshots attached in the original issue showing the filter results in the UI]

@firetix firetix added bug Something isn't working triage issues that need triage labels Jul 25, 2024
@dosubot dosubot bot added the c/evals label Jul 25, 2024
@mikeldking mikeldking removed the triage issues that need triage label Jul 26, 2024
@mikeldking
Contributor

Thanks for the issue @firetix ! Totally understand this use case and we will address it. Appreciate you bringing it to our attention.
