
feat: reference link correctness evaluation prompt template #1771

Merged
17 commits merged into main on Nov 21, 2023

Conversation

axiomofjoy (Contributor) commented Nov 17, 2023

Introduces evaluation prompt templates for reference link correctness, which predict whether a reference document (typically retrieved via URL) is the correct document for a particular query.

Copied from #1743
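For context, an eval of this kind poses a query and the retrieved reference document to an LLM and constrains the answer to a binary label. The sketch below is purely illustrative and is not copied from this PR; the template wording, the variable names, the RAILS list, and the format_eval_prompt helper are all assumptions about how such a template might be structured.

```python
# A minimal sketch of a reference link correctness eval prompt template.
# All names and wording here are illustrative assumptions, not the exact
# template introduced in this PR.

REFERENCE_LINK_CORRECTNESS_TEMPLATE = """
You are given a user query and a reference document that was retrieved
(typically via URL) to answer that query.

[Query]: {query}
[Reference]: {reference}

Determine whether the reference document is the correct document for the
query. Answer with a single word, either "correct" or "incorrect", and
nothing else.
"""

# Rails constrain the LLM output to a fixed label set so the eval can be
# scored programmatically.
RAILS = ["correct", "incorrect"]


def format_eval_prompt(query: str, reference: str) -> str:
    """Fill the template with a single query/reference pair."""
    return REFERENCE_LINK_CORRECTNESS_TEMPLATE.format(
        query=query, reference=reference
    )


if __name__ == "__main__":
    prompt = format_eval_prompt(
        query="How do I configure retries in the client?",
        reference="This page documents the client's retry and backoff settings...",
    )
    print(prompt)
```

In practice a template like this is paired with a fixed label set (the rails) so the model's free-text answer can be parsed into a label and aggregated into a correctness score.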


@axiomofjoy changed the title from "reference link evals" to "feat: reference link correct evaluation prompt template" on Nov 17, 2023
@axiomofjoy marked this pull request as ready for review on Nov 20, 2023
@axiomofjoy changed the title from "feat: reference link correct evaluation prompt template" to "feat: reference link correctness evaluation prompt template" on Nov 20, 2023
@axiomofjoy merged commit bf731df into main on Nov 21, 2023
8 checks passed
@axiomofjoy deleted the ref-link-eval branch on Nov 21, 2023