First pass at integrating xP3 #60
base: main
Conversation
viraat
commented
Apr 14, 2023
- Add scripts to generate task constants for xP3 datasets
- Get started on a task_config generator for xP3:
- integrates the P3 config generator with the existing Hugging Face loader from the c4ai loading script: https://github.com/for-ai/benchmarking-multilingual-model/blob/neel_xp3/xp3_folder/tasks.py
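The task-constant generation described above could be sketched roughly as follows. The dataset/subset pairs and the naming scheme here are illustrative assumptions, not the actual xP3 catalog or the PR's implementation:

```python
# Hypothetical sketch of generating flat task constants for xP3 datasets.
# XP3_DATASETS and make_task_name are assumed names for illustration.

XP3_DATASETS = [
    ("xnli", "en"),
    ("xnli", "fr"),
    ("paws-x", "de"),
]

def make_task_name(ds_name, subset_name):
    """Build a flat task identifier from a dataset/subset pair."""
    name = f"{ds_name}_{subset_name}" if subset_name else ds_name
    # seqio task names only allow word characters; normalize punctuation.
    return name.replace("-", "_").replace("/", "_").lower()

XP3_TRAIN_TASKS = [make_task_name(d, s) for d, s in XP3_DATASETS]
```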
Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA). View this failed invocation of the CLA check for more information. For the most up to date status, view the checks section at the bottom of the pull request.
Added some comments to help @shayne-longpre understand where to pay attention.
This file is in progress at the moment. It's going to need a few more tweaks to get going.
I went with the same approach as P3 (T0), since it seemed to make the most sense to me. If this is the wrong path to go down, I'm happy to discuss.
I separated the file out for hackability; it can go back into one of the task_config files later.
@@ -0,0 +1,11 @@
"""Constants related to xP3"""

XP3_TRAIN_TASKS_SPLIT = {
This is incomplete; I have the script running, but it's slow.
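Since the split-generation script is slow, one option is to cache its output to JSON and reload it on later runs rather than recomputing. This is a hedged sketch, not part of the PR; the file path and `compute_splits` callable are placeholders:

```python
# Cache the slow split computation to disk; recompute only if the cache
# file is missing. compute_splits is a stand-in for the real script logic.
import json
import os

def load_or_build_splits(path, compute_splits):
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    splits = compute_splits()
    with open(path, "w") as f:
        json.dump(splits, f, indent=2)
    return splits
```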
flan/v2/task_configs_xp3.py
Outdated
continue
# Do not process T0 variants with negative examples.
# We still keep the variants of these sets in a different format.
# if "_score_eval" in subtask_id:
Right now, I've only generated the splits for the training datasets.
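The commented-out block above skips `"_score_eval"` variants (the copies of each task used for rank-classification evaluation) when building training configs. A minimal sketch of that filter, with illustrative subtask names:

```python
# Drop "_score_eval" variants so only the plain training subtasks remain.
# The subtask ids below are illustrative, not the actual xP3/P3 ids.
subtask_ids = [
    "super_glue_rte_GPT_3_style",
    "super_glue_rte_GPT_3_style_score_eval",
]

train_subtasks = [s for s in subtask_ids if "_score_eval" not in s]
```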
flan/v2/task_configs_xp3.py
Outdated
# We still keep the variants of these sets in a different format.
# if "_score_eval" in subtask_id:
#   continue
# elif constants_t0.T0_TRAIN_TASK_METADATA[task_name][
I wasn't sure how the metadata for the task was filled; this is something I'll need help with.
flan/v2/task_configs_xp3.py
Outdated
task_name=subtask_id, task_source="P3")
XP3_TASK_CONFIGS[task_name] = TaskConfig(
    source=seqio.TfdsDataSource(
        tfds_name=f"huggingface:{ds_name}/{subset_name}",
From an eye test, I think this works: https://www.tensorflow.org/datasets/community_catalog/huggingface seems to have the datasets I checked at random.
There's some custom version picking in xP3 that will need to be brought over; I plan to put that into the metadata and use it here.
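The `huggingface:` prefix in the diff is how TFDS addresses datasets mirrored from the Hugging Face Hub in its community catalog. A small sketch of how that `tfds_name` string could be assembled, with optional version pinning as an assumption for the custom version picking mentioned above (`hf_tfds_name` is a hypothetical helper, not from the PR):

```python
# Build a TFDS name for a Hugging Face community-catalog dataset, e.g.
# "huggingface:xnli/en". Version pinning via ":x.y.z" is an assumption.
def hf_tfds_name(ds_name, subset_name=None, version=None):
    name = f"huggingface:{ds_name}"
    if subset_name:
        name += f"/{subset_name}"
    if version:
        name += f":{version}"
    return name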
flan/v2/task_configs_xp3.py
Outdated
# elif constants_t0.T0_TRAIN_TASK_METADATA[task_name][
#     "task_type"] == "t0_question_answer":
#   preprocessors = [functools.partial(prep.t0, multiple_choice=False)]
# if constants_t0.T0_TRAIN_TASK_METADATA[task_name]["seq_len"]["max"] == 1:
I plan to use something similar, and to add to the metadata to do the custom preprocessing that happens in xP3 (example fn).
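A hedged sketch of the metadata-driven preprocessor selection modeled on the commented-out T0 logic above. `prep_t0` stands in for `prep.t0`, and the metadata keys mirror `T0_TRAIN_TASK_METADATA` but are assumptions for the eventual xP3 metadata:

```python
# Choose preprocessors per task based on a metadata dict, following the
# commented-out T0 pattern. prep_t0 is a placeholder for prep.t0.
import functools

def prep_t0(example, multiple_choice):
    # Placeholder preprocessor: tag the example with its task style.
    return {**example, "multiple_choice": multiple_choice}

def pick_preprocessors(metadata):
    if metadata.get("task_type") == "t0_question_answer":
        return [functools.partial(prep_t0, multiple_choice=False)]
    return [functools.partial(prep_t0, multiple_choice=True)]
```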
# pip install -q datasets
import datasets
# git clone -b tr13 https://github.com/Muennighoff/promptsource.git && cd promptsource; pip install -e .
from promptsource.templates import DatasetTemplates
I'd be happy to get rid of using promptsource here; it's just used to get all the possible prompt templates for a given dataset. I'm assuming FC has that built in already, but I wasn't sure where to look.
- Add some try/except handling to log those datasets, and put in dummy values
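The try/except fallback in that TODO could look like the following sketch: if template lookup fails for a dataset, log it, record it, and return dummy values so generation continues. `get_templates_safe`, `lookup`, and `FAILED_DATASETS` are hypothetical names, not from the PR:

```python
# Wrap a fallible template lookup: log failures, track the failing dataset,
# and fall back to a dummy value so the generation loop keeps running.
import logging

FAILED_DATASETS = []

def get_templates_safe(ds_name, lookup):
    """lookup is a callable that may raise; a stand-in for the real loader."""
    try:
        return lookup(ds_name)
    except Exception as e:
        logging.warning("No templates for %s: %s", ds_name, e)
        FAILED_DATASETS.append(ds_name)
        return ["DUMMY_TEMPLATE"]
```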