error loading corpus and train data #242

Open
MohammedAlhajji opened this issue Aug 19, 2024 · 0 comments
MohammedAlhajji commented Aug 19, 2024

I'm passing triplets to the trainer. After preparing the data, training fails with an error while reading the files generated by trainer.prepare_training_data. My corpus and queries files look good, one text per line, and my triples file looks good, a list of three numbers per line. But when I try to train, I get the following error.

My code:

trainer.prepare_training_data(train_examples, mine_hard_negatives=False)
trainer.train(batch_size=32)

Error:

[Aug 19, 09:49:51] #> Loading the queries from data/queries.train.colbert.tsv ...
Process Process-9:
Traceback (most recent call last):
  File "/home/usr/miniconda/envs/colbert2/lib/python3.11/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/home/usr/miniconda/envs/colbert2/lib/python3.11/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/usr/miniconda/envs/colbert2/lib/python3.11/site-packages/colbert/infra/launcher.py", line 134, in setup_new_process
    return_val = callee(config, *args)
                 ^^^^^^^^^^^^^^^^^^^^^
  File "/home/usr/miniconda/envs/colbert2/lib/python3.11/site-packages/colbert/training/training.py", line 43, in train
    reader = LazyBatcher(config, triples, queries, collection, (0 if config.rank == -1 else config.rank), config.nranks)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/usr/miniconda/envs/colbert2/lib/python3.11/site-packages/colbert/training/lazy_batcher.py", line 28, in __init__
    self.queries = Queries.cast(queries)
                   ^^^^^^^^^^^^^^^^^^^^^
  File "/home/usr/miniconda/envs/colbert2/lib/python3.11/site-packages/colbert/data/queries.py", line 113, in cast
    return cls(path=obj)
           ^^^^^^^^^^^^^
  File "/home/usr/miniconda/envs/colbert2/lib/python3.11/site-packages/colbert/data/queries.py", line 17, in __init__
    self._load_data(data) or self._load_file(path)
                             ^^^^^^^^^^^^^^^^^^^^^
  File "/home/usr/miniconda/envs/colbert2/lib/python3.11/site-packages/colbert/data/queries.py", line 52, in _load_file
    self.data = load_queries(path)
                ^^^^^^^^^^^^^^^^^^
  File "/home/usr/miniconda/envs/colbert2/lib/python3.11/site-packages/colbert/evaluation/loaders.py", line 22, in load_queries
    qid, query, *_ = line.strip().split('\t')
    ^^^^^^^^^^^^^^
ValueError: not enough values to unpack (expected at least 2, got 1)
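
The final ValueError suggests that at least one line in data/queries.train.colbert.tsv contains no tab separator, so the loader cannot split it into a qid and a query. A quick sketch I used to locate such lines (path taken from the log above; adjust if yours differs):

# Scan the generated queries TSV for lines the loader would fail on,
# i.e. lines without at least one tab between qid and query text.
path = "data/queries.train.colbert.tsv"
with open(path, encoding="utf-8") as f:
    for lineno, line in enumerate(f, start=1):
        if len(line.strip().split("\t")) < 2:
            print(f"Malformed line {lineno}: {line!r}")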

An odd observation: when I pass only train_examples[:1000] instead of the whole set, it seems to work and training starts. train_examples is a list of triplet tuples.
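
Since the 1,000-example subset trains fine but the full set does not, my guess (an assumption, not verified) is that some query or passage later in train_examples contains an embedded newline or tab, which breaks the one-record-per-line format of the generated TSVs. A possible workaround, assuming the triples hold raw strings, is to normalize whitespace before preparing the data:

# Hypothetical clean-up: collapse tabs/newlines in every text field so each
# record stays on a single TSV line. Assumes train_examples is a list of
# (query, positive, negative) string tuples; adapt if yours holds ids.
def clean(field):
    return " ".join(field.split()) if isinstance(field, str) else field

train_examples = [tuple(clean(f) for f in triple) for triple in train_examples]
trainer.prepare_training_data(train_examples, mine_hard_negatives=False)
trainer.train(batch_size=32)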
