TypeError with RandAugmentIterDataPipe class during training #45
Comments
Which Python version are you using to execute the script? And do all the package versions match the installation instructions?
I am using Python 3.9.18 and I have followed the installation instructions closely.
Also, for more information: I am importing dill because without it I hit another error, a traceback originating from the class _DataPipeSerializationWrapper.
Is using dill just passing the problem along, or is this a class inheritance problem?
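If I read torch's datapipes/datapipe.py correctly, the wrapper only falls back to dill when plain pickling of the datapipe fails, roughly like this (a paraphrase of _DataPipeSerializationWrapper.__getstate__, not the verbatim source; the helper name serialize_datapipe is mine):

import pickle

try:
    import dill
    HAS_DILL = True
except ImportError:
    HAS_DILL = False

def serialize_datapipe(datapipe):
    # Paraphrased behaviour: try the standard pickle first; only if that
    # raises, and dill is installed, fall back to dill. dill may serialize
    # a class by value, so the worker process can receive a duplicate of
    # the class object instead of importing the original one.
    try:
        return pickle.dumps(datapipe), False
    except Exception:
        if HAS_DILL:
            return dill.dumps(datapipe), True
        raise

So having dill installed does not just silence the first error; it changes how the datapipe (and potentially its class) is shipped to the workers.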
Very strange. Two follow-up questions.
I am using Windows with the Python virtual environment set up as instructed, and I am not using a Jupyter notebook. Below is the command I used to run the training. It is mostly the same as in the instructions, but I changed the batch sizes.
I cannot reproduce this on my machine. Is it possible for you to create a minimal reproducible example? That would help a ton with debugging.
The following fix should work: #54 (comment)
Hi,
I am trying to run training but I am facing a TypeError when RandAugmentIterDataPipe calls super().__init__():
class RandAugmentIterDataPipe(IterDataPipe):
    def __init__(self, source_dp: IterDataPipe, dataset_config: DictConfig):
        super().__init__()
        self.source_dp = source_dp
Below is the error message:
Original Traceback (most recent call last):
  File "C:\Users\lkxv3\miniconda3\envs\rvt\lib\site-packages\torch\utils\data\_utils\worker.py", line 252, in _worker_loop
    fetcher = _DatasetKind.create_fetcher(dataset_kind, dataset, auto_collation, collate_fn, drop_last)
  File "C:\Users\lkxv3\miniconda3\envs\rvt\lib\site-packages\torch\utils\data\dataloader.py", line 80, in create_fetcher
    return _utils.fetch._IterableDatasetFetcher(dataset, auto_collation, collate_fn, drop_last)
  File "C:\Users\lkxv3\miniconda3\envs\rvt\lib\site-packages\torch\utils\data\_utils\fetch.py", line 21, in __init__
    self.dataset_iter = iter(dataset)
  File "C:\Users\lkxv3\miniconda3\envs\rvt\lib\site-packages\torch\utils\data\datapipes\_hook_iterator.py", line 230, in wrap_iter
    iter_ret = func(*args, **kwargs)
  File "C:\Users\lkxv3\miniconda3\envs\rvt\lib\site-packages\torch\utils\data\datapipes\datapipe.py", line 364, in __iter__
    self._datapipe_iter = iter(self._datapipe)
  File "C:\Users\lkxv3\miniconda3\envs\rvt\lib\site-packages\torch\utils\data\datapipes\_hook_iterator.py", line 230, in wrap_iter
    iter_ret = func(*args, **kwargs)
  File "C:\Users\lkxv3\OneDrive\Desktop\CP5105\original\RVT\data\utils\stream_concat_datapipe.py", line 103, in __iter__
    return iter(self._get_zipped_streams_with_worker_id())
  File "C:\Users\lkxv3\OneDrive\Desktop\CP5105\original\RVT\data\utils\stream_concat_datapipe.py", line 97, in _get_zipped_streams_with_worker_id
    zipped_stream = self._get_zipped_streams(datapipe_list=self.datapipe_list, batch_size=self.batch_size)
  File "C:\Users\lkxv3\OneDrive\Desktop\CP5105\original\RVT\data\utils\stream_concat_datapipe.py", line 70, in _get_zipped_streams
    streams = Zipper((Concater((self.augmentation_dp(x.to_iter_datapipe())
  File "C:\Users\lkxv3\OneDrive\Desktop\CP5105\original\RVT\data\utils\stream_concat_datapipe.py", line 70, in <genexpr>
    streams = Zipper((Concater((self.augmentation_dp(x.to_iter_datapipe())
  File "C:\Users\lkxv3\OneDrive\Desktop\CP5105\original\RVT\data\utils\stream_concat_datapipe.py", line 70, in <genexpr>
    streams = Zipper((Concater((self.augmentation_dp(x.to_iter_datapipe())
  File "C:\Users\lkxv3\OneDrive\Desktop\CP5105\original\RVT\data\genx_utils\sequence_for_streaming.py", line 190, in __init__
    super().__init__()
TypeError: super(type, obj): obj must be an instance or subtype of type
I tried to resolve it by explicitly calling super(RandAugmentIterDataPipe, self).__init__() instead, but it results in the same error.
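For what it's worth, the error message itself can be reproduced in isolation whenever the class object that the zero-argument super() refers to is no longer the class (or a base class) of the instance, e.g. because the class object was re-created during deserialization in a worker process (a minimal sketch, independent of the RVT code):

class Base:
    pass

class Child(Base):
    def __init__(self):
        super().__init__()  # zero-argument super() closes over this Child

OldChild = Child  # keep a handle on the original class object

class Child(Base):  # re-create the class, as a reload or dill round-trip might
    def __init__(self):
        super().__init__()

obj = OldChild()     # fine: type(obj) matches the class super() expects
Child.__init__(obj)  # TypeError: super(type, obj): obj must be an instance or subtype of type

If dill serializes the datapipe class by value when sending it to the workers, they could end up in exactly this situation.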
Could you help identify what is wrong here?
Thank you.