
Characters Exceed error #9

Closed
salmanahmed1993 opened this issue Aug 19, 2020 · 2 comments

Comments

@salmanahmed1993

Hi there,
When I try to run the following command, the error below pops up:
python -m lit_nlp.examples.quickstart_sst_demo --port=5432

Traceback (most recent call last):
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\tensorflow_datasets\core\download\extractor.py", line 96, in _sync_extract
_copy(handle, dst_path)
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\tensorflow_datasets\core\download\extractor.py", line 120, in _copy
tf.io.gfile.makedirs(os.path.dirname(dest_path))
File "C:\Users\SB00790107\AppData\Roaming\Python\Python37\site-packages\tensorflow_core\python\lib\io\file_io.py", line 453, in recursive_create_dir_v2
pywrap_tensorflow.RecursivelyCreateDir(compat.as_bytes(path))
tensorflow.python.framework.errors_impl.NotFoundError: Failed to create a directory: C:\Users\SB00790107\tensorflow_datasets\downloads\extracted/ZIP.fire.goog.com_v0_b_mtl-sent-repr.apps.cowOhVrpNUsvqdZqI70Nq3ISu63l9SOhTqYqoz6uEW3-Y.zipalt=media&token=aabc5f6b-e466-44a2-b9b4-cf6337f84ac8.incomplete_411bf32dcb704a46b9e93a63fb1aae2c/SST-2; No such file or directory

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\runpy.py", line 193, in _run_module_as_main
"main", mod_spec)
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "C:~\lit\lit_nlp\examples\pretrained_lm_demo.py", line 102, in
app.run(main)
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\absl\app.py", line 299, in run
_run_main(main, args)
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\absl\app.py", line 250, in _run_main
sys.exit(main(argv))
File "C:~\lit\lit_nlp\examples\pretrained_lm_demo.py", line 77, in main
"sst_dev": glue.SST2Data("validation").remap({"sentence": "text"}),
File "C:~\lit\lit_nlp\examples\datasets\glue.py", line 62, in init
for ex in load_tfds('glue/sst2', split=split):
File "C:~\lit\lit_nlp\examples\datasets\glue.py", line 21, in load_tfds
ret = list(tfds.as_numpy(tfds.load(*args, download=True, try_gcs=True, **kw)))
File "C:\Users\SB00790107\AppData\Roaming\Python\Python37\site-packages\wrapt\wrappers.py", line 567, in call
args, kwargs)
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\tensorflow_datasets\core\api_utils.py", line 69, in disallow_positional_args_dec
return fn(*args, **kwargs)
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\tensorflow_datasets\core\registered.py", line 371, in load
dbuilder.download_and_prepare(**download_and_prepare_kwargs)
File "C:\Users\SB00790107\AppData\Roaming\Python\Python37\site-packages\wrapt\wrappers.py", line 606, in call
args, kwargs)
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\tensorflow_datasets\core\api_utils.py", line 69, in disallow_positional_args_dec
return fn(*args, **kwargs)
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\tensorflow_datasets\core\dataset_builder.py", line 376, in download_and_prepare
download_config=download_config)
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\tensorflow_datasets\core\dataset_builder.py", line 1019, in _download_and_prepare
max_examples_per_split=download_config.max_examples_per_split,
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\tensorflow_datasets\core\dataset_builder.py", line 939, in _download_and_prepare
dl_manager, **split_generators_kwargs):
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\tensorflow_datasets\text\glue.py", line 452, in _split_generators
dl_dir = dl_manager.download_and_extract(self.builder_config.data_url)
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\tensorflow_datasets\core\download\download_manager.py", line 604, in download_and_extract
return _map_promise(self._download_extract, url_or_urls)
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\tensorflow_datasets\core\download\download_manager.py", line 641, in _map_promise
res = tf.nest.map_structure(_wait_on_promise, all_promises)
File "C:\Users\SB00790107\AppData\Roaming\Python\Python37\site-packages\tensorflow_core\python\util\nest.py", line 535, in map_structure
structure[0], [func(*x) for x in entries],
File "C:\Users\SB00790107\AppData\Roaming\Python\Python37\site-packages\tensorflow_core\python\util\nest.py", line 535, in
structure[0], [func(*x) for x in entries],
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\tensorflow_datasets\core\download\download_manager.py", line 635, in _wait_on_promise
return p.get()
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\promise\promise.py", line 512, in get
return self._target_settled_value(_raise=True)
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\promise\promise.py", line 516, in _target_settled_value
return self._target()._settled_value(_raise)
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\promise\promise.py", line 226, in _settled_value
reraise(type(raise_val), raise_val, self._traceback)
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\six.py", line 703, in reraise
raise value
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\promise\promise.py", line 844, in handle_future_result
resolve(future.result())
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\concurrent\futures_base.py", line 428, in result
return self.__get_result()
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\concurrent\futures_base.py", line 384, in __get_result
raise self._exception
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\concurrent\futures\thread.py", line 57, in run
result = self.fn(*self.args, **self.kwargs)
File "C:\Users\SB00790107\AppData\Local\Continuum\anaconda3\envs\lit-nlp\lib\site-packages\tensorflow_datasets\core\download\extractor.py", line 108, in _sync_extract
raise ExtractError(msg)
tensorflow_datasets.core.download.extractor.ExtractError: Error while extracting C:\Users\SB00790107\tensorflow_datasets\downloads\fire.goog.com_v0_b_mtl-sent-repr.apps.cowOhVrpNUsvqdZqI70Nq3ISu63l9SOhTqYqoz6uEW3-Y.zipalt=media&token=aabc5f6b-e466-44a2-b9b4-cf6337f84ac8 to C:\Users\SB00790107\tensorflow_datasets\downloads\extracted\ZIP.fire.goog.com_v0_b_mtl-sent-repr.apps.cowOhVrpNUsvqdZqI70Nq3ISu63l9SOhTqYqoz6uEW3-Y.zipalt=media&token=aabc5f6b-e466-44a2-b9b4-cf6337f84ac8 (file: SST-2\dev.tsv) : Failed to create a directory: C:\Users\SB00790107\tensorflow_datasets\downloads\extracted/ZIP.fire.goog.com_v0_b_mtl-sent-repr.apps.cowOhVrpNUsvqdZqI70Nq3ISu63l9SOhTqYqoz6uEW3-Y.zipalt=media&token=aabc5f6b-e466-44a2-b9b4-cf6337f84ac8.incomplete_411bf32dcb704a46b9e93a63fb1aae2c/SST-2; No such file or directory
On windows, path lengths greater than 260 characters may result in an error. See the doc to remove the limitation: https://docs.python.org/3/using/windows.html#removing-the-max-path-limitation
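
A quick way to check from Python whether the 260-character limit has already been lifted on a given machine is to read the LongPathsEnabled value from the registry. This is only a sketch, assuming Windows 10 or later; winreg is in the standard library, and the helper name below is made up for illustration:

import winreg

def long_paths_enabled() -> bool:
    # True if the Win32 long-path support flag is set (Windows 10 and later).
    key_path = r"SYSTEM\CurrentControlSet\Control\FileSystem"
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
            value, _ = winreg.QueryValueEx(key, "LongPathsEnabled")
            return value == 1
    except FileNotFoundError:
        # A missing value means the 260-character limit is still in effect.
        return False

print("Win32 long paths enabled:", long_paths_enabled())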

@tolga-b
Collaborator

tolga-b commented Aug 19, 2020

Hi salmanahmed1993,
The error seems to happen outside of LIT, while the tensorflow_datasets package is downloading the dataset. I personally have not tried using LIT on Windows, but it looks like the directory path is longer than 260 characters, which makes the dataset download fail.
Have you tried removing the max path length limitation as suggested in https://docs.python.org/3/using/windows.html#removing-the-max-path-limitation? I believe that may help if you are using Windows 10 with an NTFS drive.
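
If changing the OS setting is not possible, a workaround worth trying is to keep the TFDS paths short in the first place. tfds.load accepts a data_dir argument, and tensorflow_datasets also appears to honour a TFDS_DATA_DIR environment variable, so pointing it at a very short directory such as C:\tfds should keep the extracted paths well under 260 characters. A minimal sketch, assuming C:\tfds exists and is writable (the directory name is just an example):

import tensorflow_datasets as tfds

SHORT_DATA_DIR = r"C:\tfds"  # example short path; any short writable directory works

# Same dataset that LIT's glue.py loads, but with an explicit short data_dir.
ds = tfds.load(
    "glue/sst2",
    data_dir=SHORT_DATA_DIR,
    download=True,
    try_gcs=True,
)

Because the tfds.load call in glue.py shown in the traceback does not pass data_dir, setting the TFDS_DATA_DIR environment variable before launching the demo is probably the less invasive way to apply this.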

@salmanahmed1993
Author

  1. Open the Local Group Policy Editor.

  2. In the left pane of the Local Group Policy Editor, navigate to the location below:

Computer Configuration\Administrative Templates\System\Filesystem

  3. In the right pane of Filesystem in the Local Group Policy Editor, double click/tap on the Enable Win32 long paths policy to edit it.

  4. Do step 5 (enable) or step 6 (disable) below for what you would like to do.

  5. To Enable Win32 Long Paths: select (dot) Enabled, click/tap on OK, and go to step 7 below.

  6. To Disable Win32 Long Paths: select (dot) Not Configured or Disabled, click/tap on OK, and go to step 7 below. (Note: Not Configured is the default setting.)

(Screenshot: Enable or Disable Win32 Long Paths in Windows 10 - win32_long_paths_gpedit-2.png)

  7. When finished, you can close the Local Group Policy Editor if you like.
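
On Windows editions without the Group Policy Editor (e.g. Windows 10 Home), the same policy can reportedly be set directly in the registry. A sketch of that alternative, assuming it is run from an elevated (administrator) Python prompt; a sign-out or reboot may be required before it takes effect:

import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\FileSystem"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE) as key:
    # 1 enables Win32 long paths; 0 restores the 260-character limit.
    winreg.SetValueEx(key, "LongPathsEnabled", 0, winreg.REG_DWORD, 1)

print("LongPathsEnabled set to 1")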

aryan1107 added a commit that referenced this issue Aug 2, 2022