Error while running requirements.txt file #46

Closed · promiseve opened this issue Oct 27, 2022 · 2 comments

promiseve commented Oct 27, 2022

Ran:

```
!pip install -r requirements.txt
```

and got this error:

```
Building wheels for collected packages: tokenizers
Building wheel for tokenizers (pyproject.toml) ... error
error: subprocess-exited-with-error

× Building wheel for tokenizers (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [51 lines of output]
running bdist_wheel
running build
running build_py
creating build
creating build/lib.linux-x86_64-cpython-310
creating build/lib.linux-x86_64-cpython-310/tokenizers
copying py_src/tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-310/tokenizers
creating build/lib.linux-x86_64-cpython-310/tokenizers/models
copying py_src/tokenizers/models/__init__.py -> build/lib.linux-x86_64-cpython-310/tokenizers/models
creating build/lib.linux-x86_64-cpython-310/tokenizers/decoders
copying py_src/tokenizers/decoders/__init__.py -> build/lib.linux-x86_64-cpython-310/tokenizers/decoders
creating build/lib.linux-x86_64-cpython-310/tokenizers/normalizers
copying py_src/tokenizers/normalizers/__init__.py -> build/lib.linux-x86_64-cpython-310/tokenizers/normalizers
creating build/lib.linux-x86_64-cpython-310/tokenizers/pre_tokenizers
copying py_src/tokenizers/pre_tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-310/tokenizers/pre_tokenizers
creating build/lib.linux-x86_64-cpython-310/tokenizers/processors
copying py_src/tokenizers/processors/__init__.py -> build/lib.linux-x86_64-cpython-310/tokenizers/processors
creating build/lib.linux-x86_64-cpython-310/tokenizers/trainers
copying py_src/tokenizers/trainers/__init__.py -> build/lib.linux-x86_64-cpython-310/tokenizers/trainers
creating build/lib.linux-x86_64-cpython-310/tokenizers/implementations
copying py_src/tokenizers/implementations/__init__.py -> build/lib.linux-x86_64-cpython-310/tokenizers/implementations
copying py_src/tokenizers/implementations/base_tokenizer.py -> build/lib.linux-x86_64-cpython-310/tokenizers/implementations
copying py_src/tokenizers/implementations/bert_wordpiece.py -> build/lib.linux-x86_64-cpython-310/tokenizers/implementations
copying py_src/tokenizers/implementations/byte_level_bpe.py -> build/lib.linux-x86_64-cpython-310/tokenizers/implementations
copying py_src/tokenizers/implementations/char_level_bpe.py -> build/lib.linux-x86_64-cpython-310/tokenizers/implementations
copying py_src/tokenizers/implementations/sentencepiece_bpe.py -> build/lib.linux-x86_64-cpython-310/tokenizers/implementations
copying py_src/tokenizers/implementations/sentencepiece_unigram.py -> build/lib.linux-x86_64-cpython-310/tokenizers/implementations
creating build/lib.linux-x86_64-cpython-310/tokenizers/tools
copying py_src/tokenizers/tools/__init__.py -> build/lib.linux-x86_64-cpython-310/tokenizers/tools
copying py_src/tokenizers/tools/visualizer.py -> build/lib.linux-x86_64-cpython-310/tokenizers/tools
copying py_src/tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-310/tokenizers
copying py_src/tokenizers/models/__init__.pyi -> build/lib.linux-x86_64-cpython-310/tokenizers/models
copying py_src/tokenizers/decoders/__init__.pyi -> build/lib.linux-x86_64-cpython-310/tokenizers/decoders
copying py_src/tokenizers/normalizers/__init__.pyi -> build/lib.linux-x86_64-cpython-310/tokenizers/normalizers
copying py_src/tokenizers/pre_tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-310/tokenizers/pre_tokenizers
copying py_src/tokenizers/processors/__init__.pyi -> build/lib.linux-x86_64-cpython-310/tokenizers/processors
copying py_src/tokenizers/trainers/__init__.pyi -> build/lib.linux-x86_64-cpython-310/tokenizers/trainers
copying py_src/tokenizers/tools/visualizer-styles.css -> build/lib.linux-x86_64-cpython-310/tokenizers/tools
running build_ext
running build_rust
error: can't find Rust compiler

  If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.

  To update pip, run:

      pip install --upgrade pip

  and then retry package installation.

  If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs/) is the recommended way to download and update the Rust compiler toolchain.
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
```
Has anyone encountered this?
The issue is not fixed when I upgrade pip or attempt to install a Rust compiler.
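
For reference, the two fixes the error message suggests boil down to commands like these (a sketch: the rustup line is the standard installer from https://rustup.rs/, and the `source` step assumes a bash-like shell):

```
# Option 1: upgrade pip so it can install a prebuilt tokenizers wheel, then retry
pip install --upgrade pip
pip install -r requirements.txt

# Option 2: install the Rust toolchain so tokenizers can build from source
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source "$HOME/.cargo/env"   # put rustc/cargo on PATH in the current shell
pip install -r requirements.txt
```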

@Alymostafa

Try working in a new environment and installing the transformers library again. Also, make sure you can load and import pyarrow.
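
A minimal sketch of that suggestion, assuming a Unix-like shell and `venv` (the environment name `fresh-env` is arbitrary; conda would work the same way):

```
# create and activate a clean environment
python -m venv fresh-env
source fresh-env/bin/activate

# reinstall the libraries there
pip install --upgrade pip
pip install transformers pyarrow

# quick check that pyarrow imports cleanly
python -c "import pyarrow; print(pyarrow.__version__)"
```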

lvwerra (Member) commented Dec 7, 2022

This seems like an issue with the tokenizers library. Can you install it with `pip install tokenizers` alone?
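
That is, isolate the failing dependency (the `-v` flag for a verbose build log is an optional addition, not part of the original suggestion):

```
pip install -v tokenizers
```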
