⬆️ bump to latest transformers release
Signed-off-by: Joe Runde <Joseph.Runde@ibm.com>
joerunde committed Apr 18, 2024
1 parent ae97409 commit ec34f4f
Showing 2 changed files with 1 addition and 4 deletions.
3 changes: 0 additions & 3 deletions Dockerfile
@@ -281,9 +281,6 @@ RUN --mount=type=bind,from=flash-att-v2-cache,src=/usr/src/flash-attention-v2,ta
 # or are using a PyTorch nightly version
 RUN pip install auto-gptq=="${AUTO_GPTQ_VERSION}" --no-cache-dir
 
-# Install pre-release version of transformers
-RUN pip install tokenizers==0.19.1 git+https://github.com/huggingface/transformers.git@ec92f983af5295fc92414a37b988d8384785988a
-
 # Install server
 # git is required to pull the fms-extras dependency
 RUN dnf install -y git && dnf clean all
2 changes: 1 addition & 1 deletion server/pyproject.toml
@@ -22,7 +22,7 @@ safetensors = "^0.4.3"
 sentencepiece = "^0.2.0"
 datasets = { version = "^2.15.0", optional = true }
 texttable = { version = "^1.7.0", optional = true }
-#transformers = "4.38.2"
+transformers = "4.40.0"
 optimum = { version = "^1.18.0", extras = ["onnxruntime-gpu"], optional = true }
 onnxruntime = { version = "^1.17.1", optional = true }
 onnxruntime-gpu = { version = "^1.17.1", optional = true }
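The pyproject.toml change pins transformers to an exact release: in Poetry, a bare version string such as "4.40.0" means exactly that version, while the surrounding caret constraints (e.g. "^1.17.1") allow compatible upgrades up to the next leftmost non-zero component. As a minimal sketch of that caret rule (the helper `caret_allows` is hypothetical, not part of this repository):

```python
def caret_allows(constraint: str, version: str) -> bool:
    """Rough sketch of Poetry's caret semantics.

    ^1.2.3 allows >=1.2.3,<2.0.0; ^0.4.3 allows >=0.4.3,<0.5.0:
    the upper bound bumps the leftmost non-zero version component.
    """
    base = tuple(int(p) for p in constraint.lstrip("^").split("."))
    v = tuple(int(p) for p in version.split("."))
    # Find the leftmost non-zero component (or the last one) and bump it.
    for i, part in enumerate(base):
        if part != 0 or i == len(base) - 1:
            upper = base[:i] + (part + 1,) + (0,) * (len(base) - i - 1)
            break
    # Tuple comparison gives lexicographic version ordering.
    return base <= v < upper

print(caret_allows("^0.4.3", "0.4.9"))   # True: below 0.5.0
print(caret_allows("^0.4.3", "0.5.0"))   # False: upper bound reached
print(caret_allows("^2.0.0", "2.15.0"))  # True: datasets pin allows 2.15.x
```

By contrast, the new exact pin `transformers = "4.40.0"` accepts only that single version, which is why the pre-release git install could be dropped from the Dockerfile.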
