Commit f802536

update "transformers" version

rachtibat committed Nov 11, 2024
1 parent 6289a90
Showing 2 changed files with 3 additions and 3 deletions.
2 changes: 1 addition & 1 deletion README.md

````diff
@@ -62,7 +62,7 @@ $ git clone https://github.com/rachtibat/LRP-eXplains-Transformers
 $ pip install ./LRP-eXplains-Transformers
 ```
 
-Tested with ``transformers==4.44.0``, ``torch==2.1.0``, ``python==3.11``
+Tested with ``transformers==4.46.2``, ``torch==2.1.0``, ``python==3.11``
 
 ### 💡 How does the code work?
 Layer-wise Relevance Propagation is a rule-based backpropagation algorithm. This means that we can implement LRP in a single backward pass!
````
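The README context above states that LRP, being a rule-based backpropagation algorithm, can be implemented in a single backward pass. As a minimal sketch of that idea (this is not lxt's actual code; `EpsilonLinear` and every name in it are invented for the example), the epsilon-LRP rule for one linear layer can be written as a modified gradient, so that a plain `backward()` followed by `input * input.grad` yields relevance scores:

```python
import torch
import torch.nn.functional as F

class EpsilonLinear(torch.autograd.Function):
    """Linear layer whose backward pass applies the epsilon-LRP rule,
    so that `activation * activation.grad` equals the relevance at
    every layer after a single ordinary backward() call."""

    @staticmethod
    def forward(ctx, x, weight, bias, eps=1e-6):
        z = F.linear(x, weight, bias)  # z_j = sum_i w_ji * x_i + b_j
        ctx.save_for_backward(x, weight, z)
        ctx.eps = eps
        return z

    @staticmethod
    def backward(ctx, grad_output):
        x, weight, z = ctx.saved_tensors
        # Stabilized denominator: z_j + eps * sign(z_j)
        sign = torch.where(z >= 0, torch.ones_like(z), -torch.ones_like(z))
        zs = z + ctx.eps * sign
        # Epsilon rule as a modified gradient: rescale the incoming
        # "relevance quotient" by z / zs, then apply the standard
        # linear backward. weight, bias, and eps need no gradients.
        grad_input = (grad_output * z / zs) @ weight
        return grad_input, None, None, None

# Usage: relevance of the input features w.r.t. one output neuron.
x = torch.randn(1, 8, requires_grad=True)
weight, bias = torch.randn(4, 8), torch.zeros(4)
out = EpsilonLinear.apply(x, weight, bias)

seed = torch.zeros_like(out)
seed[0, 2] = 1.0                   # explain output neuron 2
out.backward(seed)                 # a single backward pass

relevance = x * x.grad             # epsilon-LRP relevance per input feature
print(relevance.sum(), out[0, 2])  # approximately conserved (bias aside)
```

The same principle, applied with dedicated rules for attention and normalization layers, is what lets a library like lxt explain a full transformer in one backward pass.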
4 changes: 2 additions & 2 deletions setup.py

```diff
@@ -6,10 +6,10 @@
 
 setuptools.setup(
     name='lxt',
-    version='0.6.0',
+    version='0.6.1',
     install_requires=[
         'torch',
-        'transformers',
+        'transformers>=4.46.2',
         'accelerate',
         'tabulate',
         'matplotlib',
```
