
Fixed issue with torch 1.12 where arange does not support fp16 for CPU device. #1574

Merged 3 commits into master from feature/SG-000-fix-arange-dtype on Oct 26, 2023

Conversation

BloodAxe (Contributor)

On torch 1.12, arange_cpu is not implemented for fp16.
The fix is to create the arange tensor with fp32 and then cast it to fp16.
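The described workaround can be sketched as follows (a minimal illustration; the helper name `arange_fp16_safe` is hypothetical, not the function used in the actual patch):

```python
import torch


def arange_fp16_safe(end: int, device: str = "cpu") -> torch.Tensor:
    # On torch 1.12, torch.arange(end, dtype=torch.float16) raises a
    # RuntimeError on CPU because "arange_cpu" is not implemented for Half.
    # Workaround: build the range in fp32, then cast the result to fp16.
    return torch.arange(end, dtype=torch.float32, device=device).to(torch.float16)
```

The fp32-then-cast approach is safe for the small integer ranges typically used here, since every integer up to 2048 is exactly representable in fp16.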

@Louis-Dupont (Contributor) left a comment:

LGTM

@BloodAxe BloodAxe merged commit 1f15c76 into master Oct 26, 2023
3 checks passed
@BloodAxe BloodAxe deleted the feature/SG-000-fix-arange-dtype branch October 26, 2023 08:56
BloodAxe added a commit that referenced this pull request Oct 26, 2023
* [Improvement] max_batches support to training log and tqdm progress bar. (#1554)

* Added max_batches support to training log and tqdm progress bar.

* Changed the logged string according to which parameter is in effect (len(loader) or max_batches)

* Replaced the stopping condition for the epoch with the smaller of the two limits

(cherry picked from commit 749a9c7)

* fix (#1558)

Co-authored-by: Eugene Khvedchenya <ekhvedchenya@gmail.com>
(cherry picked from commit 8a1d255)

* fix (#1564)

(cherry picked from commit 24798b0)

* Bugfix of model.export() to work correct with bs>1 (#1551)

(cherry picked from commit 0515496)

* Fixed incorrect automatic variable used (#1565)

$@ is the name of the target being generated, and $^ is the list of its dependencies

Co-authored-by: Louis-Dupont <35190946+Louis-Dupont@users.noreply.github.com>
(cherry picked from commit 43f8bea)

* fix typo in class documentation (#1548)

Co-authored-by: Eugene Khvedchenya <ekhvedchenya@gmail.com>
Co-authored-by: Louis-Dupont <35190946+Louis-Dupont@users.noreply.github.com>
(cherry picked from commit ec21383)

* Feature/sg 1198 mixed precision automatically changed with warning (#1567)

* fix

* work with tmpdir

* minor change of comment

* improve device_config

(cherry picked from commit 34fda6c)

* Fixed issue with torch 1.12 where _scale_fn_ref is missing in CyclicLR (#1575)

(cherry picked from commit 23b4f7a)

* Fixed issue with torch 1.12 where arange does not support fp16 for CPU device. (#1574)

(cherry picked from commit 1f15c76)

---------

Co-authored-by: hakuryuu96 <marchenkophilip@gmail.com>
Co-authored-by: Louis-Dupont <35190946+Louis-Dupont@users.noreply.github.com>
Co-authored-by: Alessandro Ros <aler9.dev@gmail.com>