
Lazy load DTypeFloatTorch #173

Merged
3 commits merged into main from refactor/lazy-load-torch-numerical-types on Apr 3, 2024

Conversation

@rjavadi (Contributor) commented on Mar 14, 2024:

As discussed in the previous PRs, we need to import baybe.utils.numerical lazily to completely get rid of the eager torch import.

I didn't touch the surrogates because those were already fixed in a previous PR.
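
For readers unfamiliar with the pattern: a lazy import defers `import torch` until the attribute is actually needed, so `import baybe` alone stays torch-free. Below is a minimal sketch using PEP 562's module-level `__getattr__`; it illustrates the technique only and is not the actual baybe implementation (the returned dtype value is an assumption).

```python
# Minimal sketch of the lazy-import pattern (PEP 562 module-level
# __getattr__). Illustration only, not the actual baybe code.

def __getattr__(name: str):
    """Import torch only when DTypeFloatTorch is first accessed,
    so that importing this module does not pull in torch."""
    if name == "DTypeFloatTorch":
        import torch  # deferred import: `import baybe` stays torch-free

        return torch.float64  # assumed value; the real dtype may differ
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")
```

With something like this in baybe/utils/torch.py, `from baybe.utils.torch import DTypeFloatTorch` triggers the torch import, while importing the package alone does not.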

@rjavadi force-pushed the refactor/lazy-load-torch-numerical-types branch 2 times, most recently from d23da76 to 78bc282, on March 28, 2024 at 15:52
@Scienfitz added the enhancement label (Expand / change existing functionality) on Apr 3, 2024
@AdrianSosic changed the title from "Lazy load baybe.utils.numerical" to "Lazy load DTypeFloatTorch" on Apr 3, 2024
@AdrianSosic (Collaborator) left a review comment:

See comment below

Review thread on baybe/utils/torch.py (resolved)
@AdrianSosic force-pushed the refactor/lazy-load-torch-numerical-types branch from 5b42a6a to 1891920 on April 3, 2024 at 07:56
@AdrianSosic force-pushed the refactor/lazy-load-torch-numerical-types branch from 1891920 to abdfe38 on April 3, 2024 at 10:55
@AdrianSosic merged commit 82bed24 into main on Apr 3, 2024 (10 checks passed)
@AdrianSosic deleted the refactor/lazy-load-torch-numerical-types branch on April 3, 2024 at 11:15
Labels: enhancement (Expand / change existing functionality)
Projects: none yet
4 participants