Add automatic differentiation algorithm #10977
Conversation
Automated review generated by algorithms-keeper. If there's any problem regarding this review, please open an issue about it.
algorithms-keeper commands and options

algorithms-keeper actions can be triggered by commenting on this PR:

@algorithms-keeper review - to trigger the checks for only added pull request files
@algorithms-keeper review-all - to trigger the checks for all the pull request files, including the modified files. As we cannot post review comments on lines not part of the diff, this command will post all the messages in one comment.

NOTE: Commands are in beta and so this feature is restricted only to a member or owner of the organization.
machine_learning/auto-diff.py (Outdated)

    will be calculated.
    """

    def __init__(self, value):
Please provide return type hint for the function: __init__. If the function does not return a value, please provide the type hint as: def function() -> None:
Please provide type hint for the parameter: value
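A minimal sketch of what the requested hints could look like; the np.ndarray annotation is an assumption about what value is meant to accept:

    import numpy as np


    class Variable:
        def __init__(self, value: np.ndarray) -> None:
            # Store the wrapped array; operations and gradients act on it later.
            self.value = value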
machine_learning/auto-diff.py (Outdated)

        )
        return result

    def add_param_to(self, param_to: Operation):
Please provide return type hint for the function: add_param_to. If the function does not return a value, please provide the type hint as: def function() -> None:
machine_learning/auto-diff.py (Outdated)

    def add_param_to(self, param_to: Operation):
        self.param_to.append(param_to)

    def add_result_of(self, result_of: Operation):
Please provide return type hint for the function: add_result_of. If the function does not return a value, please provide the type hint as: def function() -> None:
machine_learning/auto-diff.py (Outdated)

    objects and pointer to resulting Variable from the operation.
    """

    def __init__(
Please provide return type hint for the function: __init__. If the function does not return a value, please provide the type hint as: def function() -> None:
machine_learning/auto-diff.py (Outdated)

        self.output: Variable | None = None
        self.other_params = {} if other_params is None else other_params

    def add_params(self, params: list[Variable]):
Please provide return type hint for the function: add_params. If the function does not return a value, please provide the type hint as: def function() -> None:
machine_learning/auto-diff.py (Outdated)

    based on the computation graph.
    """

    def __new__(cls):
Please provide return type hint for the function: __new__. If the function does not return a value, please provide the type hint as: def function() -> None:
machine_learning/auto-diff.py (Outdated)

            cls.instance = super().__new__(cls)
        return cls.instance

    def __init__(self):
Please provide return type hint for the function: __init__. If the function does not return a value, please provide the type hint as: def function() -> None:
machine_learning/auto-diff.py (Outdated)

    def __init__(self):
        self.enabled = False

    def __enter__(self):
Please provide return type hint for the function: __enter__. If the function does not return a value, please provide the type hint as: def function() -> None:
machine_learning/auto-diff.py (Outdated)

        self.enabled = True
        return self

    def __exit__(self, exc_type, exc_value, tb):
Please provide return type hint for the function: __exit__. If the function does not return a value, please provide the type hint as: def function() -> None:
Please provide type hint for the parameter: exc_type
Please provide type hint for the parameter: exc_value
Please provide type hint for the parameter: tb
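For reference, a minimal sketch of one way the requested hints could look; the exact union types below are an assumption about how strictly the context manager is meant to be typed:

    from types import TracebackType


    class GradientTracker:
        def __init__(self) -> None:
            self.enabled = False

        def __enter__(self) -> "GradientTracker":
            self.enabled = True
            return self

        def __exit__(
            self,
            exc_type: type[BaseException] | None,
            exc_value: BaseException | None,
            tb: TracebackType | None,
        ) -> None:
            self.enabled = False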
machine_learning/auto-diff.py (Outdated)

    def __exit__(self, exc_type, exc_value, tb):
        self.enabled = False

    def add_operation(
Please provide return type hint for the function: add_operation. If the function does not return a value, please provide the type hint as: def function() -> None:
Have you checked with pre-commit?
I am getting conflicting suggestions from mypy and ruff. From mypy:
And if I change typing to typing_extensions, then ruff shows:
Apart from that, I am not getting these ones locally:
    from enum import Enum
    from types import TracebackType
    from typing import Self
Ruff was upgraded for Py3.12 but mypy has not yet been upgraded.
Suggested change:
- from typing import Self
+ from typing_extensions import Self  # noqa: UP035
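For context, a sketch of how the fallback import pairs with __new__; the version check is an assumption (the PR may simply import from typing_extensions unconditionally), and typing_extensions must be installed on interpreters older than 3.11:

    import sys

    if sys.version_info >= (3, 11):
        from typing import Self
    else:
        from typing_extensions import Self  # noqa: UP035


    class GradientTracker:
        instance = None

        def __new__(cls) -> Self:
            # Singleton: create the instance once, then keep returning it.
            if cls.instance is None:
                cls.instance = super().__new__(cls)
            return cls.instance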
    def add_output(self, output: Variable) -> None:
        self.output = output

    def __eq__(self, value) -> bool:
Please provide type hint for the parameter: value
[pre-commit.ci] auto fixes from pre-commit.com hooks. For more information, see https://pre-commit.ci
This looks really cool. I hope the comments help.
    ... c = a + b
    ... d = a * b
    ... e = c / d
    >>> print(tracker.gradient(e, a))
Let's try to avoid print() in doctests.
Suggested change:
- >>> print(tracker.gradient(e, a))
+ >>> tracker.gradient(e, a)
Repeat below.
    Reference: https://en.wikipedia.org/wiki/Automatic_differentiation

    Author: Poojan smart
Your choice...
Suggested change:
- Author: Poojan smart
+ Author: Poojan Smart
    >>> print(tracker.gradient(e, m))
    None
Suggested change:
- >>> print(tracker.gradient(e, m))
- None
+ >>> tracker.gradient(e, m) is None
+ True
    Email: smrtpoojan@gmail.com

    Examples:
In general, it is better to put doctests into the Class docstring or function/method docstring so that if someone copies the Class, function, or method then the tests come along for the ride.
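As an illustration of that placement, a doctest inside the class docstring might look roughly like this; the stripped-down Variable below is a sketch, not the PR's full implementation:

    import numpy as np


    class Variable:
        """
        Wraps a numpy array so that operations on it can be tracked.

        >>> v = Variable(np.array([1.0, 2.0]))
        >>> v.numpy()
        array([1., 2.])
        """

        def __init__(self, value: np.ndarray) -> None:
            self.value = value

        def numpy(self) -> np.ndarray:
            return self.value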
    Class represents n-dimensional object which is used to wrap
    numpy array on which operations will be performed and gradient
    will be calculated.
We get 88 characters per line max so let's use them to get to fewer lines.
Suggested change:
- Class represents n-dimensional object which is used to wrap
- numpy array on which operations will be performed and gradient
- will be calculated.
+ Class represents n-dimensional object which is used to wrap numpy array on which
+ operations will be performed and the gradient will be calculated.
            derivative = np.ones_like(params[0].numpy(), dtype=np.float64)
        else:
            derivative = -np.ones_like(params[1].numpy(), dtype=np.float64)
    elif operation == OpType.MUL:
        derivative = (
Suggested change:
- derivative = np.ones_like(params[0].numpy(), dtype=np.float64)
- else:
- derivative = -np.ones_like(params[1].numpy(), dtype=np.float64)
- elif operation == OpType.MUL:
- derivative = (
+ return np.ones_like(params[0].numpy(), dtype=np.float64)
+ else:
+ return -np.ones_like(params[1].numpy(), dtype=np.float64)
+ if operation == OpType.MUL:
+ return (
        derivative = (
            params[1].numpy().T if params[0] == param else params[0].numpy().T
        )
    elif operation == OpType.DIV:
Suggested change:
- elif operation == OpType.DIV:
+ if operation == OpType.DIV:
            derivative = 1 / params[1].numpy()
        else:
            derivative = -params[0].numpy() / (params[1].numpy() ** 2)
    elif operation == OpType.MATMUL:
        derivative = (
Suggested change:
- derivative = 1 / params[1].numpy()
- else:
- derivative = -params[0].numpy() / (params[1].numpy() ** 2)
- elif operation == OpType.MATMUL:
- derivative = (
+ return 1 / params[1].numpy()
+ else:
+ return -params[0].numpy() / (params[1].numpy() ** 2)
+ if operation == OpType.MATMUL:
+ return (
    elif operation == OpType.POWER:
        power = operation.other_params["power"]
        derivative = power * (params[0].numpy() ** (power - 1))
Suggested change:
- elif operation == OpType.POWER:
- power = operation.other_params["power"]
- derivative = power * (params[0].numpy() ** (power - 1))
+ if operation == OpType.POWER:
+ power = operation.other_params["power"]
+ return power * (params[0].numpy() ** (power - 1))
    elif operation == OpType.POWER:
        power = operation.other_params["power"]
        derivative = power * (params[0].numpy() ** (power - 1))
    return derivative
Do we want to raise ValueError("Invalid operation") here?
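Taken together, the suggestions above amount to replacing the derivative = ... assignments with early returns and ending the chain with an explicit error. A simplified, self-contained sketch of that shape (the standalone function, its signature, and the DIV/POWER-only coverage are assumptions made purely for illustration):

    from enum import Enum

    import numpy as np


    class OpType(Enum):
        DIV = "div"
        POWER = "power"


    def derivative(
        op_type: OpType, a: np.ndarray, b: np.ndarray, power: float = 2.0
    ) -> np.ndarray:
        # Early returns keep each branch flat, and anything unhandled falls
        # through to the ValueError instead of silently returning None.
        if op_type == OpType.DIV:
            return -a / (b**2)  # d(a / b) / db
        if op_type == OpType.POWER:
            return power * (a ** (power - 1))  # d(a ** power) / da
        raise ValueError("Invalid operation")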
        self.op_type = op_type
        self.other_params = {} if other_params is None else other_params

    def add_params(self, params: list[Variable]) -> None:
As there is no test file in this pull request nor any test function or class in the file machine_learning/automatic_differentiation.py, please provide doctest for the function add_params.
    def add_params(self, params: list[Variable]) -> None:
        self.params = params

    def add_output(self, output: Variable) -> None:
As there is no test file in this pull request nor any test function or class in the file machine_learning/automatic_differentiation.py, please provide doctest for the function add_output.
    def add_output(self, output: Variable) -> None:
        self.output = output

    def __eq__(self, value) -> bool:
As there is no test file in this pull request nor any test function or class in the file machine_learning/automatic_differentiation.py, please provide doctest for the function __eq__.
Please provide type hint for the parameter: value
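A hedged sketch covering both points for __eq__; annotating value as object is one common choice, and the comparison semantics shown (an Operation equals its OpType) are inferred from the diff fragments earlier in this review rather than confirmed:

    from enum import Enum


    class OpType(Enum):
        MUL = "mul"
        DIV = "div"


    class Operation:
        def __init__(self, op_type: OpType) -> None:
            self.op_type = op_type

        def __eq__(self, value: object) -> bool:
            """
            Treat an Operation as equal to its OpType.

            >>> Operation(OpType.MUL) == OpType.MUL
            True
            >>> Operation(OpType.MUL) == OpType.DIV
            False
            """
            return isinstance(value, OpType) and self.op_type is value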
Awesome contribution! Thanks for doing this.
* Added automatic differentiation algorithm
* file name changed
* Resolved pre commit errors
* updated dependency
* added noqa for ignoring check
* adding typing_extension for adding Self type in __new__
* [pre-commit.ci] auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)
* sorted requirement.text dependency
* [pre-commit.ci] auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)
* resolved ruff

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Describe your change:
Checklist: