
Add automatic differentiation algorithm #10977

Merged (14 commits) into TheAlgorithms:master on Oct 27, 2023

Conversation

PoojanSmart
Contributor

Describe your change:

  • Adding implementation of automatic differentiation algorithm which is useful for calculating gradients automatically. This algorithm is widely used in machine learning and deep learning models for optimization.
  • Add an algorithm?
  • Fix a bug or typo in an existing algorithm?
  • Add or change doctests? -- Note: Please avoid changing both code and tests in a single pull request.
  • Documentation change?
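For readers unfamiliar with the technique this PR adds: reverse-mode automatic differentiation records each operation in a computation graph and then propagates gradients backwards through it with the chain rule. The sketch below is illustrative only — the `Var` class and its methods are invented for this example and are not the PR's code:

```python
# Minimal reverse-mode automatic differentiation sketch (not the PR's code).
# Each Var remembers its parents and how to push gradients back to them.
class Var:
    def __init__(self, value: float) -> None:
        self.value = value
        self.grad = 0.0
        self._backward = lambda: None
        self._parents: tuple["Var", ...] = ()

    def __add__(self, other: "Var") -> "Var":
        out = Var(self.value + other.value)
        out._parents = (self, other)

        def _backward() -> None:
            # d(a + b)/da = d(a + b)/db = 1
            self.grad += out.grad
            other.grad += out.grad

        out._backward = _backward
        return out

    def __mul__(self, other: "Var") -> "Var":
        out = Var(self.value * other.value)
        out._parents = (self, other)

        def _backward() -> None:
            # d(a * b)/da = b, d(a * b)/db = a
            self.grad += other.value * out.grad
            other.grad += self.value * out.grad

        out._backward = _backward
        return out

    def backward(self) -> None:
        # Topologically order the graph, then apply the chain rule in reverse.
        order: list["Var"] = []
        seen: set[int] = set()

        def visit(v: "Var") -> None:
            if id(v) not in seen:
                seen.add(id(v))
                for p in v._parents:
                    visit(p)
                order.append(v)

        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()


a, b = Var(2.0), Var(3.0)
e = a * b + a  # de/da = b + 1 = 4, de/db = a = 2
e.backward()
```

The PR's implementation follows the same idea but wraps numpy arrays and tracks operations through a `GradientTracker` context manager.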

Checklist:

  • I have read CONTRIBUTING.md.
  • This pull request is all my own work -- I have not plagiarized.
  • I know that pull requests will not be merged if they fail the automated tests.
  • This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
  • All new Python files are placed inside an existing directory.
  • All filenames are in all lowercase characters with no spaces or dashes.
  • All functions and variable names follow Python naming conventions.
  • All function parameters and return values are annotated with Python type hints.
  • All functions have doctests that pass the automated testing.
  • All new algorithms include at least one URL that points to Wikipedia or another similar explanation.
  • If this pull request resolves one or more open issues then the description above includes the issue number(s) with a closing keyword: "Fixes #ISSUE-NUMBER".

@algorithms-keeper algorithms-keeper bot added the require type hints https://docs.python.org/3/library/typing.html label Oct 26, 2023

@algorithms-keeper algorithms-keeper bot left a comment


Automated review generated by algorithms-keeper. If there's any problem regarding this review, please open an issue about it.

algorithms-keeper commands and options

algorithms-keeper actions can be triggered by commenting on this PR:

  • @algorithms-keeper review to trigger the checks for only added pull request files
  • @algorithms-keeper review-all to trigger the checks for all the pull request files, including the modified files. As we cannot post review comments on lines not part of the diff, this command will post all the messages in one comment.

NOTE: Commands are in beta, so this feature is restricted to members and owners of the organization.

will be calculated.
"""

def __init__(self, value):


Please provide return type hint for the function: __init__. If the function does not return a value, please provide the type hint as: def function() -> None:

Please provide type hint for the parameter: value

)
return result

def add_param_to(self, param_to: Operation):


Please provide return type hint for the function: add_param_to. If the function does not return a value, please provide the type hint as: def function() -> None:

def add_param_to(self, param_to: Operation):
self.param_to.append(param_to)

def add_result_of(self, result_of: Operation):


Please provide return type hint for the function: add_result_of. If the function does not return a value, please provide the type hint as: def function() -> None:

objects and pointer to resulting Variable from the operation.
"""

def __init__(


Please provide return type hint for the function: __init__. If the function does not return a value, please provide the type hint as: def function() -> None:

self.output: Variable | None = None
self.other_params = {} if other_params is None else other_params

def add_params(self, params: list[Variable]):


Please provide return type hint for the function: add_params. If the function does not return a value, please provide the type hint as: def function() -> None:

based on the computation graph.
"""

def __new__(cls):


Please provide return type hint for the function: __new__. If the function does not return a value, please provide the type hint as: def function() -> None:

cls.instance = super().__new__(cls)
return cls.instance
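A typed version of this singleton `__new__` could look like the sketch below. The class body is an assumption (only the singleton machinery is shown), and the import dance mirrors the `Self` compatibility issue discussed later in this thread:

```python
try:
    from typing import Self  # available on Python 3.11+
except ImportError:  # pragma: no cover
    from typing_extensions import Self  # fallback for older Pythons


class GradientTracker:
    # Singleton sketch: every call to GradientTracker() returns the same
    # stored instance. Details beyond __new__ are omitted/assumed.
    def __new__(cls) -> Self:
        if not hasattr(cls, "_instance"):
            cls._instance = super().__new__(cls)
        return cls._instance
```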

def __init__(self):


Please provide return type hint for the function: __init__. If the function does not return a value, please provide the type hint as: def function() -> None:

def __init__(self):
self.enabled = False

def __enter__(self):


Please provide return type hint for the function: __enter__. If the function does not return a value, please provide the type hint as: def function() -> None:

self.enabled = True
return self

def __exit__(self, exc_type, exc_value, tb):


Please provide return type hint for the function: __exit__. If the function does not return a value, please provide the type hint as: def function() -> None:

Please provide type hint for the parameter: exc_type

Please provide type hint for the parameter: exc_value

Please provide type hint for the parameter: tb

def __exit__(self, exc_type, exc_value, tb):
self.enabled = False

def add_operation(


Please provide return type hint for the function: add_operation. If the function does not return a value, please provide the type hint as: def function() -> None:

@algorithms-keeper algorithms-keeper bot added awaiting reviews This PR is ready to be reviewed tests are failing Do not merge until tests pass labels Oct 26, 2023
Contributor

@imSanko imSanko left a comment


Have you checked with pre-commit?

@algorithms-keeper algorithms-keeper bot removed the tests are failing Do not merge until tests pass label Oct 26, 2023
@PoojanSmart
Contributor Author

I am getting conflicting suggestions from mypy and ruff:

From mypy:

machine_learning\automatic_differentiation.py:45: error: Module "typing" has no attribute "Self"  [attr-defined]
machine_learning\automatic_differentiation.py:45: note: Use `from typing_extensions import Self` instead

And If I change typing to typing_extensions then ruff shows:

machine_learning\automatic_differentiation.py:45:1: UP035 [*] Import from `typing` instead: `Self`
   |
43 | from enum import Enum
44 | from types import TracebackType
45 | from typing_extensions import Self
   | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ UP035
46 |
47 | import numpy as np
   |
   = help: Import from `typing`
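One general way to satisfy both tools at once is a version-guarded import. This is a common pattern, not necessarily what this PR ended up merging:

```python
import sys

# Import Self from typing on new Pythons (ruff's preference) and fall back
# to typing_extensions on older ones (keeping mypy happy there).
if sys.version_info >= (3, 11):
    from typing import Self
else:
    from typing_extensions import Self  # noqa: UP035
```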

@PoojanSmart
Contributor Author

Apart from that I am not getting these ones in my local:

machine_learning/automatic_differentiation.py:195: error: Access to generic instance variables via class is ambiguous  [misc]
machine_learning/automatic_differentiation.py:195: error: Incompatible return value type (got "GradientTracker", expected "Self")  [return-value]

@imSanko / @cclauss can you please help?


from enum import Enum
from types import TracebackType
from typing import Self

Ruff was upgraded for Py3.12 but mypy has not yet been upgraded.

Suggested change
from typing import Self
from typing_extensions import Self # noqa: UP035

@algorithms-keeper algorithms-keeper bot left a comment

def add_output(self, output: Variable) -> None:
self.output = output

def __eq__(self, value) -> bool:


Please provide type hint for the parameter: value
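The conventional annotation for `__eq__` takes `object` (it must accept any operand) and narrows with `isinstance` before comparing. A hypothetical minimal version — the real `Operation` class stores more fields than shown:

```python
class Operation:
    # Hypothetical minimal Operation used only to illustrate the signature.
    def __init__(self, op_type: str) -> None:
        self.op_type = op_type

    def __eq__(self, value: object) -> bool:
        # Accept any object, but only compare equal to another Operation
        # with the same op_type.
        return isinstance(value, Operation) and self.op_type == value.op_type
```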

@algorithms-keeper algorithms-keeper bot added the tests are failing Do not merge until tests pass label Oct 26, 2023


@algorithms-keeper algorithms-keeper bot removed the tests are failing Do not merge until tests pass label Oct 26, 2023
Member

@cclauss cclauss left a comment


This looks really cool. I hope the comments help.

... c = a + b
... d = a * b
... e = c / d
>>> print(tracker.gradient(e, a))

Let's try to avoid print() in doctests.

Suggested change
>>> print(tracker.gradient(e, a))
>>> tracker.gradient(e, a)

Repeat below.
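The reason this works is that doctest compares the repr of an expression's return value directly, so wrapping it in `print()` is redundant (and hides `None` results, as the next comment shows). A stand-in example — `square` is invented for illustration, not taken from the PR:

```python
def square(x: float) -> float:
    """
    Square a number (stand-in function to show the doctest style).

    >>> square(3.0)
    9.0
    >>> square(-2.0)
    4.0
    """
    return x * x


if __name__ == "__main__":
    import doctest

    doctest.testmod()
```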


Reference: https://en.wikipedia.org/wiki/Automatic_differentiation

Author: Poojan smart

Your choice...

Suggested change
Author: Poojan smart
Author: Poojan Smart

Comment on lines 22 to 23
>>> print(tracker.gradient(e, m))
None

Suggested change
>>> print(tracker.gradient(e, m))
None
>>> tracker.gradient(e, m) is None
True

Email: smrtpoojan@gmail.com

Examples:


In general, it is better to put doctests into the Class docstring or function/method docstring so that if someone copies the Class, function, or method then the tests come along for the ride.

Comment on lines 52 to 54
Class represents n-dimensional object which is used to wrap
numpy array on which operations will be performed and gradient
will be calculated.

We get 88 characters per line max so let's use them to get to fewer lines.

Suggested change
Class represents n-dimensional object which is used to wrap
numpy array on which operations will be performed and gradient
will be calculated.
Class represents n-dimensional object which is used to wrap numpy array on which
operations will be performed and the gradient will be calculated.

Comment on lines 299 to 303
derivative = np.ones_like(params[0].numpy(), dtype=np.float64)
else:
derivative = -np.ones_like(params[1].numpy(), dtype=np.float64)
elif operation == OpType.MUL:
derivative = (

Suggested change
derivative = np.ones_like(params[0].numpy(), dtype=np.float64)
else:
derivative = -np.ones_like(params[1].numpy(), dtype=np.float64)
elif operation == OpType.MUL:
derivative = (
return np.ones_like(params[0].numpy(), dtype=np.float64)
else:
return -np.ones_like(params[1].numpy(), dtype=np.float64)
if operation == OpType.MUL:
return (

derivative = (
params[1].numpy().T if params[0] == param else params[0].numpy().T
)
elif operation == OpType.DIV:

Suggested change
elif operation == OpType.DIV:
if operation == OpType.DIV:

Comment on lines 308 to 312
derivative = 1 / params[1].numpy()
else:
derivative = -params[0].numpy() / (params[1].numpy() ** 2)
elif operation == OpType.MATMUL:
derivative = (

Suggested change
derivative = 1 / params[1].numpy()
else:
derivative = -params[0].numpy() / (params[1].numpy() ** 2)
elif operation == OpType.MATMUL:
derivative = (
return 1 / params[1].numpy()
else:
return -params[0].numpy() / (params[1].numpy() ** 2)
if operation == OpType.MATMUL:
return (

Comment on lines 315 to 317
elif operation == OpType.POWER:
power = operation.other_params["power"]
derivative = power * (params[0].numpy() ** (power - 1))

Suggested change
elif operation == OpType.POWER:
power = operation.other_params["power"]
derivative = power * (params[0].numpy() ** (power - 1))
if operation == OpType.POWER:
power = operation.other_params["power"]
return power * (params[0].numpy() ** (power - 1))

elif operation == OpType.POWER:
power = operation.other_params["power"]
derivative = power * (params[0].numpy() ** (power - 1))
return derivative

Do we want to raise ValueError("Invalid operation") here?
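Putting the reviewer's suggestions together — early `return`s instead of an accumulating `derivative` variable, plus a `ValueError` for unknown operations — the dispatch could be sketched as below. This is a simplified stand-alone version with two explicit ndarray operands instead of the PR's `params` list, and only a subset of the op types:

```python
from enum import Enum

import numpy as np


class OpType(Enum):
    ADD = 0
    SUB = 1
    MUL = 2
    DIV = 3


def derivative(op: OpType, x: np.ndarray, y: np.ndarray, wrt_first: bool) -> np.ndarray:
    # Early returns keep each case flat, and an unrecognized op fails loudly
    # instead of silently returning an undefined value.
    if op == OpType.ADD:
        return np.ones_like(x)
    if op == OpType.SUB:
        return np.ones_like(x) if wrt_first else -np.ones_like(y)
    if op == OpType.MUL:
        return y if wrt_first else x
    if op == OpType.DIV:
        return 1 / y if wrt_first else -x / (y**2)
    raise ValueError(f"invalid operation: {op}")
```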

@cclauss cclauss self-assigned this Oct 26, 2023
@algorithms-keeper algorithms-keeper bot added the tests are failing Do not merge until tests pass label Oct 27, 2023
@algorithms-keeper algorithms-keeper bot added the require tests Tests [doctest/unittest/pytest] are required label Oct 27, 2023
@algorithms-keeper algorithms-keeper bot left a comment

self.op_type = op_type
self.other_params = {} if other_params is None else other_params

def add_params(self, params: list[Variable]) -> None:


As there is no test file in this pull request nor any test function or class in the file machine_learning/automatic_differentiation.py, please provide doctest for the function add_params

def add_params(self, params: list[Variable]) -> None:
self.params = params

def add_output(self, output: Variable) -> None:


As there is no test file in this pull request nor any test function or class in the file machine_learning/automatic_differentiation.py, please provide doctest for the function add_output

def add_output(self, output: Variable) -> None:
self.output = output

def __eq__(self, value) -> bool:


As there is no test file in this pull request nor any test function or class in the file machine_learning/automatic_differentiation.py, please provide doctest for the function __eq__

Please provide type hint for the parameter: value

@algorithms-keeper algorithms-keeper bot removed the tests are failing Do not merge until tests pass label Oct 27, 2023
@cclauss cclauss enabled auto-merge (squash) October 27, 2023 08:46
Member

@cclauss cclauss left a comment


Awesome contribution! Thanks for doing this.

@cclauss cclauss merged commit 5987f86 into TheAlgorithms:master Oct 27, 2023
@algorithms-keeper algorithms-keeper bot removed the awaiting reviews This PR is ready to be reviewed label Oct 27, 2023
sedatguzelsemme pushed a commit to sedatguzelsemme/Python that referenced this pull request Sep 15, 2024
* Added automatic differentiation algorithm

* file name changed

* Resolved pre commit errors

* updated dependency

* added noqa for ignoring check

* adding typing_extension for adding Self type in __new__

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* sorted requirement.text dependency

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* resolved ruff

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
@isidroas isidroas mentioned this pull request Jan 25, 2025