In these short scripts I implemented an example of the Adam gradient descent optimization algorithm.
The optimization is performed on the simple convex cost function f(x, y) = x^2 + y^2.
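For reference, here is a minimal sketch of how Adam can minimize this cost function. The hyperparameter values, function names, and starting point below are illustrative assumptions, not necessarily what adam_implementation.py uses:

```python
import numpy as np

def f(x, y):
    """Convex cost function f(x, y) = x^2 + y^2."""
    return x ** 2 + y ** 2

def grad_f(x, y):
    """Gradient of f: (df/dx, df/dy) = (2x, 2y)."""
    return np.array([2.0 * x, 2.0 * y])

def adam(grad, start, alpha=0.02, beta1=0.9, beta2=0.999, eps=1e-8, n_iter=200):
    """Minimize a function with Adam, given its gradient and a starting point."""
    theta = np.array(start, dtype=float)
    m = np.zeros_like(theta)  # first-moment (mean) estimate
    v = np.zeros_like(theta)  # second-moment (uncentered variance) estimate
    for t in range(1, n_iter + 1):
        g = grad(*theta)
        m = beta1 * m + (1.0 - beta1) * g       # update biased first moment
        v = beta2 * v + (1.0 - beta2) * g ** 2  # update biased second moment
        m_hat = m / (1.0 - beta1 ** t)          # bias-corrected first moment
        v_hat = v / (1.0 - beta2 ** t)          # bias-corrected second moment
        theta -= alpha * m_hat / (np.sqrt(v_hat) + eps)
    return theta

if __name__ == "__main__":
    solution = adam(grad_f, start=(1.0, 1.5))
    print("solution:", solution, "f(solution):", f(*solution))
```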
- `python3 space_2d.py` -> Plot the mathematical space in 2D
- `python3 space_3d.py` -> Plot the mathematical space in 3D
- `python3 adam_implementation.py` -> Run the Adam gradient optimization over the function
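The plotting scripts presumably render this bowl-shaped surface in 2D and 3D; a rough matplotlib sketch of what such plots could look like is below. The grid range and styling are assumptions, not the exact code in space_2d.py or space_3d.py:

```python
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401  (registers the 3D projection on older matplotlib)

# Sample the cost function f(x, y) = x^2 + y^2 on a grid.
x = np.linspace(-2.0, 2.0, 200)
y = np.linspace(-2.0, 2.0, 200)
X, Y = np.meshgrid(x, y)
Z = X ** 2 + Y ** 2

# 2D view: filled contour plot of the function values.
fig, ax = plt.subplots()
contour = ax.contourf(X, Y, Z, levels=30, cmap="viridis")
fig.colorbar(contour, ax=ax, label="f(x, y)")
ax.set_xlabel("x")
ax.set_ylabel("y")

# 3D view: surface plot of the same function.
fig3d = plt.figure()
ax3d = fig3d.add_subplot(projection="3d")
ax3d.plot_surface(X, Y, Z, cmap="viridis")
ax3d.set_xlabel("x")
ax3d.set_ylabel("y")
ax3d.set_zlabel("f(x, y)")

plt.show()
```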



- Jason Brownlee, "Code Adam Optimization Algorithm From Scratch", 2021, https://machinelearningmastery.com/adam-optimization-from-scratch/
- Diederik P. Kingma, Jimmy Ba, "Adam: A Method for Stochastic Optimization", 2014, https://arxiv.org/abs/1412.6980