This is the code for "Intro - The Math of Intelligence" by Siraj Raval on Youtube
This week's coding challenge is to implement gradient descent to find the line of best fit that predicts the relationship between two variables of your choice from a Kaggle dataset. Bonus points for detailed documentation. Good luck! Post your GitHub link in the YouTube comments section.
The dataset represents distance cycled vs. calories burned. We'll create the line of best fit (linear regression) via gradient descent to predict the mapping. Yes, I left out talking about the learning rate in the video; we're not ready to talk about that yet.
Here are some helpful links:
- The error function: https://spin.atomicobject.com/wp-content/uploads/linear_regression_error1.png
- Its gradient: https://spin.atomicobject.com/wp-content/uploads/linear_regression_gradient1.png
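
For reference, and assuming the standard mean-squared-error setup used in the video, the two images show the error of a candidate line y = mx + b and its gradient with respect to the slope m and intercept b:

```latex
% Mean squared error of the line y = m x + b over N data points
E(m, b) = \frac{1}{N} \sum_{i=1}^{N} \bigl( y_i - (m x_i + b) \bigr)^2

% Its partial derivatives, used as the gradient in each descent step
\frac{\partial E}{\partial m} = -\frac{2}{N} \sum_{i=1}^{N} x_i \bigl( y_i - (m x_i + b) \bigr),
\qquad
\frac{\partial E}{\partial b} = -\frac{2}{N} \sum_{i=1}^{N} \bigl( y_i - (m x_i + b) \bigr)
```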
Dependencies:
- numpy

Python 2 and 3 both work for this. Use pip to install any missing dependencies (e.g. pip install numpy).
Just run python3 demo.py to see the results:
Starting gradient descent at b = 0, m = 0, error = 5565.107834483211
Running...
After 1000 iterations b = 0.08893651993741346, m = 1.4777440851894448, error = 112.61481011613473
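
If you'd like to see roughly what produces numbers like those above, here is a minimal sketch of batch gradient descent on a line of best fit. It is illustrative, not the exact demo.py: the data.csv file name, the 0.0001 learning rate, and the prediction at the end are assumptions; only the 1000 iterations and the b = 0, m = 0 starting point come from the output above.

```python
import numpy as np

def step_gradient(b, m, points, learning_rate):
    """One gradient descent step on the mean squared error of y = m*x + b."""
    n = float(len(points))
    x, y = points[:, 0], points[:, 1]
    # Partial derivatives of the error with respect to b and m
    b_gradient = -(2 / n) * np.sum(y - (m * x + b))
    m_gradient = -(2 / n) * np.sum(x * (y - (m * x + b)))
    # Move against the gradient, scaled by the learning rate
    return b - learning_rate * b_gradient, m - learning_rate * m_gradient

def run(points, b=0.0, m=0.0, learning_rate=0.0001, num_iterations=1000):
    """Repeat the update for a fixed number of iterations and return (b, m)."""
    for _ in range(num_iterations):
        b, m = step_gradient(b, m, points, learning_rate)
    return b, m

if __name__ == "__main__":
    # Hypothetical file: two comma-separated columns, distance cycled and calories burned
    points = np.genfromtxt("data.csv", delimiter=",")
    b, m = run(points)
    print("After 1000 iterations b = {}, m = {}".format(b, m))
    # Use the fitted line to predict calories for a ride of distance 10
    print("Predicted calories for distance 10:", m * 10 + b)
```

The exact numbers you get depend on the dataset and the learning rate, but they should land in the same ballpark as the run above.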
Credits for this code go to mattnedrich. I've merely created a wrapper to get people started.