Machine Learning by Andrew Ng on Coursera
$J(\theta)$ with one variable
- Gradient descent

$J(\theta)$ with multiple variables
- Study of the learning rate
- Feature Normalization
- Normal Equations
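The linear-regression pieces above (cost $J(\theta)$, batch gradient descent, and the normal equations) can be sketched in NumPy; the course assignments are in Octave/MATLAB, so these function names and signatures are my translations, not the assignment files themselves.

```python
import numpy as np

def compute_cost(X, y, theta):
    # Mean squared error cost: J(theta) = (1/2m) * sum((X @ theta - y)^2)
    m = len(y)
    err = X @ theta - y
    return (err @ err) / (2 * m)

def gradient_descent(X, y, theta, alpha, num_iters):
    # Batch gradient descent: theta := theta - (alpha/m) * X^T (X theta - y)
    m = len(y)
    for _ in range(num_iters):
        theta = theta - (alpha / m) * (X.T @ (X @ theta - y))
    return theta

def normal_equation(X, y):
    # Closed-form solution: theta = pinv(X^T X) X^T y
    # (pinv instead of inv for numerical stability)
    return np.linalg.pinv(X.T @ X) @ X.T @ y
```

With a well-chosen learning rate both routes agree; the normal equation needs no `alpha` or iteration count but scales poorly when the number of features is large.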
plotData.m
- Function to plot 2D classification data

sigmoid.m
- Sigmoid function

costFunction.m
- Logistic regression cost function

predict.m
- Logistic regression prediction function

costFunctionReg.m
- Regularized logistic regression cost function
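A NumPy sketch of the sigmoid and the regularized logistic-regression cost and gradient from this set; names are my Python translations of the Octave files, and as in the course the bias term `theta[0]` is not penalized.

```python
import numpy as np

def sigmoid(z):
    # Logistic function: g(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

def cost_function_reg(theta, X, y, lam):
    # Regularized logistic regression cost and gradient.
    # X: (m, n+1) with a leading column of ones; y in {0, 1}.
    m = len(y)
    h = sigmoid(X @ theta)
    # Cross-entropy term plus L2 penalty on theta[1:] only
    J = (-y @ np.log(h) - (1 - y) @ np.log(1 - h)) / m
    J += (lam / (2 * m)) * (theta[1:] @ theta[1:])
    grad = (X.T @ (h - y)) / m
    grad[1:] += (lam / m) * theta[1:]
    return J, grad
```

At `theta = 0` every prediction is 0.5, so the unregularized cost is `log(2) ≈ 0.693`, a handy sanity check before running an optimizer.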
lrCostFunction.m
- Logistic regression cost function

oneVsAll.m
- Train a one-vs-all multi-class classifier

predictOneVsAll.m
- Predict using a one-vs-all multi-class classifier

predict.m
- Neural network prediction function
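The one-vs-all prediction step reduces to an argmax over the per-class classifier scores. A minimal NumPy sketch, assuming `all_theta` holds one row of trained logistic-regression parameters per class (the Python name mirrors the Octave file):

```python
import numpy as np

def predict_one_vs_all(all_theta, X):
    # all_theta: (num_labels, n+1), one classifier per row.
    # X: (m, n+1) with a leading column of ones.
    # The sigmoid is monotonic, so comparing raw scores X @ theta
    # picks the same class as comparing probabilities.
    scores = X @ all_theta.T           # (m, num_labels)
    return np.argmax(scores, axis=1)   # predicted class index per example
```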
sigmoidGradient.m
- Compute the gradient of the sigmoid function

randInitializeWeights.m
- Randomly initialize weights

nnCostFunction.m
- Neural network cost function
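Two of these helpers are small enough to sketch directly in NumPy: the sigmoid gradient used in backpropagation, and symmetry-breaking random initialization (the `0.12` range matches the value suggested in the course; the Python names are my translations).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_gradient(z):
    # g'(z) = g(z) * (1 - g(z)); maximal at z = 0 where it equals 0.25
    g = sigmoid(z)
    return g * (1 - g)

def rand_initialize_weights(l_in, l_out, epsilon=0.12):
    # Uniform weights in [-epsilon, epsilon] for a layer with
    # l_in inputs (+1 for the bias column) and l_out units.
    # All-zero initialization would leave every hidden unit identical.
    return np.random.uniform(-epsilon, epsilon, (l_out, l_in + 1))
```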
linearRegCostFunction.m
- Regularized linear regression cost function

learningCurve.m
- Generates a learning curve

polyFeatures.m
- Maps data into polynomial feature space

validationCurve.m
- Generates a cross-validation curve
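The bias/variance diagnostics above can be sketched in NumPy: mapping inputs to polynomial features and sweeping the training-set size to build a learning curve. The `fit` and `cost` callables here are hypothetical stand-ins for whatever model and error metric is in use, not part of the course files.

```python
import numpy as np

def poly_features(x, p):
    # Map a 1-D array x to columns [x, x^2, ..., x^p]
    return np.column_stack([x ** (i + 1) for i in range(p)])

def learning_curve(X, y, Xval, yval, fit, cost):
    # Train on the first i examples and record training error (on those
    # i examples) and validation error (always on the full validation set).
    m = len(y)
    err_train, err_val = [], []
    for i in range(1, m + 1):
        theta = fit(X[:i], y[:i])
        err_train.append(cost(X[:i], y[:i], theta))
        err_val.append(cost(Xval, yval, theta))
    return err_train, err_val
```

A validation curve follows the same pattern with the regularization parameter, rather than the training-set size, on the x-axis.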