CS231n - http://cs231n.stanford.edu/
You can check out the lecture and assignment summaries below.
1. Image Classification
data-driven approach, K-Nearest Neighbor, train/validation/test splits
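As a taste of the data-driven approach, a minimal NumPy sketch of k-nearest-neighbor prediction (toy code, not the assignment's API; array names are illustrative):

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=5):
    # Squared L2 distances via ||a - b||^2 = ||a||^2 + ||b||^2 - 2ab.
    d2 = (np.sum(X_test**2, axis=1, keepdims=True)
          + np.sum(X_train**2, axis=1)
          - 2.0 * X_test @ X_train.T)
    # Majority vote among the labels of the k nearest training points.
    nearest = np.argsort(d2, axis=1)[:, :k]
    return np.array([np.bincount(row).argmax() for row in y_train[nearest]])
```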
2. Linear Classification
Support Vector Machine, Softmax
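A sketch of the softmax cross-entropy loss over raw class scores (the SVM hinge loss is the other option covered):

```python
import numpy as np

def softmax_loss(scores, y):
    # scores: (N, C) class scores, y: (N,) integer labels.
    # Subtract the row max before exponentiating for numerical stability.
    shifted = scores - scores.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(y)), y].mean()
```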
3. Optimization
Stochastic Gradient Descent
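The update itself is one line; a self-contained toy example, minimizing f(w) = ||w||^2 whose gradient is 2w:

```python
import numpy as np

def sgd_step(w, grad, lr=0.1):
    # Vanilla SGD: step opposite the (minibatch) gradient.
    return w - lr * grad

w = np.array([1.0, -2.0])
for _ in range(100):
    w = sgd_step(w, 2.0 * w)
print(w)  # approaches [0, 0]
```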
4. Backpropagation, Intuitions
chain rule interpretation, real-valued circuits, patterns in gradient flow
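The circuit example from the notes, f(x, y, z) = (x + y)z, staged as an explicit forward and backward pass:

```python
x, y, z = -2.0, 5.0, -4.0

# Forward pass through the circuit.
q = x + y          # q = 3
f = q * z          # f = -12

# Backward pass: start from df/df = 1 and apply the chain rule.
dq = z             # df/dq
dz = q             # df/dz
dx = dq * 1.0      # dq/dx = 1
dy = dq * 1.0      # dq/dy = 1
print(dx, dy, dz)  # -4.0 -4.0 3.0
```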
5. Neural Networks Part 1: Setting up the Architecture
model of a biological neuron, activation functions, neural net architecture, representational power
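A minimal sketch of a two-layer net's forward pass with a ReLU activation (shapes chosen for CIFAR-10; names are illustrative):

```python
import numpy as np

def two_layer_net(x, W1, b1, W2, b2):
    h = np.maximum(0, x @ W1 + b1)  # hidden layer with ReLU activation
    return h @ W2 + b2              # linear class scores

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3072))            # 4 flattened 32x32x3 images
W1 = 0.01 * rng.standard_normal((3072, 100))
W2 = 0.01 * rng.standard_normal((100, 10))
print(two_layer_net(x, W1, np.zeros(100), W2, np.zeros(10)).shape)  # (4, 10)
```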
6. Neural Networks Part2 : Setting up the Data
preprocessing, weight initialization, batch normalization, regularization (L2/dropout), loss functions
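Two of these steps in a few lines (a sketch; He initialization shown for ReLU layers):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3072))

# Preprocessing: zero-center and normalize each feature dimension.
X -= X.mean(axis=0)
X /= X.std(axis=0) + 1e-8

# Weight initialization: He scaling, sqrt(2/fan_in), keeps activation
# variance roughly constant across ReLU layers.
fan_in, fan_out = 3072, 100
W = rng.standard_normal((fan_in, fan_out)) * np.sqrt(2.0 / fan_in)
```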
7. Neural Networks Part 3: Learning and Evaluation
gradient checks, sanity checks, babysitting the learning process, momentum (+ Nesterov), second-order methods, Adagrad/RMSprop, hyperparameter optimization, model ensembles
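One of the update rules as a sketch, SGD with momentum, on the same toy quadratic:

```python
import numpy as np

def momentum_step(w, grad, v, lr=0.05, mu=0.9):
    # Velocity integrates past gradients; mu acts as a friction coefficient.
    v = mu * v - lr * grad
    return w + v, v

w, v = np.array([1.0, -2.0]), np.zeros(2)
for _ in range(100):
    w, v = momentum_step(w, 2.0 * w, v)  # gradient of ||w||^2 is 2w
print(w)  # oscillates, then settles toward [0, 0]
```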
8. Convolutional Neural Networks: Architectures, Pooling Layers
layers, spatial arrangement, computational considerations
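The spatial-arrangement arithmetic from the notes as a checkable helper:

```python
def conv_output_size(W, F, S=1, P=0):
    # Input width W, receptive field F, stride S, zero-padding P.
    assert (W - F + 2 * P) % S == 0, "hyperparameters do not tile the input"
    return (W - F + 2 * P) // S + 1

print(conv_output_size(W=32, F=5, S=1, P=2))  # 32: padding preserves size
print(conv_output_size(W=32, F=2, S=2))       # 16: typical 2x2 max-pooling
```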
9. Convolutional Neural Networks: Layer Patterns, Case Studies
layer sizing patterns, AlexNet/ZFNet/VGGNet case studies
Higher Level Representations: Image Features
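A toy stand-in for a hand-crafted feature (the assignment's actual features, e.g. color histograms and HOG-style descriptors, are richer; this just bins pixel intensities):

```python
import numpy as np

def intensity_histogram(img, bins=10):
    # img: (H, W, 3) float image in [0, 1]; returns a normalized histogram
    # of per-pixel mean intensities as a simple global feature vector.
    gray = img.mean(axis=2).ravel()
    hist, _ = np.histogram(gray, bins=bins, range=(0.0, 1.0), density=True)
    return hist

img = np.random.default_rng(0).random((32, 32, 3))  # stand-in image
print(intensity_histogram(img))
```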
Fully-connected Neural Network
Use modular layer design to implement fully-connected networks of arbitrary depth.
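The modular pattern in miniature: each layer exposes a forward pass that returns a cache, and a backward pass that consumes it (a sketch of the affine layer):

```python
import numpy as np

def affine_forward(x, w, b):
    # out = xW + b; cache what the backward pass will need.
    return x @ w + b, (x, w)

def affine_backward(dout, cache):
    # Chain rule through out = xW + b, given the upstream gradient dout.
    x, w = cache
    return dout @ w.T, x.T @ dout, dout.sum(axis=0)  # dx, dw, db
```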
Fully-connected Neural Network 2
Implement several popular update rules to optimize these models.
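For example, a minimal Adam sketch (one of the rules usually implemented here; hyperparameter names follow the paper):

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    # t is the 1-based step count, used for bias correction.
    m = b1 * m + (1 - b1) * grad           # running first moment
    v = b2 * v + (1 - b2) * grad * grad    # running second moment
    mhat = m / (1 - b1 ** t)
    vhat = v / (1 - b2 ** t)
    return w - lr * mhat / (np.sqrt(vhat) + eps), m, v
```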
Batch Normalization
Implement batch normalization and use it to train deep fully-connected networks.
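The training-time forward pass in a few lines (the running averages needed at test time are omitted from this sketch):

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the minibatch, then let learnable
    # gamma/beta restore the layer's representational power.
    xhat = (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)
    return gamma * xhat + beta
```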
Dropout
Implement dropout and explore its effects on model generalization.
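Inverted dropout in sketch form (p is the keep probability; rescaling at train time leaves the test-time pass unchanged):

```python
import numpy as np

def dropout_forward(x, p=0.5, train=True):
    if not train:
        return x
    mask = (np.random.rand(*x.shape) < p) / p  # drop units, rescale by 1/p
    return x * mask
```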
Convolutional Networks
Implement several new layers that are commonly used in convolutional networks.
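The key new layer, a naive single-channel convolution with explicit loops (no padding, for brevity):

```python
import numpy as np

def conv_forward_naive(x, w, stride=1):
    # x: (H, W) image, w: (F, F) filter; loops expose the sliding window.
    H, W = x.shape
    F = w.shape[0]
    Ho, Wo = (H - F) // stride + 1, (W - F) // stride + 1
    out = np.zeros((Ho, Wo))
    for i in range(Ho):
        for j in range(Wo):
            patch = x[i * stride:i * stride + F, j * stride:j * stride + F]
            out[i, j] = np.sum(patch * w)
    return out
```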
Learn how PyTorch works, culminating in training a convolutional network on CIFAR-10.
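A minimal PyTorch sketch of that workflow on a fake minibatch (wiring up torchvision's CIFAR-10 loader is the assignment's job):

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(16 * 16 * 16, 10),
)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)

x = torch.randn(8, 3, 32, 32)        # stand-in for a CIFAR-10 minibatch
y = torch.randint(0, 10, (8,))
loss = nn.functional.cross_entropy(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```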
Learn how TensorFlow works, culminating in training a convolutional network on CIFAR-10.
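The Keras equivalent, again on a fake batch (tf.keras.datasets.cifar10 would supply real data):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),
])
model.compile(optimizer="sgd",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))

x = tf.random.normal((8, 32, 32, 3))
y = tf.random.uniform((8,), maxval=10, dtype=tf.int32)
model.train_on_batch(x, y)
```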
Image Captioning with Vanilla RNNs
Implement an image captioning system on MS-COCO using vanilla recurrent networks.
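The recurrence at the heart of it, one timestep as a sketch:

```python
import numpy as np

def rnn_step(x, h_prev, Wx, Wh, b):
    # x: (N, D) word embeddings, h_prev: (N, H) previous hidden state.
    # Captioning unrolls this step once per generated word.
    return np.tanh(x @ Wx + h_prev @ Wh + b)
```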
Image Captioning with LSTMs
Implement Long Short-Term Memory (LSTM) RNNs and apply them to image captioning on MS-COCO.
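One LSTM timestep as a sketch, with all four gates computed from a single matrix multiply (a common assignment layout):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, Wx, Wh, b):
    # Pre-activations for the input (i), forget (f), output (o), and
    # candidate (g) gates, stacked along the last axis: shape (N, 4H).
    H = h_prev.shape[1]
    a = x @ Wx + h_prev @ Wh + b
    i, f, o = (sigmoid(a[:, k * H:(k + 1) * H]) for k in range(3))
    g = np.tanh(a[:, 3 * H:])
    c = f * c_prev + i * g      # gated cell-state update
    h = o * np.tanh(c)          # hidden state read out through tanh(c)
    return h, c
```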