During my studies I prepared a set of Jupyter Notebooks with theory and code explaining many concepts of Neural Networks and Deep Learning. Besides those materials, I have also added implementations of some papers from the area. You can find them below as a reference for your studies.
- My Presentations
- Linear Regression
- Logistic Regression
- Polynomial Regression
- Gradient Descent
- Neural Networks
- Deep Neural Networks
- Regularization
- Convolutional Neural Networks
- Residual Neural Networks
- Practical examples
- Image Classification
- Food Classification
- Supporting Materials:
- VII Workshop on Information Systems [October 20th, 2017 - Palmas (TO), Brazil]: In this 2-hour lecture, I show the evolution of Neural Nets, explaining gradient descent, backpropagation, how to apply convolutions to images, famous Convolutional Neural Network (CNN) topologies, and some real-world applications. Its content is an overview of the world of Deep Learning that can help you understand a few basic things and motivate you to explore further topics. This presentation can be downloaded here.
- Concepts and Theory
- Finding the analytic solution using calculus and representing it with a neuron
- Notebook: Linear Regression_Theory
- Solving Linear Regression problems using TensorFlow, Keras and the analytic solution (see the sketch after this list)
- Learning from Data: These are the 18 video lectures of the course Learning from Data by Yaser Abu-Mostafa from Caltech. Here you can learn the concepts, theory and different algorithms of Machine Learning, explained in detail. Here you can also find the recommended textbook that covers most of these lectures (recorded between April and May 2012).
- Stanford CS231n: Stanford's most popular class on Deep Learning. This 16-lecture course dives into the details of deep learning architectures, focusing on image classification. Instructors: Fei-Fei Li, Justin Johnson and Serena Yeung (Spring 2017).
- Learning from Data: One of the most popular books on Machine Learning. It gives clear explanations mixing theory and practical content that can help you understand fundamentals of ML like the VC dimension, regularization, overfitting and linear models.
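To give a feel for what the Linear Regression items above cover, here is a minimal sketch, assuming synthetic 1-D data and illustrative parameter values (the variable names and numbers are mine, not taken from the notebooks): the closed-form normal-equation solution in NumPy, followed by the same model fitted by gradient descent with Keras.

```python
# A minimal, self-contained sketch (not the notebooks' exact code): fit y = w*x + b
# on synthetic data, first with the closed-form normal equation, then with Keras.
import numpy as np
import tensorflow as tf

# Synthetic data, assumed here only for illustration: y = 3x + 2 plus a little noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=(200, 1)).astype("float32")
y = (3.0 * x + 2.0 + rng.normal(scale=0.1, size=(200, 1))).astype("float32")

# Analytic solution: append a bias column and solve the normal equations
# (X^T X) w = X^T y for [w, b].
X = np.hstack([x, np.ones_like(x)])
w_analytic = np.linalg.solve(X.T @ X, X.T @ y)
print("analytic [w, b]:", w_analytic.ravel())

# The same model fitted by gradient descent with Keras (a single Dense unit).
model = tf.keras.Sequential([tf.keras.Input(shape=(1,)), tf.keras.layers.Dense(1)])
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.1), loss="mse")
model.fit(x, y, epochs=100, verbose=0)
w_keras, b_keras = model.layers[0].get_weights()
print("keras    [w, b]:", w_keras.ravel(), b_keras)
```

Both estimates should land close to the true slope 3 and intercept 2: the analytic route is exact for ordinary least squares, while the gradient-descent route generalizes to models that have no closed-form solution.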