This repository contains examples of deep learning algorithms implemented in Python, with the mathematics behind them explained.
- For Machine Learning algorithms, please check the Machine Learning repository.
- For Natural Language Processing (NLP = NLU + NLG), please check the Natural Language Processing repository.
- For Computer Vision, please check the Computer Vision repository.
- CS 231N: Convolutional Neural Networks for Visual Recognition, Stanford
- CS 224N: Natural Language Processing with Deep Learning, Stanford
- Machine Learning Crash Course
- fast.ai: Practical Deep Learning for Coders
- CS 285: Deep Reinforcement Learning, UC Berkeley
- CSC 2541: Differentiable Inference and Generative Models
- MIT 6.S191: Introduction to Deep Learning
- Frontiers of Deep Learning (Simons Institute)
- New Deep Learning Techniques
- Geometry of Deep Learning (Microsoft Research)
- Deep Multi-Task and Meta Learning (Stanford CS330)
- Advanced Deep Learning & Reinforcement Learning 2020 (DeepMind / UCL)
- Deep Reinforcement Learning, Decision Making and Control (UC Berkeley CS285)
- Full Stack Deep Learning 2019
- Emerging Challenges in Deep Learning
- Deep|Bayes 2019 Summer School
- Workshop on Theory of Deep Learning: Where Next? (Institute for Advanced Study)
- Deep Learning: Alchemy or Science? (Institute for Advanced Study)
List of Coursera Courses
List of Books
Title | Description, Information |
---|---|
Deep Learning Papers Reading Roadmap | A deep learning papers reading roadmap for anyone who is eager to learn this amazing tech! |
Other useful links
Title | Description, Information |
---|---|
NVIDIA Deep Learning Examples for Tensor Cores | Deep Learning Examples |
Other useful links
- Caffe – a fast open framework for deep learning;
- Deep Learning - Ian Goodfellow, Yoshua Bengio, and Aaron Courville (2016);
- Deep Learning by Google – a short course for advanced learners, focused on the TensorFlow deep learning library;
- Deep Learning at Oxford (2015) – a YouTube playlist with lectures (read more);
- awesome-deep-vision – a curated list of deep learning resources for computer vision;
- awesome-deep-learning-papers – a curated list of the most cited deep learning papers (since 2010);
- Deep Learning Tutorials – notes and code;
- dl-docker – an all-in-one Docker image for deep learning. Contains all the popular DL frameworks (TensorFlow, Theano, Torch, Caffe, etc.);
- Self-Study Courses for Deep Learning from NVIDIA – self-paced classes for deep learning featuring interactive lectures, hands-on exercises, and recorded office-hours Q&A with instructors. You'll learn what you need to design, train, and integrate neural-network-powered artificial intelligence into your applications with widely used open-source frameworks and NVIDIA software. The hands-on exercises use GPUs and deep learning software in the cloud;
- deep-rl-tensorflow – TensorFlow implementation of Deep Reinforcement Learning papers;
- TensorFlow 101 – TensorFlow tutorials;
- Introduction to Deep Learning for Image Recognition – this notebook accompanies the Introduction to Deep Learning for Image Recognition workshop and explains the core concepts of deep learning, with an emphasis on image classification as the application;
Research by the Faculty of Applied Sciences at UCU. Link to the main article.
- Python3: numpy, scikit-learn, pandas, scipy.
- Statistics (regression, properties of distributions, statistical tests, and proper usage, etc.) and probability theory.
- Deep learning frameworks: TensorFlow, PyTorch, MXNet, Caffe, Keras.
- Deep learning architectures: VGG, ResNet, Inception, MobileNet.
- Deep nets, hyperparameter optimization, visualization, and interpretation.
- Machine learning models.
- Basic algorithms and common tasks
- Classical algorithms
- Computational complexity
- Useful Libraries and Frameworks
- CPU vs GPU parallelization
- Cloud and GPU Integration
- Data Visualization
- Vectors and Vectorization
- Image Processing
- Language Processing
- Common Notation and Core Ideas
- Linear Algebra
- N-dim Spaces
- Vectors, Matrices and Operators
- Mathematical Analysis and Calculus
- Derivatives and Partial Derivatives
- Chain Rule
- Probability theory
- Introduction to Statistics
- Price Prediction Task
- Linear Regression
- Least Squares Method
- Loss Function
- Optimization Task
- Gradient Descent
- MLE — Maximum Likelihood Estimation
- Data Preprocessing
- Model Visualization
- Data Normalization
- Polynomial Regression
- Multivariate Regression
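
A minimal sketch of the regression topics above (linear regression, MSE loss, gradient descent), assuming NumPy; the synthetic data, learning rate, and epoch count are illustrative choices, not part of the course material.

```python
import numpy as np

# Synthetic 1-D "price" data (illustrative only)
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))              # feature, e.g. area
y = 3.0 * X[:, 0] + 5.0 + rng.normal(0, 1, 100)    # target with noise

# Add a bias column so the model is y_hat = X_b @ w
X_b = np.hstack([X, np.ones((X.shape[0], 1))])
w = np.zeros(2)

lr, epochs = 0.01, 5000
for _ in range(epochs):
    y_hat = X_b @ w
    grad = 2 / len(y) * X_b.T @ (y_hat - y)        # gradient of the MSE loss
    w -= lr * grad                                 # gradient descent step

print("learned slope and intercept:", w)           # roughly [3.0, 5.0]
```
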
- Basic idea of Computer Vision
- Classical Computer Vision
- Deep Learning and CV
- Core Idea of Semantic Gap
- Classification Task
- N-dim Spaces and Metrics
- Common datasets
- MNIST and Fashion-MNIST
- CIFAR-10 and CIFAR-100
- Cats vs Dogs
- ImageNet and MS COCO
- Euclidean Distance
- Nearest Neighbour
- Image Classification
- Cosine Similarity
- Manhattan distance
- KNN
- Train / Val / Test data split
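
A compact sketch of the nearest-neighbour ideas above: Euclidean distances, a k-NN majority vote, and a simple train/test split. The toy 2-D blobs stand in for real image data and are purely illustrative.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Classify each test point by a majority vote of its k nearest
    training points under Euclidean distance."""
    preds = []
    for x in X_test:
        dists = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances
        nearest = np.argsort(dists)[:k]               # indices of the k closest
        preds.append(np.bincount(y_train[nearest]).argmax())  # majority label
    return np.array(preds)

# Toy 2-D data: two Gaussian blobs (illustrative)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], 1.0, size=(50, 2)),
               rng.normal([4, 4], 1.0, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Simple train / test split
idx = rng.permutation(len(X))
train, test = idx[:80], idx[80:]
y_pred = knn_predict(X[train], y[train], X[test], k=5)
print("accuracy:", (y_pred == y[test]).mean())
```
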
- Logistic Regression
- Logistic Regression and Maximum Likelihood Estimation
- Loss function and Cross Entropy
- Accuracy and Metrics
- Precision, Recall and F1
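
A small sketch of the loss and metrics listed above: binary cross-entropy for logistic regression outputs, plus accuracy, precision, recall, and F1 computed from confusion counts. The labels and predicted probabilities are made up for illustration.

```python
import numpy as np

def binary_cross_entropy(y_true, p, eps=1e-12):
    """Mean negative log-likelihood of Bernoulli labels (log loss)."""
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

def precision_recall_f1(y_true, y_pred):
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Illustrative labels and predicted probabilities
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
p = np.array([0.9, 0.2, 0.6, 0.4, 0.1, 0.3, 0.8, 0.7])
y_pred = (p >= 0.5).astype(int)

print("cross-entropy:", binary_cross_entropy(y_true, p))
print("accuracy:", (y_pred == y_true).mean())
print("precision, recall, F1:", precision_recall_f1(y_true, y_pred))
```
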
- Rosenblatt’s Perceptron
- Artificial Neuron
- Warren McCulloch and Walter Pitts Neuron
- Fully Connected (Linear, Dense, Affine) Layer
- Activation Layers
- BackPropagation Algorithm
- Stochastic Gradient Descent
- Biological Neuron and Analogy
- Computational graphs
- Differentiable graphs
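
A hedged sketch of the perceptron and backpropagation topics above: a two-layer fully connected network with hand-written forward and backward passes over its computational graph. The XOR task, layer sizes, and learning rate are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR data: not linearly separable, so a hidden layer is needed
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Fully connected (dense) layers: 2 -> 8 -> 1
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

sigmoid = lambda z: 1 / (1 + np.exp(-z))
lr = 0.5

for _ in range(10000):
    # Forward pass through the computational graph
    h = sigmoid(X @ W1 + b1)          # hidden activations
    out = sigmoid(h @ W2 + b2)        # predicted probability

    # Backward pass (chain rule) for a cross-entropy loss
    d_out = out - y                   # dL/d(output pre-activation)
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * h * (1 - h)
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)

    # Gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(out.round(3).ravel())           # typically close to [0, 1, 1, 0]
```
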
- Deep Learning Frameworks
- Custom Framework Implementation
- Linear Operations and Activation Implementations
- Main Blocks of Deep Learning Frameworks
- Custom Model and Training
- Optimizer Implementation
- TensorFlow
- Keras
- PyTorch
- Neural Networks Problems
- Activation Functions
- Weights Initialization
- Initialization Techniques
- Overfitting and Underfitting
- Regularization Methods
- L1 and L2 Regularization
- Ensemble of Models
- Dropout
- Hyperparameter Search
- Optimizations behind SGD
- Momentum and Nesterov Momentum
- Adagrad, RMSprop
- Adam, Nadam
- Batch Normalization
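
A minimal sketch contrasting the SGD variants named above: plain SGD, momentum, and Adam update rules applied to a toy quadratic loss. The loss and all hyperparameters here are illustrative defaults, not prescribed values.

```python
import numpy as np

def grad(w):
    """Gradient of the toy loss L(w) = 0.5 * ||w||^2, which is just w."""
    return w

w_sgd = w_mom = w_adam = np.array([5.0, -3.0])
v = np.zeros(2)                    # momentum buffer
m, s = np.zeros(2), np.zeros(2)    # Adam first/second moment estimates
lr, beta1, beta2, eps = 0.1, 0.9, 0.999, 1e-8

for t in range(1, 101):
    # Plain SGD
    w_sgd = w_sgd - lr * grad(w_sgd)

    # SGD with momentum
    v = 0.9 * v + grad(w_mom)
    w_mom = w_mom - lr * v

    # Adam: bias-corrected running moments of the gradient
    g = grad(w_adam)
    m = beta1 * m + (1 - beta1) * g
    s = beta2 * s + (1 - beta2) * g ** 2
    m_hat = m / (1 - beta1 ** t)
    s_hat = s / (1 - beta2 ** t)
    w_adam = w_adam - lr * m_hat / (np.sqrt(s_hat) + eps)

print(w_sgd, w_mom, w_adam)        # all approach the minimum at (0, 0)
```
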
- Dimensionality reduction
- Feature Learning
- Vector Representation
- Embeddings
- Kernel Method
- Clustering
- k-means Clustering
- Hierarchical Clustering
- Neural Networks and Unsupervised Learning
- Autoencoders
- Autoencoders architectures
- Tasks for Autoencoders
- Problem of Image Generation
- Image Denoising Task
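
A hedged sketch of the autoencoder and denoising topics above, assuming PyTorch (listed in the prerequisites): a small fully connected autoencoder trained to reconstruct clean inputs from noisy ones. The random tensors stand in for flattened MNIST images and are illustrative only.

```python
import torch
from torch import nn

class AutoEncoder(nn.Module):
    """Fully connected autoencoder: 784 -> 128 -> 32 -> 128 -> 784."""
    def __init__(self, dim=784, code=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dim, 128), nn.ReLU(),
                                     nn.Linear(128, code))
        self.decoder = nn.Sequential(nn.Linear(code, 128), nn.ReLU(),
                                     nn.Linear(128, dim), nn.Sigmoid())

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = AutoEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Random stand-in for a batch of flattened images in [0, 1]
x = torch.rand(256, 784)

for _ in range(100):
    noisy = x + 0.2 * torch.randn_like(x)   # denoising setup: corrupt the input
    recon = model(noisy)
    loss = loss_fn(recon, x)                # reconstruct the clean image
    opt.zero_grad()
    loss.backward()
    opt.step()

print("final reconstruction loss:", loss.item())
```
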
- Problems of Fully Connected Neural Networks
- Towards Convolutional Neural Networks
- CNN as feature extractor
- Computer Vision tasks
- Transfer Learning
- Transfer Learning in Practice
- What Next (breadth: CNN Architectures, Object Detection, Segmentation, GANs)
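
A hedged sketch of the transfer-learning idea above, assuming torchvision is available: load a pretrained ResNet, freeze its convolutional backbone as a feature extractor, and replace the final fully connected layer for a new 2-class task (e.g. cats vs dogs). The random batch is a stand-in for real images.

```python
import torch
from torch import nn
from torchvision import models

# Pretrained ResNet-18 as a frozen feature extractor
# (older torchvision versions use models.resnet18(pretrained=True) instead)
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False

# Replace the classification head for a new 2-class problem
model.fc = nn.Linear(model.fc.in_features, 2)

# Only the new head's parameters are trained
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a random batch
images = torch.rand(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))

logits = model(images)
loss = criterion(logits, labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print("loss after one step:", loss.item())
```
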
- Introduction to Natural Language Processing
- Text classification
- Words Preprocessing and Representation
- Part-of-Speech tagging (PoS tagging)
- Tokenization, Lemmatization and Stemming
- Bag of Words
- TF-IDF
- Distributional Semantics
- Vector Semantics
- Term-document matrix
- Word context matrix
- Dense Vectors and Embeddings
- Word2Vec
- What Next (breadth: RNN, Seq2Seq, Attention, Transformers, Modern Language Models)
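
A minimal sketch of the text-representation topics above, assuming a recent scikit-learn (listed in the prerequisites): bag-of-words counts, TF-IDF weights, and cosine similarity between the resulting document vectors. The three sentences are illustrative.

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "deep learning needs large data",
    "machine learning learns from data",
    "cats and dogs are popular image classes",
]

# Bag of words: raw term counts per document
bow = CountVectorizer()
counts = bow.fit_transform(docs)
print(bow.get_feature_names_out())
print(counts.toarray())

# TF-IDF: down-weights terms that appear in many documents
tfidf = TfidfVectorizer()
weights = tfidf.fit_transform(docs)

# Cosine similarity between documents in TF-IDF space
print(cosine_similarity(weights[0], weights[1]))  # related: share "learning", "data"
print(cosine_similarity(weights[0], weights[2]))  # unrelated: zero overlap
```
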