An introductory course on PyTorch.
Throughout this course we will be using:
- Python 3.6+
- PyTorch 1.11.0
- Lecture 0: Hello world, introduction to Jupyter, and a high-level overview of PyTorch
- Lecture 1: Introduction to PyTorch: tensors, tensor operations, gradients, autodiff, and broadcasting (a short preview is sketched below)
- Lecture 2: Linear Regression via Gradient Descent using NumPy, NumPy + Autodiff, and PyTorch
- Lecture 3: PyTorch nn.Modules, alongside a training and evaluation loop
- Lecture 4: Implementation of a proof-of-concept Word2Vec in PyTorch
- ⏳ Bonus: Comparison of the computational efficiency of raw Python, NumPy, and PyTorch (+JIT)
- 🔥 PyTorch Challenges: a set of 27 mini-puzzles (an extension of the ones proposed by Sasha Rush)
- 🌎 From Puzzles to Real Code: examples of broadcasting in real-world applications: wordpiece aggregation, clustered attention, and attention statistics
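
As a small preview of the material in Lecture 1, the sketch below shows tensors, broadcasting, and autodiff in action. It is only an illustration (the variable names and values are arbitrary) and is not part of the course notebooks:

```python
import torch

# Broadcasting: a (3, 1) column times a (4,) row expands to a (3, 4) matrix
# without any explicit loop.
x = torch.arange(3.0).unsqueeze(1)   # shape (3, 1)
y = torch.arange(4.0)                # shape (4,)
outer = x * y                        # shape (3, 4)

# Autodiff: PyTorch tracks operations on tensors with requires_grad=True
# and computes gradients with a single backward() call.
w = torch.tensor(2.0, requires_grad=True)
loss = (3.0 * w - 1.0) ** 2
loss.backward()

print(outer.shape)  # torch.Size([3, 4])
print(w.grad)       # tensor(30.), since d/dw (3w - 1)^2 = 6 * (3w - 1) = 30 at w = 2
```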
First, clone this repository using git:

```bash
git clone https://github.com/mtreviso/pytorch-lecture.git
cd pytorch-lecture
```
It is highly recommended that you work inside a Python virtualenv. You can create one and install all dependencies via:
```bash
python3 -m venv env
source env/bin/activate
pip3 install -r requirements.txt
```
Run Jupyter:
```bash
jupyter-notebook
```
After running the command above, your browser will automatically open the Jupyter homepage: http://localhost:8888/tree.
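
To check that the environment is set up correctly, you can run a quick sanity check in a notebook cell or a Python shell (the expected version in the comment assumes the PyTorch release listed above):

```python
import torch

print(torch.__version__)                    # e.g. 1.11.0
print(torch.rand(2, 3) @ torch.rand(3, 2))  # a small matrix product as a smoke test
```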