\chapter{Acknowledgements}
\yinipar{\fontsize{60pt}{72pt}\usefont{U}{Kramer}{xl}{n}T}his work adds nothing new to the field of deep learning on its own. It is merely a reformulation of the ideas of brighter researchers to fit a peculiar mindset: that of preferring formulas with ten indices, where one knows precisely what one is manipulating, over (in my opinion sometimes opaque) matrix formulations in which the dimensions of the objects are rarely if ever specified.
\vspace{0.2cm}
Among the brighter people from whom I learned online is Andrew Ng. His Coursera class (\href{https://www.coursera.org/learn/machine-learning}{here}) was my first contact with Neural Networks, and this pedagogical introduction allowed me to build on solid ground.
\vspace{0.2cm}
I also wish to particularly thank Hugo Larochelle, who not only built a wonderful deep learning class (\href{http://info.usherbrooke.ca/hlarochelle/neural_networks/content.html}{here}), but was also kind enough to answer emails from a complete beginner and stranger!
\vspace{0.2cm}
The Stanford class on convolutional networks (\href{http://cs231n.github.io/convolutional-networks/}{here}) proved extremely valuable to me, as did the one on Natural Language Processing (\href{http://web.stanford.edu/class/cs224n/}{here}).
\vspace{0.2cm}
I also benefited greatly from Sebastian Ruder's blog (\href{http://ruder.io/#open}{here}), both from its pages on gradient descent optimization techniques and from the author himself.
\vspace{0.2cm}
I learned more about LSTMs on colah's blog (\href{http://colah.github.io/posts/2015-08-Understanding-LSTMs/}{here}), and some of my drawings are inspired by the ones found there.
\vspace{0.2cm}
I also thank Jonathan Del Hoyo for the great articles that he regularly shares on LinkedIn.
\vspace{0.2cm}
Many thanks go to my collaborators at Mediamobile, who let me dig as deep as I wanted into Neural Networks. I am especially indebted to Clément, Nicolas, Jessica, Christine and Céline.
\vspace{0.2cm}
Thanks to Jean-Michel Loubes and Fabrice Gamboa, from whom I learned a great deal about probability theory and statistics.
\vspace{0.2cm}
I end this list with my employer, Mediamobile, which has been kind enough to let me work on this topic with complete freedom. A special thanks to Philippe, who supervised me with the perfect balance of feedback and freedom!