[id] cs-229-deep-learning #154
base: master
Conversation
Dear Bang Pras.
Here is my review, Bang. Sorry I didn't request changes yesterday, I wasn't yet familiar with how this application works. I hope it helps.
Modify the translation according to the reviews
Mas @GunawanTri, please take another look, I have updated it.
id/cheatsheet-deep-learning.md (Outdated)

**44. 1) We initialize the value:**
⟶ Kita menginialisasi value
Review comment:
Kita menginialisasi value:
id/cheatsheet-deep-learning.md (Outdated)

**19. Dropout ― Dropout is a technique meant at preventing overfitting the training data by dropping out units in a neural network. In practice, neurons are either dropped with probability p or kept with probability 1−p**
⟶ Dropout - Dropout adalah sebuah teknik yang digunakan untuk mencegah overfitting pada saraf tiruan dengan men-drop out unit pada sebuah neural network. Pada pengaplikasiannya, neuron di drop dengan probabilitas p atau dipertahankan dengan probabilitas 1-p
Review comment:
Dropout - Dropout adalah sebuah teknik yang digunakan untuk mencegah overfitting pada saraf tiruan dengan memutus unit yang terdapat pada sebuah neural network. Dalam praktiknya, neuron dikurangi dengan probabilitas p atau dipertahankan dengan probabilitas 1-p
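Side note for readers: a minimal NumPy sketch of the mechanism item 19 describes (inverted dropout). The function name and its arguments are illustrative only, not part of the cheatsheet.

```python
import numpy as np

def dropout_forward(x, p=0.5, training=True, rng=None):
    """Inverted dropout: drop each unit with probability p, scale survivors by 1/(1-p)."""
    if not training or p == 0.0:
        return x                                     # at test time the layer is the identity
    rng = rng or np.random.default_rng()
    mask = (rng.random(x.shape) >= p) / (1.0 - p)    # keep each unit with probability 1-p
    return x * mask
```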
**22. Batch normalization ― It is a step of hyperparameter γ,β that normalizes the batch {xi}. By noting μB,σ2B the mean and variance of that we want to correct to the batch, it is done as follows:**
⟶ Normalisasi batch - Normalisasi batch adalah sebuah langkah untuk menormalisasi batch {xi}. Dengan mendefinisikan μB,σ2B sebagai nilai rata-rata dan variansi dari batch yang ingin kita normalisasi, hal tersebut dapat dilakukan dengan cara:
Review comment:
Normalisasi batch - Normalisasi batch adalah sebuah langkah dari hiperparameter γ,β untuk menormalisasi batch {xi}. Dengan mendefinisikan μB,σ2B sebagai nilai rata-rata dan variansi dari batch yang ingin kita normalisasi, hal tersebut dapat dilakukan dengan cara:
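For reference, the transform that item 22 leads into is the standard batch normalization step (ε is a small constant added for numerical stability):

$$x_i \leftarrow \gamma\,\frac{x_i - \mu_B}{\sqrt{\sigma_B^2 + \epsilon}} + \beta$$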
id/cheatsheet-deep-learning.md (Outdated)

**23. It is usually done after a fully connected/convolutional layer and before a non-linearity layer and aims at allowing higher learning rates and reducing the strong dependence on initialization.**
⟶ Batch normalisasi biasa ditempatkan setelah sebuah layer fully-connected atau convolutional dan sebelum sebuah non-linear layer yang bertujun untuk memungkinkannya penggunaan nilai learning rate yang lebih tinggi dan mengurangi ketergantungan model pada nilai inisialisasi parameter.
Review comment:
Normalisasi batch biasa ditempatkan setelah sebuah layer yang sepenuhnya terhubung/konvolusi dan sebelum sebuah layer non-linear yang bertujuan untuk memungkinkannya penggunaan nilai learning rate yang lebih tinggi dan mengurangi ketergantungan kuat pada nilai inisialisasi parameter.
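A minimal PyTorch-style sketch of the ordering item 23 describes (layer sizes are arbitrary and only for illustration, assuming torch is available):

```python
import torch.nn as nn

# item 23: affine layer -> batch normalization -> non-linearity
block = nn.Sequential(
    nn.Linear(256, 128),   # fully connected layer
    nn.BatchNorm1d(128),   # batch normalization before the non-linearity
    nn.ReLU(),             # non-linearity layer
)
```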
id/cheatsheet-deep-learning.md (Outdated)

**24. Recurrent Neural Networks**
⟶ Recurrent Neural Networks
Review comment:
Recurrent Neural Networks
id/cheatsheet-deep-learning.md (Outdated)

**20. Convolutional Neural Networks**
⟶ Convolutional Neural Network
Review comment:
Convolutional Neural Network
Gun, I've already fixed this, can we close it?
Sure, Bang, if there are no more additions or fixes.
Thank you @gitarja and @GunawanTri for all your work! @gitarja: it seems there are a few other unresolved discussions left. Please feel free to let me know whenever you are ready for the merge!