Lec_6_Training Neural Networks, Part II

Happy Moment: kung-fu mouse, check out my chokehold!


Parameter Updates

The common method: vanilla SGD, which simply steps in the direction of the negative gradient.
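In code this is a one-liner; a self-contained numpy-style sketch (the function wrapper and default step size are my own, not from the slides):

```python
import numpy as np

def sgd_step(x, dx, learning_rate=1e-2):
    """One vanilla SGD step: move the parameters x against the gradient dx."""
    return x - learning_rate * dx
```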

Update problem-1 TOO SLOW

Vanilla SGD makes very slow progress along shallow, flat directions of the loss while jittering back and forth across steep ones.


Momentum update

  • Physical interpretation: a ball rolling down the loss surface, with friction (the mu coefficient).
  • mu is usually ~0.5, 0.9, or 0.99 (sometimes annealed over time, e.g. from 0.5 to 0.99).
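A numpy-style sketch of the update (the function wrapper and default hyperparameter values are illustrative): the velocity v accumulates gradients over time, and mu acts as friction that decays it.

```python
def momentum_step(x, dx, v, learning_rate=1e-2, mu=0.9):
    """One momentum update step; v starts at zero."""
    v = mu * v - learning_rate * dx  # integrate velocity
    x = x + v                        # integrate position
    return x, v
```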

Nesterov Momentum update

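Nesterov momentum evaluates the gradient at the "look-ahead" point x + mu * v instead of at x, which tends to converge slightly better in practice. A sketch, where grad_fn is an assumed function returning the gradient at a given point:

```python
def nesterov_step(x, v, grad_fn, learning_rate=1e-2, mu=0.9):
    """One Nesterov momentum step; grad_fn(x) -> gradient of the loss at x."""
    dx_ahead = grad_fn(x + mu * v)         # gradient at the look-ahead point
    v = mu * v - learning_rate * dx_ahead  # integrate velocity
    return x + v, v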

AdaGrad update

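AdaGrad keeps a per-parameter cache of squared gradients, so parameters with large accumulated gradients get smaller effective step sizes. The caveat the lecture points out: the cache only grows, so the steps eventually shrink toward zero. A sketch (defaults are illustrative):

```python
import numpy as np

def adagrad_step(x, dx, cache, learning_rate=1e-2, eps=1e-7):
    """One AdaGrad step: per-parameter rates from accumulated squared gradients."""
    cache = cache + dx ** 2
    x = x - learning_rate * dx / (np.sqrt(cache) + eps)
    return x, cache
```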

RMSProp update

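RMSProp fixes AdaGrad's vanishing step sizes by making the cache "leaky" with a decay rate (typically around 0.9 to 0.999). A sketch (defaults are illustrative):

```python
import numpy as np

def rmsprop_step(x, dx, cache, learning_rate=1e-3, decay_rate=0.99, eps=1e-7):
    """One RMSProp step: a leaky moving average of squared gradients."""
    cache = decay_rate * cache + (1 - decay_rate) * dx ** 2
    x = x - learning_rate * dx / (np.sqrt(cache) + eps)
    return x, cache
```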

Adam update

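Adam combines the momentum idea (a first-moment estimate m) with RMSProp-style scaling (a second-moment estimate v), plus a bias correction that matters in the first iterations. A sketch, with the commonly used default values:

```python
import numpy as np

def adam_step(x, dx, m, v, t, learning_rate=1e-3,
              beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step; t is the 1-based iteration count used for bias correction."""
    m = beta1 * m + (1 - beta1) * dx         # momentum-like first moment
    v = beta2 * v + (1 - beta2) * (dx ** 2)  # RMSProp-like second moment
    mb = m / (1 - beta1 ** t)                # bias-corrected first moment
    vb = v / (1 - beta2 ** t)                # bias-corrected second moment
    x = x - learning_rate * mb / (np.sqrt(vb) + eps)
    return x, m, v
```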

Update problem-2 HYPERPARAMETER NEEDED

Every update rule above still depends on a learning rate that must be chosen by hand; the standard remedy from the lecture is to decay it over time rather than keep it fixed.

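The common schedules are step decay, exponential decay, and 1/t decay. A sketch of all three, where every constant is an illustrative choice of mine:

```python
import numpy as np

def decayed_lr(base_lr, epoch, k=0.1):
    """Three common learning-rate schedules (constants are illustrative)."""
    step = base_lr * (0.5 ** (epoch // 10))  # step decay: halve every 10 epochs
    exp  = base_lr * np.exp(-k * epoch)      # exponential decay
    inv  = base_lr / (1 + k * epoch)         # 1/t decay
    return step, exp, inv
```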

Second order optimization methods

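Second-order methods use curvature: Newton's method updates x <- x - H^{-1} * grad f(x), which needs no learning rate at all. The catch is that for N parameters the Hessian H has N^2 entries, which is hopeless for networks with millions of weights. A toy sketch on a 2-D quadratic:

```python
import numpy as np

# Newton's method on the quadratic f(x) = 0.5 x^T A x - b^T x
# (gradient: A x - b, Hessian: A); it lands on the minimum in one step.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = np.zeros(2)

grad = A @ x - b
x = x - np.linalg.solve(A, grad)  # x <- x - H^{-1} * grad
print(x)  # the exact minimizer of f
```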

L-BFGS

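L-BFGS approximates the inverse Hessian from a short history of recent gradients, so it never forms H explicitly; it works best in the deterministic, full-batch setting. A minimal sketch using scipy's existing L-BFGS-B implementation on a toy objective of my own:

```python
import numpy as np
from scipy.optimize import minimize

def f(x):  # a toy deterministic (full-batch, noise-free) objective
    return float(np.sum((x - 3.0) ** 2))

def grad(x):
    return 2.0 * (x - 3.0)

res = minimize(f, x0=np.zeros(5), jac=grad, method='L-BFGS-B')
print(res.x)  # converges to ~[3. 3. 3. 3. 3.]
```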

Summary for update problems 1+2

IN PRACTICE
- Adam is a good default choice in most cases
- If you can afford to do full batch updates then try out L-BFGS (and don’t forget to disable all sources of noise)

Evaluation: Model Ensembles

  1. Train multiple independent models
  2. At test time average their results
  3. Enjoy 2% extra performance

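A minimal sketch of step 2, averaging predicted class probabilities; the predict_proba method on each model is an assumed interface, not something from the lecture:

```python
import numpy as np

def ensemble_predict(models, X):
    """Average class probabilities over models, then take the argmax per example.
    Assumes each model exposes predict_proba(X) -> array of shape (N, C)."""
    probs = np.mean([m.predict_proba(X) for m in models], axis=0)
    return np.argmax(probs, axis=1)
```

The lecture also mentions cheaper variants, e.g. averaging several checkpoints of a single model, or keeping a running average of the parameter vector and using it at test time.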

Regularization (Dropout)

Regularization: Dropout "randomly set some neurons to zero in the forward pass"
The purpose of Dropout is to prevent overfitting.

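A sketch of "inverted" dropout in a two-layer numpy network (the keep probability p and the layer shapes are illustrative): the mask is scaled by 1/p at training time, so the test-time forward pass needs no change at all.

```python
import numpy as np

p = 0.5  # probability of keeping a unit (illustrative value)

def train_forward(X, W1, b1, W2, b2):
    H1 = np.maximum(0, X @ W1 + b1)           # first layer with ReLU
    U1 = (np.random.rand(*H1.shape) < p) / p  # inverted-dropout mask, scaled by 1/p
    H1 *= U1                                  # randomly zero out units
    return H1 @ W2 + b2

def test_forward(X, W1, b1, W2, b2):
    H1 = np.maximum(0, X @ W1 + b1)  # no dropout and no rescaling at test time
    return H1 @ W2 + b2
```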

Gradient Checking

  • see notes
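The notes check gradients by comparing the analytic gradient against a centered-difference numerical estimate via relative error; a self-contained sketch (helper names and the error threshold mentioned in the comment are my own framing):

```python
import numpy as np

def numerical_gradient(f, x, h=1e-5):
    """Centered-difference numerical gradient of scalar f at x (x is a float array)."""
    grad = np.zeros_like(x)
    it = np.nditer(x, flags=['multi_index'])
    while not it.finished:
        i = it.multi_index
        old = x[i]
        x[i] = old + h
        fp = f(x)                      # f(x + h)
        x[i] = old - h
        fm = f(x)                      # f(x - h)
        x[i] = old                     # restore
        grad[i] = (fp - fm) / (2 * h)  # centered difference
        it.iternext()
    return grad

def rel_error(analytic, numeric):
    """Max relative error; values around 1e-7 or less usually indicate a correct gradient."""
    return np.max(np.abs(analytic - numeric) /
                  np.maximum(1e-8, np.maximum(np.abs(analytic), np.abs(numeric))))
```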

Convolutional Neural Networks

A bit of history:

Tomorrow is the weekend. Have a good rest!