MTH 448/563 Data-Oriented Computing

Fall 2018

Day 22 = Day -8, Thursday, Nov 8

The dark side

Chinese tech IDs people by how they walk

a neural network for object classification, cont'd

Supplementary reference: CS231n, a superb course from Stanford. Example lecture

class scores

the goal

weights

a good starting point?

a "loss" function

softmax
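A minimal NumPy sketch of the softmax map from raw class scores to probabilities (the function name and example scores are mine, not from the lecture). Subtracting the maximum score first is the standard trick to avoid overflow in `exp`; it does not change the result because softmax is invariant under shifting all scores by a constant.

```python
import numpy as np

def softmax(scores):
    """Map a vector of raw class scores to class probabilities.

    Subtracting the max score avoids overflow in exp() without
    changing the output (softmax is shift-invariant).
    """
    shifted = scores - np.max(scores)
    exps = np.exp(shifted)
    return exps / np.sum(exps)

# example: three class scores
p = softmax(np.array([2.0, 1.0, 0.1]))
# p is positive, sums to 1, and preserves the ordering of the scores
```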

Exercise: Compute the partial derivatives of the softmax function, expressing your answers in as simple a form as possible.

cross entropy
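A one-line sketch of cross-entropy loss for a single example, assuming the softmax probabilities from the previous step (the epsilon guard is my addition, to keep `log(0)` from blowing up):

```python
import numpy as np

def cross_entropy(probs, correct_class):
    """Cross-entropy loss for one example: minus the log of the
    probability assigned to the correct class. A tiny epsilon
    guards against log(0)."""
    eps = 1e-12
    return -np.log(probs[correct_class] + eps)
```

The loss is 0 when the correct class gets probability 1 and grows without bound as that probability approaches 0.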

total loss
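Putting the two pieces together: the total loss is typically the mean cross-entropy over a batch of examples. A vectorized NumPy sketch (names and the log-sum-exp formulation are mine; computing log-probabilities directly is numerically safer than exponentiating and then taking the log):

```python
import numpy as np

def total_loss(scores_batch, labels):
    """Mean cross-entropy over a batch.

    scores_batch: (N, C) array, one row of class scores per example.
    labels:       length-N array of correct class indices.
    """
    shifted = scores_batch - scores_batch.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()
```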

minimizing loss

loss landscape


gradient descent

The negative of the gradient of the loss points in the direction of steepest decrease: step that way.

Exercise: Use Inkscape (or another drawing program) to draw a path of gradient descent starting from the tip of the arrow on this topographic map.

[image: steepest_descent_exercise.png]

Upload your edited image to UBlearns when you are done.

computing the gradient

(to be continued, using back-propagation on the computational graph)
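Before back-propagation, it is worth knowing the brute-force alternative: centered finite differences. It is far too slow to train with, but it is the standard way to sanity-check an analytic gradient. A sketch (the function name and the test function are mine):

```python
import numpy as np

def numerical_gradient(f, w, h=1e-5):
    """Approximate the gradient of f at w by centered differences.

    Slow (two evaluations of f per coordinate), but simple, and
    useful for checking a back-propagated gradient.
    """
    g = np.zeros_like(w)
    for i in range(w.size):
        wp, wm = w.copy(), w.copy()
        wp.flat[i] += h
        wm.flat[i] -= h
        g.flat[i] = (f(wp) - f(wm)) / (2 * h)
    return g
```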

regularization

try to disfavor extreme weights that do well on the training data but poorly on everything else
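The most common way to do this is an L2 penalty: add a multiple of the sum of squared weights to the data loss, so that large weights cost something. A sketch (the function name and the strength parameter `lam` are mine):

```python
import numpy as np

def l2_penalty(W, lam):
    """L2 regularization term: lam * sum of squared weights.

    Added to the data loss, it pushes the optimizer toward small,
    spread-out weights instead of a few extreme ones.
    """
    return lam * np.sum(W * W)
```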

polynomial-fitting analogy
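The analogy in miniature: a high-degree polynomial can drive the training error to (near) zero by threading every data point, yet oscillate wildly in between, while a low-degree fit with larger training error generalizes better. A sketch using NumPy's `polyfit` (the data, degrees, and noise level are my choices):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 10)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(10)

# modest-degree fit vs. a degree-9 polynomial through 10 points
fit3 = np.polyfit(x, y, 3)
fit9 = np.polyfit(x, y, 9)

# worst-case residual on the training points
res3 = np.abs(np.polyval(fit3, x) - y).max()
res9 = np.abs(np.polyval(fit9, x) - y).max()
# the degree-9 fit nearly interpolates the data (tiny training error),
# the analogue of "crazy weights" that memorize the training set
```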

LeafSnap quiz


I will write more.