Fall 2018

Day 22 = Day -8, Thursday, Nov 8

Supplementary reference: CS231n, a superb course from Stanford. Example lecture

the goal

a good starting point?

**Exercise:** Compute the partial derivatives of the softmax function, expressing your answers
in as simple a form as possible.
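A numerical check for your answer (names `softmax` and `softmax_jacobian` are mine, not from the notes): the standard simplified form is ∂sᵢ/∂zⱼ = sᵢ(δᵢⱼ − sⱼ), compared here against central finite differences.

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability; the result is unchanged
    e = np.exp(z - z.max())
    return e / e.sum()

def softmax_jacobian(z):
    # d s_i / d z_j = s_i * (delta_ij - s_j)
    s = softmax(z)
    return np.diag(s) - np.outer(s, s)

# Sanity check against central finite differences
z = np.array([1.0, 2.0, 0.5])
J = softmax_jacobian(z)
eps = 1e-6
J_num = np.empty_like(J)
for j in range(z.size):
    dz = np.zeros_like(z)
    dz[j] = eps
    J_num[:, j] = (softmax(z + dz) - softmax(z - dz)) / (2 * eps)
print(np.abs(J - J_num).max())  # expect a tiny value (finite-difference error)
```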

loss landscape

The negative of the gradient of the loss points in the direction of steepest decrease: step that way.
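In code, "step opposite the gradient" is a one-line update. A toy sketch on a quadratic bowl (the function, learning rate, and names here are mine, chosen only to illustrate):

```python
import numpy as np

def grad_descent(grad, x0, lr=0.1, steps=100):
    # Repeatedly step in the direction of steepest decrease: -grad
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Toy loss f(x, y) = (x - 3)^2 + 2*(y + 1)^2, minimum at (3, -1)
grad = lambda p: np.array([2 * (p[0] - 3), 4 * (p[1] + 1)])
print(grad_descent(grad, [0.0, 0.0]))  # converges to roughly [3, -1]
```

Too large a learning rate overshoots the valley; too small a rate crawls. That tradeoff is what the topographic-map exercise below is meant to build intuition for.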

**Exercise:** Use Inkscape or another drawing program to trace a path of
gradient descent starting from the tip of the arrow on
this topographic map.

Upload your edited image to UBlearns when you are done.

computing the gradient

(to be continued, using *back-propagation* on the computational graph)
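A preview of the idea on a graph small enough to do by hand (a toy example of mine, not the lecture's): forward pass stores intermediates, backward pass applies the chain rule node by node in reverse.

```python
# Tiny computational graph: L = (w*x + b - y)^2
w, x, b, y = 2.0, 3.0, 1.0, 5.0

# forward pass
z = w * x + b        # 7.0
r = z - y            # residual: 2.0
L = r ** 2           # loss: 4.0

# backward pass (chain rule, traversing the graph in reverse)
dL_dr = 2 * r        # dL/dr = 2r
dL_dz = dL_dr * 1    # r = z - y, so dr/dz = 1
dL_dw = dL_dz * x    # z = w*x + b, so dz/dw = x
dL_db = dL_dz * 1    # dz/db = 1
print(dL_dw, dL_db)  # 12.0 4.0
```

The same local-derivative-times-upstream-gradient pattern scales to arbitrarily deep graphs.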

regularization: disfavor extreme weights that do well on the training data but poorly on everything else
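One common way to disfavor such weights is an L2 penalty on the weight vector, added to the data loss. A minimal sketch (names and λ value are mine):

```python
import numpy as np

def l2_loss(data_loss, w, lam=0.01):
    # lam * ||w||^2 penalizes large weights; lam trades fit vs. simplicity
    return data_loss + lam * np.sum(w ** 2)

def l2_grad(data_grad, w, lam=0.01):
    # the penalty contributes 2 * lam * w to the gradient ("weight decay")
    return data_grad + 2 * lam * w

w = np.array([1.0, -2.0])
print(l2_loss(0.5, w))  # 0.5 + 0.01 * (1 + 4) = 0.55
```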

polynomial-fitting analogy
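The analogy can be made concrete with NumPy's `polyfit` (an illustrative setup of mine, not from the lecture): a low-degree polynomial fits noisy samples of a sine smoothly, while a degree-9 polynomial on 10 samples threads every point, noise included, and tends to generalize worse.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 10)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)

# Dense grid for measuring error against the noiseless truth
x_fine = np.linspace(0, 1, 200)
truth = np.sin(2 * np.pi * x_fine)

errs = {}
for deg in (3, 9):
    c = np.polyfit(x, y, deg)
    train = np.mean((np.polyval(c, x) - y) ** 2)       # error on the samples
    test = np.mean((np.polyval(c, x_fine) - truth) ** 2)  # error off the samples
    errs[deg] = (train, test)
    print(deg, train, test)
```

Degree 9 drives the training error to essentially zero (it interpolates all 10 points), which is exactly the "crazy weights" failure mode: perfect on the training data, unreliable elsewhere.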

I will write more.