There are a few minor errors in the material for the Stanford Machine Learning Class (ml-class.com)...
Homework 1 (ex1.pdf), Section 2.2.1 -- Update Equations, p. 6, formula at top of page
IS: the parentheses under the summation exclude the x_j^(i) term
SHOULD BE: the summation's parentheses should include the x_j^(i) term in the sum, though the (i) superscript index already makes the intent clear
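For reference, the update equation I believe the section intends, with x_j^(i) inside the sum, written in LaTeX:

\theta_j := \theta_j - \alpha \, \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}    (updated simultaneously for all j)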
Homework 1 (ex1.pdf), Section 3.1 -- Feature Normalization, p. 10, second bullet at bottom of page
IS: scale (divide) the feature values by the inverse of their respective "standard deviations."
SHOULD BE: scale (multiply) the feature values by the inverse of their respective "standard deviations."
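Either way, the intended operation (as I read Section 3.1) is the usual normalization, in LaTeX:

x_j^{(i)} := \frac{x_j^{(i)} - \mu_j}{\sigma_j}

that is, subtract the mean mu_j of feature j and then divide by its standard deviation sigma_j (equivalently, multiply by 1/sigma_j).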
Class Material Lecture 4 (Lecture4.pdf) -- p. 16/31, right hand side
and
Video 04.4 Linear Regression With Multiple Variables -- Gradient Descent In Practice II, Learning Rate.mp4, 05:00/08:58
IS: The axes of the plot are mislabeled. Dr. Ng is plotting in pink and talking about the relationship between J(theta) and the parameter x, *not* the number of iterations.
SHOULD BE: To be consistent, Dr. Ng should probably show the plot of J vs. iterations so that we can recognize the signature if we ever see it in our own debugging. He could also overlay the plot of J vs. x to explain the source of the problem (alpha too big).
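For illustration only (not from the course materials, which use Octave), here is a minimal NumPy/matplotlib sketch of that debugging plot: J(theta) against the iteration number for a few learning rates on made-up toy data. The alpha values and the data are my own assumptions; the largest alpha is chosen so that gradient descent diverges, producing the "J blows up with iterations" signature that indicates alpha is too big.

import numpy as np
import matplotlib.pyplot as plt

def cost(theta, X, y):
    # Squared-error cost J(theta) for linear regression.
    m = len(y)
    residual = X @ theta - y
    return residual @ residual / (2 * m)

def gradient_descent(X, y, alpha, num_iters):
    # Batch gradient descent, recording J(theta) after every update.
    m, n = X.shape
    theta = np.zeros(n)
    J_history = []
    for _ in range(num_iters):
        grad = X.T @ (X @ theta - y) / m
        theta = theta - alpha * grad
        J_history.append(cost(theta, X, y))
    return theta, J_history

# Toy data (my own): y = 1 + 2x plus noise, with a bias column prepended.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 1 + 2 * x + rng.normal(scale=0.5, size=50)
X = np.column_stack([np.ones_like(x), x])

# Small alphas converge; the largest one overshoots and J grows with iterations.
for alpha in (0.001, 0.01, 0.08):
    _, J_history = gradient_descent(X, y, alpha, num_iters=50)
    plt.plot(J_history, label=f"alpha = {alpha}")

plt.xlabel("number of iterations")
plt.ylabel("J(theta)")
plt.yscale("log")
plt.legend()
plt.show()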