CS 461: Machine Learning
Instructor: Kiri Wagstaff

Reading Questions for Lecture 6

Parametric Methods (Ch. 4.1-4.5)
  1. Why do we compute the log likelihood of the data instead of just computing the likelihood?
  2. What is the "parameter" required to specify a Bernoulli distribution?
  3. If I flip a coin and get { heads, tails, tails, heads, tails }, what is the maximum likelihood estimate of the probability of getting heads? (A short numeric sketch follows this list.)
  4. What two parameters are needed to specify a Gaussian distribution?
  5. The Bayes Estimator is a weighted average of what two values?
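
The following is a minimal Python sketch, not taken from the reading, that may help you check your answers to questions 1, 3, and 4. It assumes heads is encoded as 1 and tails as 0 (a convention, not something the text mandates), uses math.prod (Python 3.8+), and the Gaussian sample numbers are made up purely for illustration.

    import math

    # Question 3: encode heads as 1, tails as 0.
    flips = [1, 0, 0, 1, 0]            # { heads, tails, tails, heads, tails }
    p_hat = sum(flips) / len(flips)    # Bernoulli MLE = sample mean = 2/5 = 0.4
    print("MLE of P(heads):", p_hat)

    # Question 1: the likelihood of the sample is a product of per-flip
    # probabilities; its log turns that product into a sum, which is easier
    # to differentiate and does not underflow for large samples.
    likelihood = math.prod(p_hat if x == 1 else 1 - p_hat for x in flips)
    log_likelihood = sum(math.log(p_hat if x == 1 else 1 - p_hat) for x in flips)
    print("likelihood:", likelihood, "log likelihood:", log_likelihood)

    # Question 4: for a Gaussian, the MLE of the mean is the sample mean and
    # the MLE of the variance is the average squared deviation from that mean.
    sample = [2.1, 1.9, 2.4, 2.0, 1.6]  # hypothetical data, for illustration only
    mu_hat = sum(sample) / len(sample)
    var_hat = sum((x - mu_hat) ** 2 for x in sample) / len(sample)
    print("MLE mean:", mu_hat, "MLE variance:", var_hat)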