CS 461: Machine Learning
Instructor: Kiri Wagstaff

Reading Questions for Lecture 6

Parametric Methods (Ch. 4.1-4.5)
  1. Why do we compute the log likelihood of the data instead of just computing the likelihood? (See the first sketch after this list.)
  2. What is the "parameter" required to specify a Bernoulli distribution?
  3. If I flip a coin and get { heads, tails, tails, heads, tails }, what is the maximum likelihood estimate of the probability of getting heads? (See the worked check after this list.)
  4. What two parameters are needed to specify a Gaussian distribution?
  5. The Bayes Estimator is a weighted average of what two values? (See the sketch after this list.)
  6. (Robert) Do you always want to select an unbiased estimator over a biased estimator?
  7. (Deidre) Can we use MAP or the Bayes Estimator as a tool for validation or verification?
  8. (Roice) How can the sample average "m" be considered an unbiased estimator of the mean "μ" when it is computed from a finite number of samples? (See the simulation sketch after this list.)
  9. (Roice) Does "bias(d) = E[d(X)] - θ" have the same meaning as "E[m] = E[(1/N) Σ_t x^t] = μ"?
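
Sketch for question 1 (a minimal Python illustration with hypothetical data, not from the text): multiplying thousands of per-sample likelihoods underflows floating point to zero, while summing log likelihoods stays finite and, because log is monotonic, preserves the same maximizer.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, size=10_000)        # hypothetical sample

    # Per-sample likelihoods under a standard Gaussian
    p = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)

    print(np.prod(p))         # underflows to 0.0
    print(np.sum(np.log(p)))  # finite; same argmax, so the MLE is unchanged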
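
Worked check for question 3 (heads encoded as 1, tails as 0): the maximum likelihood estimate of a Bernoulli parameter is the sample proportion, here 2 heads out of 5 flips.

    flips = [1, 0, 0, 1, 0]          # heads, tails, tails, heads, tails
    p_hat = sum(flips) / len(flips)  # MLE: fraction of heads = 2/5 = 0.4
    print(p_hat)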
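
Sketch for question 5, assuming the known-variance Gaussian case of Ch. 4 (the function name and parameter values are illustrative): the Bayes estimator of the mean is a precision-weighted average of the sample mean m and the prior mean mu0.

    import numpy as np

    def bayes_estimate(x, sigma2, mu0, sigma0_2):
        """Posterior mean of theta for x^t ~ N(theta, sigma2) with
        prior theta ~ N(mu0, sigma0_2): a weighted average of the
        sample mean and the prior mean."""
        N = len(x)
        m = np.mean(x)
        w = (N / sigma2) / (N / sigma2 + 1.0 / sigma0_2)
        return w * m + (1.0 - w) * mu0

    # More data (larger N) or a vaguer prior (larger sigma0_2)
    # shifts the weight toward the sample mean.
    print(bayes_estimate([4.1, 3.8, 4.3], sigma2=1.0, mu0=0.0, sigma0_2=1.0))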
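
Simulation sketch for questions 8 and 9 (hypothetical μ, σ, and N): unbiasedness, bias(d) = E[d(X)] - θ = 0, is a statement about the average of m over many repeated samples, not about any single finite sample, so no infinite sample is needed. Averaging m across many simulated datasets of size N = 10 lands near μ even though each individual m varies.

    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma, N = 3.0, 2.0, 10                   # hypothetical true values, small N
    datasets = rng.normal(mu, sigma, size=(100_000, N))
    m = datasets.mean(axis=1)                     # one sample average per dataset

    print(m.mean())  # ~3.0: E[m] = mu, so bias(m) = 0 even for finite N
    print(m.std())   # nonzero: any single m still scatters around mu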