Reading Questions for Lecture 6
Parametric Methods (Ch. 4.1-4.5)
 - Why do we compute the log likelihood of the data instead of just computing the likelihood?
 - What is the "parameter" required to specify a Bernoulli distribution?
 - If I flip a coin and get { heads, tails, tails, heads, tails }, what is the maximum likelihood estimate of the probability of getting heads?
 - What two parameters are needed to specify a Gaussian distribution?
 - The Bayes Estimator is a weighted average of what two values?
 - (Robert) Do you always want to select an unbiased estimator over a biased estimator?
 - (Deidre) Can we use MAP or the Bayes Estimator as a tool for validation or verification?
 - (Roice) How can the sample average "m" be considered an unbiased estimator of the mean "μ" unless it has an infinite number of samples?
 - (Roice) Does "bias(d) = E[d(X)] - θ" have the same meaning as "E[m] = E[Σ xt/N] = μ"?
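For the coin-flip question above, a quick sketch of the maximum likelihood estimate, encoding heads as 1 and tails as 0 (the MLE for a Bernoulli parameter is just the sample proportion of successes):

```python
# Observed flips: heads, tails, tails, heads, tails (heads = 1, tails = 0)
data = [1, 0, 0, 1, 0]

# MLE of P(heads) for a Bernoulli is the fraction of successes.
p_hat = sum(data) / len(data)
print(p_hat)  # 0.4
```

With 2 heads in 5 flips, the estimate is 2/5 = 0.4.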
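On the Bayes Estimator question: for a Gaussian sample with known variance and a Gaussian prior on the mean, the posterior mean is a weighted average of the sample mean m and the prior mean. A minimal sketch, with made-up numbers for the prior and data (all values here are illustrative assumptions, not from the text):

```python
# Hypothetical prior and data for illustration.
mu0, sigma0_sq = 0.0, 1.0      # prior mean and prior variance of the mean
sigma_sq = 1.0                 # known variance of each observation
data = [2.1, 1.9, 2.3, 2.0]    # made-up sample
N = len(data)
m = sum(data) / N              # sample mean

# Posterior mean = weighted average of sample mean m and prior mean mu0,
# with weights given by the precisions (inverse variances).
w = (N / sigma_sq) / (N / sigma_sq + 1 / sigma0_sq)
bayes_est = w * m + (1 - w) * mu0
print(bayes_est)
```

As N grows, the weight w approaches 1 and the estimator approaches the sample mean; with little data it is pulled toward the prior mean.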
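On the last two questions: "unbiased" means E[m] = μ for any fixed sample size N, not that m equals μ in a single finite sample. A small simulation sketch (parameters are arbitrary choices for illustration) shows that the sample mean fluctuates from trial to trial, yet its average over many trials sits close to μ even with only N = 5 samples per trial:

```python
import random

random.seed(0)
mu, sigma = 3.0, 1.0   # true mean and standard deviation (arbitrary)
N, trials = 5, 10000   # small samples, many repetitions

# Each trial computes the sample mean m of N Gaussian draws.
means = []
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(N)]
    means.append(sum(sample) / N)

# E[m] is approximated by averaging m across trials; it should be near mu.
avg_of_means = sum(means) / trials
print(avg_of_means)
```

This is exactly the statement bias(d) = E[d(X)] - θ = 0 specialized to d = m and θ = μ: each individual m misses μ, but the misses average out to zero in expectation.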