Course Catalog 2010-2011
MAT-51706 Bayesian Methods, 6 cr
Person responsible
Robert Piche
Lessons
Study type | P1 | P2 | P3 | P4 | Summer | Implementations | Lecture times and places |
Requirements
Exam, or weekly exercises + exam.
Completion parts must belong to the same implementation.
Principles and baselines related to teaching and learning
-
Learning outcomes
Bayesian statistics is an alternative approach to classical statistics. It is a coherent data analysis methodology based on the systematic application of the laws of probability. Since the development of powerful computational algorithms beginning in the 1990s, Bayesian methods have become widely used in all areas of science and engineering, including machine learning, medical imaging, and data compression. After studying this course, the student is able to formulate statistical models for inference, hypothesis testing, model comparison, and decisions. Given a data set, he/she can apply formulas for the solution of simple standard models, and can write WinBUGS programs and MCMC algorithms for the solution of more realistic models. The prerequisite for the course is knowledge of elementary probability theory: probability calculus; pdfs and pmfs; mean and variance; change of variables; the normal, binomial, Poisson, and exponential distributions. Knowledge of "classical" statistics is not needed. Teaching is in English. Exam questions are in both Finnish and English and may be answered in either language.
Content
Content | Core content | Complementary knowledge | Specialist knowledge |
1. | Formulation of standard parameter-inference problems: finding the mean of real data (normal model), the proportion of count data (binomial model, multinomial model), the rate of occurrences (Poisson model), or the lifetime (exponential model). Derivation of the exact solution (i.e. the parameters' posterior distribution and the data's posterior predictive distribution) using conjugate priors. Writing of WinBUGS and DoodleBUGS programs to compute the solution numerically. Computation of summary statistics (mean, mode, median, credible intervals) and one-sided hypotheses (see the conjugate-update sketch after this table) | Recursive use of Bayes' rule; prior predictive distribution; beta, gamma, inverse-gamma, Student-t, and Dirichlet distributions; Poisson model for occurrences in intervals of different sizes; coping with censored lifetime data; derivation of the formulas for the mean and variance of the beta and gamma distributions | Proofs of probability calculus theorems from axioms; eliciting subjective probability from betting odds; negative binomial and beta-binomial distributions; derivation of the Poisson density; hazard and reliability functions; derivation of the posterior marginal distribution for the two-parameter normal model |
2. | Formulation of more complex inference problems: Gaussian mixture for coping with outliers; comparing means; hierarchical model for comparing groups; fitting a regression curve; fitting an autoregressive model; detecting a change in rate. Writing of WinBUGS programs and Gibbs sampler algorithms for their solution (see the Gibbs-sampler sketch after this table) | Comparing means in models with paired observations; Markov chains and their stationary distributions | The joint density corresponding to a directed acyclic graph (DAG); proof that the Gibbs sampler's stationary distribution is the posterior |
3. | Bayesian theory: Jeffreys' prior; the stopping rule principle; Laplace's method for approximating the posterior; model comparison via the Bayes factor and via the Deviance Information Criterion (DIC) | Consistency of Jeffreys' prior under change of variables; the likelihood principle; applications of exact marginalisation: detecting a change point, finding the frequency of a periodic signal, finding the autoregression parameter, choosing the regularisation parameter | Ancillary data; sufficient statistics |
4. | Finding the decision that minimizes the expected loss and finding the Bayesian decision function for problems with a finite number of discrete variables and options | Prior value of data; signal detection; decision-theoretic interpretation of the mean, mode, and median | Derivation of the mean, mode, and median |
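
To make the core content of row 1 concrete, here is a minimal sketch of the conjugate beta-binomial update in Python rather than WinBUGS; the data (7 successes in 20 trials) and the uniform Beta(1, 1) prior are hypothetical values chosen only for illustration.

# Conjugate update for a binomial proportion with a beta prior (illustrative values).
from scipy import stats

a0, b0 = 1.0, 1.0                      # Beta(1, 1) prior, i.e. uniform on [0, 1]
k, n = 7, 20                           # hypothetical data: 7 successes in 20 trials
a_post, b_post = a0 + k, b0 + n - k    # posterior is Beta(a0 + k, b0 + n - k)

posterior = stats.beta(a_post, b_post)
print("posterior mean:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))

The same closed-form update underlies the summary statistics and one-sided hypotheses listed in the core content; for example, P(theta > 0.5 | data) is 1 - posterior.cdf(0.5).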
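
Likewise, for the Gibbs sampler algorithms of row 2, the following is a small illustrative sketch in Python for a normal model with unknown mean and precision under independent normal and gamma priors; the simulated data, prior parameters, and iteration counts are arbitrary assumptions for this example, not part of the course material.

import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(2.0, 1.5, size=50)      # simulated data, for illustration only
n, ybar = len(y), y.mean()

# Assumed priors: mu ~ N(m0, s0^2), tau ~ Gamma(a0, rate=b0)
m0, s0 = 0.0, 10.0
a0, b0 = 0.01, 0.01

mu, tau = ybar, 1.0                    # initial values
draws = []
for _ in range(5000):
    # Full conditional of mu: normal with precision 1/s0^2 + n*tau
    prec = 1.0 / s0**2 + n * tau
    mean = (m0 / s0**2 + tau * n * ybar) / prec
    mu = rng.normal(mean, np.sqrt(1.0 / prec))
    # Full conditional of tau: Gamma(a0 + n/2, rate = b0 + 0.5*sum((y - mu)^2))
    rate = b0 + 0.5 * np.sum((y - mu) ** 2)
    tau = rng.gamma(a0 + n / 2.0, 1.0 / rate)   # numpy's gamma takes shape and scale
    draws.append(mu)

mu_draws = np.array(draws[1000:])      # discard burn-in draws
print("posterior mean of mu:", mu_draws.mean())
print("95% credible interval:", np.percentile(mu_draws, [2.5, 97.5]))

In the course itself such models are written as WinBUGS programs; this sketch only shows the alternating draws from the full conditionals that a Gibbs sampler performs.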
Evaluation criteria for the course
The course grade is based on a three-hour open-book exam written in a PC lab that allows access to WinBUGS. Bonus points (up to 20%) can be earned by presenting solutions to the weekly homework problems.
Assessment scale:
Numerical evaluation scale (1-5) will be used in the course
Partial passing:
Study material
Type | Name | Author | ISBN | URL | Edition, availability, ... | Examination material | Language |
Other online content | Video lectures (Flash), course notes, exercises (PDF) | Robert Piche | | | | | English |
Prerequisites
Course | Mandatory/Advisable | Description |
MAT-20501 Todennäköisyyslaskenta (Probability Calculus) | Mandatory |
Prerequisite relations (Requires logging in to POP)
Correspondence of content
Course | Corresponds course | Description |
Additional information
Suitable for postgraduate studies
More precise information per implementation
Implementation | Description | Methods of instruction |
Recorded lectures are available for self-study on the course home page. Exercises are held on Tuesdays at 10-12 in room Td308 during periods 1-2 of 2010. The first exercise session is on Sep 7. |