S-114.1310 Introduction to Modeling and Information Theory
Fall 2007 (3 p)
The objective of this course is to provide an understanding of the basic principles of modeling, with emphasis on statistical modeling. Information theory and its applications in statistical modeling are also discussed. Practical solutions are given for model class selection, model complexity and parameter selection.
Schedule for the home assignment:
List of contents:
Description of course contents and some practical issues. What modeling is and some basic principles. Basic probability theory.
Introduction to basics of information theory. Basics of coding theory.
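As a small illustration of this lecture's topic (not part of the official course material), the Shannon entropy gives the lower bound on the expected code length per symbol for any lossless code; a minimal sketch in plain Python:

```python
import math

def entropy_bits(p):
    """Shannon entropy H(X) = -sum p_i log2 p_i, in bits per symbol."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin cannot be coded with fewer than 1 bit per outcome on average.
print(entropy_bits([0.5, 0.5]))   # 1.0
# A biased source can be coded more compactly (about 0.469 bits per symbol).
print(entropy_bits([0.9, 0.1]))
```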
Coding continued. Parameter estimation. Properties of estimators. Maximum likelihood and method of least squares.
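To illustrate the connection covered in this lecture: for a linear model with Gaussian noise, the maximum likelihood estimate of the parameters coincides with the least squares solution. A hedged sketch using NumPy (the simulated data and the true parameters 2 and 1 are assumptions for the example):

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated data from y = 2x + 1 plus Gaussian noise; under this noise
# model the maximum likelihood estimate equals the least squares fit.
x = np.linspace(0, 1, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=x.size)

# Design matrix [x, 1]; solve min ||A theta - y||^2.
A = np.column_stack([x, np.ones_like(x)])
theta, *_ = np.linalg.lstsq(A, y, rcond=None)
print(theta)  # close to [2.0, 1.0]
```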
Estimation theory continues: Cramer-Rao inequality and Fisher information. Bayesian inference: MAP estimation. Linear models.
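A short sketch of MAP estimation for a linear model, as discussed in this lecture: with Gaussian noise and a zero-mean Gaussian prior on the weights, the MAP estimate is ridge regression. The data and the penalty values below are illustrative assumptions:

```python
import numpy as np

def map_linear(A, y, alpha):
    """MAP estimate for a linear-Gaussian model with a zero-mean Gaussian
    prior on the weights; equivalent to ridge regression with penalty alpha."""
    d = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(d), A.T @ y)

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 30)
A = np.column_stack([x, np.ones_like(x)])
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=x.size)

print(map_linear(A, y, alpha=0.0))   # alpha = 0 recovers ML / least squares
print(map_linear(A, y, alpha=10.0))  # the prior shrinks the estimate toward zero
```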
Model complexity and model selection. Cross-validation.
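As an informal illustration of this lecture's topic: k-fold cross-validation estimates prediction error by repeatedly fitting on part of the data and testing on the held-out rest. A sketch with NumPy, using polynomial degree as the model complexity to be selected (the simulated quadratic data are an assumption for the example):

```python
import numpy as np

def kfold_mse(x, y, degree, k=5, seed=0):
    """Estimate prediction error of a degree-`degree` polynomial fit
    by k-fold cross-validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(x))
    folds = np.array_split(idx, k)
    errs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        coeffs = np.polyfit(x[train], y[train], degree)
        pred = np.polyval(coeffs, x[test])
        errs.append(np.mean((pred - y[test]) ** 2))
    return float(np.mean(errs))

rng = np.random.default_rng(2)
x = np.linspace(-1, 1, 60)
y = x**2 + rng.normal(scale=0.05, size=x.size)
# The quadratic should beat the straight line on held-out data.
print(kfold_mse(x, y, degree=1), kfold_mse(x, y, degree=2))
```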
Bootstrap. Akaike's information criterion.
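A quick illustration of the bootstrap idea from this lecture: the spread of a statistic over resampled data sets approximates its sampling variability. A sketch with NumPy, using the standard error of the sample mean (where the answer is known to be roughly sigma/sqrt(n)); the simulated data are an assumption:

```python
import numpy as np

def bootstrap_se(data, stat, n_boot=2000, seed=0):
    """Bootstrap estimate of the standard error of `stat`: resample the
    data with replacement and measure the spread of the statistic."""
    rng = np.random.default_rng(seed)
    n = len(data)
    reps = [stat(rng.choice(data, size=n, replace=True)) for _ in range(n_boot)]
    return float(np.std(reps, ddof=1))

rng = np.random.default_rng(3)
data = rng.normal(loc=0.0, scale=2.0, size=100)
# For the sample mean the bootstrap SE should be close to 2/sqrt(100) = 0.2.
print(bootstrap_se(data, np.mean))
```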
Minimum Description Length principle and stochastic complexity.
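As a rough illustration of the two-part coding idea behind MDL (a crude BIC-style approximation, not the modern normalized maximum likelihood form of stochastic complexity covered in the lectures), one can score polynomial models by parameter bits plus residual bits; the simulated data and constants are assumptions for the example:

```python
import numpy as np

def two_part_dl(x, y, degree):
    """Crude two-part description length in bits: (k/2) log2 n for the
    parameters plus (n/2) log2(RSS/n) for the residuals."""
    n = len(x)
    k = degree + 1
    coeffs = np.polyfit(x, y, degree)
    rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
    return 0.5 * k * np.log2(n) + 0.5 * n * np.log2(rss / n)

rng = np.random.default_rng(4)
x = np.linspace(-1, 1, 80)
y = x**2 + rng.normal(scale=0.05, size=x.size)
# The description length should be minimized near the true degree, 2.
print([round(two_part_dl(x, y, d), 1) for d in range(1, 6)])
```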
Lecture slides and exercise problems are the necessary and sufficient material for the examination. Additional course material may be helpful, but the examination will be based on slides and exercises.
Include your name, student number and email address in the assignment report. Follow the home assignment instructions on how to return the report (answers to exercises, deadlines, file formats, etc.). You may also do the assignment in pairs; in this case the pair returns a single report.
The home assignment is graded pass/fail.
|Home assignment instructions|
|Data sets||Exercise 1||Exercise 2|
All lectures in one file:
|Lecture slides (8 slides per page, PDF)||Lecture slides (8 slides per page, PS)|
|Lecture slides (4 slides per page, PDF)|
|Lecture slides (2 slides per page, PDF)|
Individual lectures (PostScript, 8 slides per page):
|1.11.||Lecture 1||Lecture 2|
|6.11.||Lecture 3||Lecture 4|
|13.11.||Lecture 6||Lecture 7|
|20.11.||Lecture 9||Lecture 10|
|27.11.||Lecture 12||Lecture 13|
|04.12.||Lecture 14||Lecture 15|
|11.12.||Lecture 16||Lecture 17|
|8.11.||Exercise 1 (PDF)||Solutions 1 (PDF)|
|15.11.||Exercise 2 (PDF)||Solutions 2 (PDF)|
|22.11.||Exercise 3 (PDF)||Solutions 3 (PDF)|
|29.11.||Exercise 4 (PDF)||Solutions 4 (PDF)|
|29.11.||Exercise 5 (PDF)||Solutions 5 (PDF)|
|13.12.||Exercise 6 (PDF)||Solutions 6 (PDF)|
A book on information theory by David MacKay (free for electronic use) can be found at http://www.inference.phy.cam.ac.uk/mackay/itila/book.html . For the purposes of this course, MacKay's book contains useful material on information theory, coding and Bayesian methods.
Lecture notes from W.D. Penny's signal processing course can be found at http://www.fil.ion.ucl.ac.uk/~wpenny/course/course.html . Chapters 1, 2.1-2.5, 3 (excluding 3.3), 4 and 10.1-10.3 are also useful for this course.
An introduction to modern MDL theory can be found in the book Minimum Description Length - Theory and Applications by Peter Grünwald. The first two introductory chapters are available on his homepage http://homepages.cwi.nl/~pdg/publicationpage.html under the title A Tutorial Introduction to the Minimum Description Length Principle.
More detailed material and background information; not required for the examination.
Basics of probability and statistics:
Minimum Description Length principle:
Questions and comments:
M.Sc. Janne Ojanen
Phone: (09) 451 4837