Under "Lectures" you find the material actually discussed, or planned, for a given week.
Under "Exercises" you find recommended exercises judged relevant to the material discussed or planned for that week. In principle I am willing to comment on any exercise dealt with by the participants. Please send me either ps- or pdf-files.
The plan is tentative and will be adjusted on a running basis. Always consult the lecture notes, beginning here. Well, as I have told you, due to extra duties I cannot write lecture notes as detailed and complete as I had wanted to, but will to a great extent have to point to various sources where you can find material for reading.
|Week (date of lecture)||Lectures||Relevant exercises|
|August 31|| I am away and the course will be opened
by a lecture by Peter Harremoes, which is intended as an "invitation to
information theory". The lecture will not be technical but rather
paint a broad picture by indicating types of problems dealt with and
typical types of applications. It is to be expected that some of the subjects
pointed to will only be discussed later in a rudimentary way, or even not at all. Thus some of the topics indicated may point to possible projects ("fagprojekter") or to items taken up in a possible later course.
* Peter commented on work of Shannon (2nd main theorem and ...), on source coding, cryptography, rate distortion, multi-user problems and thermodynamic equilibrium (role of temperature, free energy, ...). Quantities emphasized: entropy, divergence, mutual information.
|none - or have a look at exercises for the coming weeks|
|Sept. 7||Contrary to what was announced previously, I have decided to start
the formal lectures by discussing a quite general setting for certain
games where "the observer" (statistician, physicist, investment
planner, or whatever the case may be) fights against another player,
conceived as "nature", "GOD", "the system" or some similar personification.
This raises interesting conceptual and philosophical problems and, at the same time, points to some of the major applications we wish to take up later. I will write notes for this part of the course (possibly available only after the lecture).
* Plan followed quite closely. Notes now available (partial). This ms., up to (i) of Theorem 1, covers the main technical material from the lecture (in the lecture notes it is written up at a more leisurely pace). This ms. may also be helpful.
|At the end of the lecture I presented, rather informally, some exercises. Formal versions are now (28/9) included in the lecture notes.|
|Sept. 14||In the first hour, the game-theoretic approach was discussed further and the main result for absolute games proved (not yet in lecture notes). The second hour discussed codes and the descriptive approach to the entropy function (not yet in lecture notes). Kraft's inequality was proved. This material is also covered in my book "Informationsteori".||see above|
|Sept. 21||We considered optimal binary codes and developed the Huffman algorithm (a small illustrative sketch is appended below the table). Further, key preparations for the final derivation of the formula for entropy were made (e.g. the log-sum inequality). See e.g. the first two sections here (ps) or the same (pdf)||For this and the following few weeks the 20 exercises on codes to be found in Section 1 of this set of exercises (ps), or the same (pdf), are relevant. To start with you may look at Exercises 1+2+4. Some exercises are more exciting (and demanding). E.g. Exercise 3 is a theorem of McMillan, Ex. 12 shows how Fibonacci numbers may enter, Ex. 13+14 deal with Elias-Levenshtein codes for the natural numbers and Ex. 16 is related to the celebrated Huffman algorithm.|
|Sept. 28||The derivation of the first main theorem of information theory completed. Entropy and divergence defined. Topological notions re convergence of distributions studied; e.g. Section 3 of the "toolkit" notes covered, as well as a discussion of Kisisynski's theorem (I will not have notes on that, but if you are interested, ask me - well, thinking about it, I could write an exercise about it ... later).||no need right now to point to more exercises|
|Oct. 5||Data reduction and Pinsker's inequality discussed (Section 4 in "toolkit"). Also considered an example of strange behaviour of divergence (finite from P to Q and from Q to R, but infinite from P to R, cf. Exercise 29). Note re Pinsker's inequality: the hint in "toolkit", p. 13, just before Section 5, is too complicated; it is enough to differentiate once and show non-positivity of the derivative in the range [0,p] (a sketch of this argument is appended below the table).||Exercises 21, 22, 24, 28, 29 and 32 are recommended (for evaluation purposes it is enough that you do 1-2 of these).|
|Oct. 19||After the potato holiday (the autumn break) we resumed the lectures. Focus on the main information-theoretic quantities. We covered up to Theorem 6.2 in the toolkit ms., but did not go into the technical details discussed in Section 5. Apart from the basic text, I commented on some new results on the so-called Jensen-Shannon divergence. This material I shall probably not discuss in any detail, but interested persons may find more in this ms.||no need to point to new exercises yet|
|Oct. 26||Continued with the key concepts of information theory, following the toolkit ms., leaving aside the rest of Section 6. Most of Sections 7+8 covered. NOTE regarding mutual information: the standard notation is now I(X;Y), which I ask you to use in exercises etc. Started on Shannon's result which says, loosely speaking, that secure encryption requires a key which is at least as complex as the message you want to communicate! Followed this ms., Sections 2+3 (my 2002 inf.th. lecture notes).||no need to point to new exercises yet|
|Nov. 2||Left-overs from last week were taken up. Continued with the toolkit ms., Section 9.||New exercises only after next week, I think! Regarding old exercises, I particularly recommend nos. 21, 22 and 24. Please hand in "soon"!|
|Nov. 9||Proved the Kuhn-Tucker theorem regarding the capacity of simple channels, following my 2002 lecture notes, Section 7 (the condition is restated below the table for reference). Went through simple examples. Then turned to the material in "Block Symmetry in Discrete Memoryless Channels"; in particular, introduced benefit for channels and proved the K-T theorem in this more general form (Theorem 1) (or rather indicated why it is essentially trivial, applying the same technique), and discussed and proved Theorem 2, part (i). Further talked about the material in Section IV of the "block symmetry" manuscript. The manuscript is written in a rather condensed style.||Actually, I may post some new exercises related to channels and determination of capacity before next Tuesday.|
|Nov. 16||I talked briefly about the new exercises, especially those related to the "capacity/redundancy theorem". Then turned to the maximum entropy principle, where I followed this ms. rather closely: "Maximum Entropy Fundamentals", Sections 1, 3 and 4. The theorems in Section 4 are of key importance for the applications.||New exercises, see here|
|Nov. 23||I continued with the manuscript "Maximum Entropy Fundamentals" and covered Sections 2 and 5 (those parts we do not already know) and Theorem 6.1, except the proof that (ii) implies (i). I also spent some time explaining topological details (lsc, compactness of the set of generalized codes etc.).||New exercises updated and some added|
|Nov. 30||I continued with the "MaxEnt Fundamentals" manuscript and covered the remaining parts of Section 6, but only up to Corollary 6.6 (in case you look into that anyhow, the statement in the final part of Cor. 6.6 should only claim convergence in total variation). The constant 1/8 on page 208 should be 1/16. In the 4th line of the proof of Cor. 6.4, P should be P^*. The main proofs can be improved a bit, e.g. by noting the main properties of the space of codes early on (e.g. adding that the set of compact codes is the set of extremal points). I started with a few comments on Section 7.||Perhaps MaxEnt-type exercises will be added to the new exercises.|
|Dec. 7||I continued with the maximum entropy material, but covered most of it (hyperbolic distributions, Zipf's law) only by stating key results. Started on the IC/H-diagram, aiming at a discussion of some models of prediction and universal coding. See details in this ms. (misprints: p. 3, line 8, -ln x should be -x ln x; on p. 4, line 6, the denominator should be (1-x)(k-x); and finally, the figure on pp. 5-6 is missing - but it ought to be pretty clear what is intended!).||New exercise, no. 7, added (had difficulties copying to the right file, hope I succeeded!).|
|Dec. 14||I started by going through the Kuhn-Tucker criterion as in this ms. I then went on to discuss models defined via a partial ordering and continued with a discussion of Bernoulli models, following the ms. Time was too short to go through the lemma of replacement and its application to determining the form of the IC/H-diagram. The form of the diagram was taken for granted in the discussion of optimal universal prediction.||I added a new exercise.|
|Dec. 21||Sebastian Bender will give a seminar on optimization problems in models used in geography. Previously, maximum entropy methods were often used; nowadays other methods are also, or even predominantly, in use.||-|
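As a small illustration of the Huffman construction from the Sept. 21 lecture, here is a minimal sketch in Python. It is my own toy code, not part of the course material; the example string is arbitrary and the function name huffman_code is just a label chosen for the illustration.

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a binary prefix code from a dict {symbol: frequency}.

    Classic Huffman construction: repeatedly merge the two least
    frequent subtrees; the resulting code lengths are optimal among
    prefix-free codes and satisfy Kraft's inequality with equality.
    """
    if len(freqs) == 1:                      # degenerate case: one symbol
        return {s: "0" for s in freqs}
    # heap entries: (total frequency, tie-breaker, {symbol: codeword so far})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)      # least frequent subtree
        f2, _, c2 = heapq.heappop(heap)      # second least frequent subtree
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

if __name__ == "__main__":
    code = huffman_code(Counter("information theory"))
    for sym, word in sorted(code.items(), key=lambda kv: (len(kv[1]), kv[0])):
        print(repr(sym), word)
```

As expected, frequent symbols receive short codewords and rare symbols longer ones.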
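As a supplement to the Oct. 5 remark on Pinsker's inequality, here is a sketch of the one-derivative argument in my own notation (binary case, natural logarithms; the constants in the toolkit may differ, so read this as an illustration of the hint rather than as the official proof).

```latex
Binary case, natural logarithms: for fixed $p$ and $q \in (0,p]$ put
\[
  g(q) = p\ln\frac{p}{q} + (1-p)\ln\frac{1-p}{1-q} - 2(p-q)^2 .
\]
Then $g(p) = 0$ and
\[
  g'(q) = -\frac{p}{q} + \frac{1-p}{1-q} + 4(p-q)
        = (p-q)\Bigl(4 - \frac{1}{q(1-q)}\Bigr) \le 0
  \quad \text{on } (0,p],
\]
since $q(1-q) \le \tfrac14$. Hence $g$ is non-increasing on $(0,p]$, so
$g(q) \ge g(p) = 0$, i.e.\ $d(p\,\|\,q) \ge 2(p-q)^2$; a single
differentiation suffices.
```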
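For reference, the Kuhn-Tucker condition for channel capacity from the Nov. 9 lecture, in its standard simple-channel form; the notation below is mine and need not match the 2002 lecture notes or the block-symmetry manuscript exactly.

```latex
Let $W(y|x)$ be a discrete memoryless channel and, for an input
distribution $P$, let $(PW)(y) = \sum_x P(x)\,W(y|x)$ denote the induced
output distribution. Then $P^*$ achieves the capacity $C = \max_P I(P;W)$
if and only if
\[
  D\bigl(W(\cdot\,|\,x) \,\|\, P^*W\bigr) = C \quad \text{whenever } P^*(x) > 0,
  \qquad
  D\bigl(W(\cdot\,|\,x) \,\|\, P^*W\bigr) \le C \quad \text{whenever } P^*(x) = 0 .
\]
```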