
TSKS12 Modern Channel Coding, Inference and Learning

Course plan for period HT1, 2017

1. Exposition

The course consists of lectures, exercise sessions and laboratory sessions. The theory is presented in the lectures. Some tutorial examples are considered at the exercise sessions. Two computer-aided laboratory sessions provide an opportunity to implement and understand some of the algorithms described in the lectures.

2. Examination

The examination is a written exam. Date and time for the coming exams can be found here.

The exam consists of five problems worth 3 points each. In order to pass the exam, at least 7 of the 15 points are required. The only aids allowed at the exam are paper, pen and eraser.

If the number of students registered for the examination is low (fewer than 3), the written exam may be replaced by an oral one.

3. Course leaders

The lectures are given by Danyo Danev, senior lecturer, tel.: 013-28 13 35, e-mail: danyo.danev@liu.se.

Exercise sessions and laboratory sessions are supervised by Prabhu Chandhar, tel.: 013-28 13 13, e-mail: prabhu.c@liu.se.

4. Course literature

The course is based on the book
[1]   David J.C. MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003. ISBN-13: 9780521642989, ISBN-10: 0521642981.
Complementary literature
[2]   Shu Lin and Daniel J. Costello, Jr., Error Control Coding, Pearson/Prentice Hall, 2004.
[3]   Stephen Wicker, Error Control Systems for Digital Communication and Storage, Prentice Hall, 1995.
[4]   Peter Sweeney, Error Control Coding: From Theory to Practice, Wiley, 2002.
[5]   Salvatore Gravano, Introduction to Error Control Codes, Oxford University Press, 2001.
[6]   David Barber, Bayesian Reasoning and Machine Learning, Cambridge University Press, 2012.

5. Course plan

(chapter references are to MacKay's book [1])

L1

Information Transmission, Probability, Entropy, Mutual Information -- Ch 1 (1), Ch 2 (1,2,4,5,7), Ch 8 (1,2), Ch 9 (1-5).

L2

The Noisy Channel Coding Theorem -- Ch 8 (3), Ch 9 (6-7), Ch 10 (1-5).

L3

Computing Channel Capacity, The Gaussian Channel -- Ch 10 (6-8), Ch 11 (1,3-4).

L4

Practical Channel Coding, Repeat-Accumulate Codes -- Ch 11 (5-8), Ch 49.

L5

Digital Fountain Codes, Clustering -- Ch 50, Ch 20.

L6

Maximum Likelihood, Exact Marginalization -- Ch 21, Ch 22, Ch 24.

L7

Exact Marginalization in Trellises - the Viterbi and BCJR Algorithms -- Ch 25.

L8

Exact Marginalization in Graphs, LDPC Codes - the Sum-Product Algorithm -- Ch 26, Ch 47.

L9

"Turbo" Codes, Monte Carlo Methods -- Ch 48, Ch 29.

L10

Supervised Learning, Capacity of a Neuron -- Ch 39, Ch 40, Ch 41.

6. Tutorial sessions' plan

(chapter references are to MacKay's book [1])

TS1

Ch 2,8,9

2.27, 2.29, 2.36, 2.37, 8.1, 8.6, 9.7, 9.12, 9.17, 9.19, 9.21

TS2

Ch 10,11,50

10.2, 10.3, 10.7, 10.10, 11.1, 11.2, 11.4, 11.5, 50.2

TS3

Ch 20,21,22,25

20.2, 20.3, 21.1, 21.2, 22.8, 22.9, 22.10, 22.14, 25.1, 25.4

TS4

Ch 25,26

25.2, 25.3, 25.5, 25.7, 25.9, 26.1, 26.2

TS5

Ch 47,48

TS6

Ch 29, 39

29.1, 29.3, 29.4, 29.5, 29.7, 29.10, 29.12, 29.14, 39.2, 39.3, 39.5, 39.6

7. Laboratory sessions

Detailed information about the tasks for each of the two laboratory sessions can be downloaded via the following links.

LS1

September 28th, 13-17, at EGYP

Description Lab1

LS2

October 12th, 17-21, at SOUT

Description Lab2


Page responsible: Mikael Olofsson
Last updated: 2017-08-15, 15:59