EE 381K Information Theory - Fall 2022
- Meets TuTh 9:30-11, ECJ 1.312
- Unique No: 16700
Instructor
- Gustavo de Veciana
- Office: EER 6.874
- Office Hours: TuWTh 11-12
- Email: gustavo@ece.utexas.edu
- WWW: http://ece.utexas.edu/~gustavo
Description
This course is intended for graduate students with a
background in probability and ideally some background/interest in
communications, systems, signal processing and/or data sciences.
Course Topics (Tentative list)
- Entropy, Relative Entropy and Mutual Information:
Definitions, chain rules for entropy, Jensen's inequality and its implications, the Data Processing Inequality, thermodynamics, Fano's inequality.
- The Asymptotic Equipartition Property:
The AEP, consequences of the AEP for data compression, high-probability sets and the typical set.
- Entropy Rates of Stochastic Processes:
Markov chains, entropy rate, Hidden Markov Models.
- Data Compression:
Codes, the Kraft inequality, Huffman codes, arithmetic codes.
- Gambling and Data Compression:
Gambling with side information, dependent horse races, data compression and gambling, the entropy of English.
- Kolmogorov Complexity:
Models of computation, Kolmogorov complexity, the halting problem and the non-computability of Kolmogorov complexity, Occam's razor.
- Channel Capacity:
Symmetric channels, jointly typical sequences, the Channel Coding Theorem, the joint source-channel coding theorem.
- Differential Entropy:
AEP for continuous random variables, differential entropy vs. discrete entropy, joint and conditional differential entropy, relative entropy and mutual information.
- The Gaussian Channel:
The Gaussian channel, parallel Gaussian channels, channels with colored Gaussian noise, Gaussian channels with feedback.
- Maximum Entropy and Spectral Estimation:
Maximum entropy distributions, spectrum estimation, Burg's maximum entropy theorem.
- Information Theory and Statistics:
The method of types, the law of large numbers, large deviations theory, Sanov's theorem, hypothesis testing, Lempel-Ziv coding.
- Rate Distortion Theory:
Quantization, the rate distortion function, the Rate Distortion Theorem.
- Network Information Theory:
The multiple access channel, the broadcast channel, source coding with side information, rate distortion with side information.
- Assorted topics we would like to touch on (perhaps through course projects):
  - Machine Learning and Information Theory
  - Quantum Information Theory
  - Image and video compression and quality assessment
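To give a concrete flavor of the first topic above, here is a minimal Python sketch of entropy and mutual information computed from a small joint distribution. It is illustrative only and not part of the course materials; the example distribution and function names are my own.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector (with 0 log 0 := 0)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Joint distribution of (X, Y): a fair bit X sent over a noiseless wire, Y = X
joint = {(0, 0): 0.5, (1, 1): 0.5}

# Marginals of X and Y obtained by summing out the other variable
px = [sum(v for (x, _), v in joint.items() if x == b) for b in (0, 1)]
py = [sum(v for (_, y), v in joint.items() if y == b) for b in (0, 1)]

h_x = entropy(px)                     # H(X) = 1 bit
h_y = entropy(py)                     # H(Y) = 1 bit
h_xy = entropy(list(joint.values())) # H(X,Y) = 1 bit
mi = h_x + h_y - h_xy                # I(X;Y) = H(X) + H(Y) - H(X,Y) = 1 bit
print(h_x, h_y, h_xy, mi)
```

Since Y is a deterministic copy of X here, observing Y resolves all one bit of uncertainty about X, so the mutual information equals H(X).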
Prerequisites
A prerequisite for this course is a graduate course in Probability and Stochastic Processes, such as EE381J.
Required text
Elements of Information Theory, 2nd ed., by Cover and Thomas, Wiley, 2006.
Format/Evaluation
Homework will be assigned weekly and will be due at the beginning of
the last class in the following week. It will be graded on a {-, ok, +}
basis, you will get solutions, and it will be worth a total of 30
pts. You may work in groups of 2-3; if so, please turn in only one homework paper with
the group members' names. There will be two in-class midterms worth 25 pts each. There is no final exam,
but there will be a (20 min) presentation worth 20 pts on a topic of your choice (subject
to my approval) - attendance is mandatory.
Final Exam:
There is no final; however, you will be required to select a topic (possibly a
research topic) and make a presentation to the class. Your time slot may be prior to or during
the scheduled final exam period - participation during your colleagues' presentations is mandatory.
Where does this course fit in?
Prior to taking this course, you might consider taking Probability
and Stochastic Processes I, which is a foundational graduate course, and
perhaps Digital Communications.
Some related courses are Wireless Communications, Digital Signal
Processing, and Advanced Signal Processing.
Note: All departmental, college and university regulations
concerning drops will be followed. The University of Texas at Austin provides
upon request appropriate academic adjustments for qualified students with
disabilities. For more information, contact the Office of the Dean of Students
at 471-6259, 471-4241 TDD.