MATH - MTHE 474/874 - Information Theory
Office: Jeffery Hall, Room 402
Slot 21: Monday 11:30, Tuesday 1:30, Thursday 12:30 -- Jeffery 234.
- The homework assignments will be posted on this website.
There will be a total of five problem sets.
Solutions to the assignments will be available on reserve
at the Circulation Desk of Stauffer Library.
Office Hours: Thursday 1:30 - 2:30, or by appointment.
Textbook and Notes
T.M. Cover and J.A. Thomas, Elements of Information Theory, Second Edition, John Wiley, 2006.
R. Gallager, Information Theory and Reliable Communication,
John Wiley, 1968.
R. Blahut, Principles and Practice of Information Theory,
Addison Wesley, 1987.
R. Yeung, Information Theory and Network Coding,
Springer, 2008.
Midterm Exam: 30%
Final Exam: 60%
Note: Undergraduate students enrolled
in MATH-474 will receive
a reduced workload on the homework assignments and exams.
The Midterm Exam is
scheduled for Thursday, November 9, 2017.
All assignments and exams in this course will receive numerical marks. The final
grade students receive for the course will be derived by converting their numerical
course average to a letter grade according to the
Queen's Official Grade Conversion Scale.
Policy for Missing Exams:
There will be no
makeup exams. If a student misses the midterm
due to severe illness or a personal tragedy, then
the final exam will count for 90% of the course grade.
Students with Special Needs:
Students with diverse learning styles and needs are welcome at Queen's. In
particular, if you have a disability or health consideration that may require
accommodations, please feel free to approach me and/or
Queen's Student Accessibility Services.
Academic integrity is constituted by the five core values of honesty,
trust, fairness, respect and responsibility.
These values are central to the building, nurturing and sustaining of an academic
community in which all members of the community will thrive. Adherence to the values
expressed through academic integrity forms a foundation for the "freedom of inquiry
and exchange of ideas" essential to the intellectual life of the University.
Students are responsible for familiarizing themselves with the regulations concerning
academic integrity and for ensuring that their assignments conform to the principles
of academic integrity. Information on academic integrity is available on these websites:
Arts and Science
Engineering and Applied Science.
Departures from academic integrity include plagiarism, use of unauthorized materials,
facilitation, forgery and falsification, and are antithetical to the development of an
academic community at Queen's. Given the seriousness of these matters, actions which
contravene the regulation on academic integrity carry sanctions that can range from
a warning or the loss of grades on an assignment to the failure of a course to
a requirement to withdraw from the university.
The reliable transmission of information-bearing signals over a noisy
communication channel is at the heart of what we call communication.
Information theory -- founded by
Claude E. Shannon in 1948 --
provides a mathematical framework for the theory
of communication; it describes the fundamental
limits to how efficiently one can encode
information and still be able to recover it with negligible loss.
This course will examine the basic concepts of this theory.
What follows is a list of topics to be covered.
- Shannon's Measures of Information: entropy, divergence,
mutual information; properties of information measures;
the data processing theorem; Fano's inequality.
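As a small illustration of these information measures (not part of the course materials; function names are ours), a Python sketch computing the entropy of a probability vector and the mutual information of a joint distribution via I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math

def entropy(p):
    """Shannon entropy H in bits of a probability vector p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf given as a 2-D list."""
    px = [sum(row) for row in joint]                    # marginal of X
    py = [sum(col) for col in zip(*joint)]              # marginal of Y
    hxy = entropy([p for row in joint for p in row])    # joint entropy
    return entropy(px) + entropy(py) - hxy

# A fair coin carries exactly one bit of entropy.
print(entropy([0.5, 0.5]))          # 1.0

# Independent X and Y carry zero mutual information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```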
- Fundamentals of Fixed-Length Lossless Source Coding:
discrete memoryless sources,
asymptotic equipartition property (AEP),
block or fixed-length coding,
fixed-length source coding theorem for discrete memoryless sources;
entropy rate of stationary sources with memory, Markov sources,
stationary ergodic sources, fixed-length source coding theorem for
stationary ergodic sources; source modeling and redundancy.
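The AEP listed above can be seen numerically: for a long i.i.d. sequence, the normalized log-probability -(1/n) log2 P(x^n) concentrates near the source entropy. A hedged Python sketch for a Bernoulli source (the function is ours, for illustration only):

```python
import math
import random

def sample_normalized_log_prob(p, n, rng):
    """Draw x^n i.i.d. Bernoulli(p) and return -(1/n) log2 P(x^n)."""
    ones = sum(rng.random() < p for _ in range(n))
    logp = ones * math.log2(p) + (n - ones) * math.log2(1 - p)
    return -logp / n

def h2(p):
    """Binary entropy function in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

rng = random.Random(0)
p, n = 0.3, 100_000
estimate = sample_normalized_log_prob(p, n, rng)
# By the AEP, the estimate should be close to h2(p) for large n.
print(estimate, h2(p))
```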
- Fundamentals of Variable-Length Lossless Source Coding:
variable-length encoding, unique decodability, Kraft inequality,
prefix codes, variable-length source coding theorem
for discrete memoryless sources and for stationary sources
with memory; Shannon-Fano code; design and construction of optimal variable-length
codes: Huffman codes.
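To make the Huffman construction and the Kraft inequality concrete, a small Python sketch (ours, not course material) that computes optimal codeword lengths by repeatedly merging the two least-probable subtrees, then checks that the lengths satisfy Kraft with equality:

```python
import heapq

def huffman_lengths(probs):
    """Codeword lengths of an optimal binary (Huffman) prefix code."""
    # Heap entries: (probability, tiebreak id, symbol indices in this subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tie = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # each merge deepens these leaves by one
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, tie, s1 + s2))
        tie += 1
    return lengths

L = huffman_lengths([0.4, 0.3, 0.2, 0.1])
print(L)                            # [1, 2, 3, 3]
# Kraft inequality holds with equality for a complete prefix code:
print(sum(2 ** -l for l in L))      # 1.0
```

The average length here is 1.9 bits/symbol, within one bit of the source entropy (about 1.85 bits), as the variable-length source coding theorem guarantees.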
- Fundamentals of Channel Coding:
discrete memoryless channels,
channel capacity and properties; noisy channel coding theorem
for discrete memoryless channels;
the lossless joint source-channel coding theorem and Shannon's separation principle.
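One closed-form capacity worked out in the course of this material is the binary symmetric channel, C = 1 - h(eps) bits/use, where h is the binary entropy function and eps the crossover probability. A minimal Python sketch (names are ours):

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(eps):
    """Capacity C = 1 - h(eps) of a binary symmetric channel."""
    return 1.0 - h2(eps)

print(bsc_capacity(0.0))    # 1.0: noiseless channel, one bit per use
print(bsc_capacity(0.5))    # 0.0: output independent of input, useless
print(bsc_capacity(0.11))   # about 0.5
```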
- Information Theory for Continuous Alphabet Systems:
differential entropy, divergence and mutual information;
differential entropy of the
multivariate Gaussian distribution; AEP for continuous alphabet
memoryless sources, capacity
of discrete-time and band-limited continuous-time memoryless Gaussian channels;
parallel Gaussian channels and waterfilling.
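Waterfilling over parallel Gaussian channels can be sketched numerically: pour total power onto the subchannels so each receives max(0, mu - N_k), with the water level mu chosen so the powers sum to the budget. A hedged Python sketch using bisection (ours, for illustration):

```python
def waterfill(noise, power, tol=1e-12):
    """Waterfilling power allocation over parallel Gaussian channels.

    noise: list of noise variances N_k.  Returns P_k = max(0, mu - N_k)
    with sum(P_k) = power, the water level mu found by bisection.
    """
    lo, hi = min(noise), max(noise) + power
    while hi - lo > tol:
        mu = (lo + hi) / 2
        used = sum(max(0.0, mu - n) for n in noise)
        if used > power:
            hi = mu      # water level too high; lower it
        else:
            lo = mu
    mu = (lo + hi) / 2
    return [max(0.0, mu - n) for n in noise]

# Two subchannels with noise variances 1 and 3 and total power 4:
# the water level is mu = 4, so the allocation is [3, 1] -- the quieter
# channel gets more power.
print(waterfill([1.0, 3.0], 4.0))
```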
- Rate-Distortion Theory:
(time permitting) lossy data compression;
discrete memoryless sources,
rate-distortion function and its properties; rate-distortion theorem;
the lossy joint source-channel coding theorem and the Shannon limit of communication systems.
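The binary case gives rate-distortion theory a closed form: for a Bernoulli(p) source under Hamming distortion, R(D) = h(p) - h(D) for 0 <= D <= min(p, 1-p), and zero beyond. A small Python sketch (helper names are ours):

```python
import math

def h2(x):
    """Binary entropy function in bits."""
    return 0.0 if x in (0.0, 1.0) else -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def rate_distortion_binary(p, d):
    """R(D) = h(p) - h(D) for a Bernoulli(p) source, Hamming distortion.

    Valid for 0 <= D <= min(p, 1-p); R(D) = 0 for larger distortion.
    """
    return max(0.0, h2(p) - h2(d)) if d < min(p, 1 - p) else 0.0

print(rate_distortion_binary(0.5, 0.0))   # 1.0: lossless needs H(X) bits
print(rate_distortion_binary(0.5, 0.5))   # 0.0: no bits needed at max distortion
```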
Claude E. Shannon
IEEE Information Theory Society
Canadian Society of Information Theory (CSIT)
How We Know by Freeman Dyson,
published in The New York Review of Books, March 10, 2011