Fundamentals of Information Processing

What is information? In 1948, Claude E. Shannon established a theory that revolutionized the design of communication systems: all communication and storage systems, such as radio, telephone, satellite, and hard disk drives, can be unified through a simple mathematical model. There is a “common commodity” associated with any communication system, called “information,” which can be represented in binary form as a sequence of bits. In this course, we walk students through this groundbreaking discovery that opened the age of information and study 1) how to remove redundancy and represent information in the most efficient manner (compress information); 2) how to add redundancy and convey information reliably through a noisy medium (transmit information); and 3) how Shannon’s way of thinking can influence the new era of artificial intelligence (learn information).

3 credits

Tentative Outline

Information compression

  • Huffman codes
  • Shannon-Fano codes
  • Tunstall codes
  • Entropy of source
  • Lempel-Ziv codes
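
To give a flavor of the first unit, here is a minimal sketch of Huffman coding in Python. The function name and the merging-dictionaries representation are illustrative choices, not course material; a production coder would build an explicit tree.

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code (symbol -> bit string) from symbol frequencies."""
    freq = Counter(text)
    # Each heap entry: (weight, tiebreak, {symbol: codeword-so-far}).
    # The unique tiebreak integer prevents Python from comparing dicts.
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        # Repeatedly merge the two lowest-weight subtrees,
        # prepending "0" to one side's codewords and "1" to the other's.
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

code = huffman_code("abracadabra")
# More frequent symbols get shorter codewords, and the code is prefix-free.
```

For "abracadabra", the most frequent symbol 'a' (5 of 11 occurrences) receives a one-bit codeword, while the rare symbols 'c' and 'd' receive three-bit codewords.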

Information transmission

  • Channel capacity
  • Mutual information
  • Binary symmetric channel
  • Gaussian channel
  • Hamming codes
  • Turbo codes
  • LDPC codes
  • Polar codes
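
As a taste of the second unit, the capacity of the binary symmetric channel has the closed form C = 1 − H(p), where H is the binary entropy function and p is the crossover probability. A short sketch (function names are illustrative):

```python
from math import log2

def binary_entropy(p):
    """H(p) = -p log2 p - (1-p) log2 (1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

bsc_capacity(0.0)   # noiseless channel: 1 bit per channel use
bsc_capacity(0.5)   # pure noise: 0 bits per channel use
bsc_capacity(0.11)  # roughly 0.5 bits per channel use
```

Note the symmetry C(p) = C(1 − p): a channel that flips every bit is as useful as a noiseless one, since the receiver can simply invert its output.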

Information theory and machine learning

  • Maximum entropy principle
  • Maximum likelihood principle
  • Empirical risk minimization principle
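
As a preview of the third unit, maximum likelihood estimation for a Bernoulli source recovers the empirical mean. The grid search below is only an illustrative check (the sample data and function names are made up for this sketch); in the course the result is derived analytically.

```python
from math import log

def bernoulli_log_likelihood(theta, samples):
    """Log-likelihood of i.i.d. Bernoulli(theta) samples (0s and 1s)."""
    return sum(log(theta) if x else log(1 - theta) for x in samples)

samples = [1, 0, 1, 1, 0, 1, 1, 1]  # 6 ones out of 8 samples
# The MLE is the empirical mean; a grid search over theta confirms it.
grid = [i / 100 for i in range(1, 100)]
theta_hat = max(grid, key=lambda t: bernoulli_log_likelihood(t, samples))
# theta_hat equals 0.75, the sample mean
```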


Prerequisites

Working knowledge of calculus, linear algebra, and probability.

Assessment Scheme

  • Weekly homework assignment: Homework will normally be assigned each Friday and due the following Friday in class.  Late homework will not be accepted.  If you use materials other than the textbooks and lecture notes – this applies to discussions with classmates and searching the Internet – the source must be clearly cited; otherwise, it is considered cheating.
  • Midterm exam: In-class midterm; open book open notes.
  • Final exam: In-class final; open book open notes.
  • Reading assignment: Required for graduate students and optional for undergraduate students.  Each student will be assigned one paper for thorough reading, comprehensive understanding, and critical thinking.  Creative ideas for future work on the problem in the paper are encouraged, but not required.  In the last week, each student gives a 15-minute presentation in class and submits a report of at most three pages (in IEEE double-column style) synthesizing their thoughts on the paper.  Evaluation criteria include the clarity, accuracy, precision, conciseness, depth, and logic of the presentation and the report.


References

There is no required textbook. Here are some recommended references.

  • Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, 2nd Edition, Wiley, 2006
  • Solomon W. Golomb, Robert E. Peile, and Robert A. Scholtz, Basic Concepts in Information Theory and Coding: The Adventures of Secret Agent 00111, Plenum Press, 1994


Grading

  • Homework 30%
  • Midterm 30%
  • Final 40%
  • Reading (bonus) 5%

More Information

UBC Course Page