UMBC Center for Information Security and Assurance


Information Theory - ENEE 622

Course Description

Shannon's information measures: entropy, differential entropy, information divergence, mutual information, and their basic properties. Entropy rates; the asymptotic equipartition property; weak, strong, and joint typicality. Shannon's source coding theorem and its converse; prefix-free and uniquely decodable source codes; Huffman and Shannon codes; universal source coding. Source coding with a fidelity criterion: the rate-distortion function and its achievability. Channel capacity and its computation; Shannon's channel coding theorem; the strong coding theorem and error exponents; Fano's inequality and the converse to the coding theorem; feedback capacity; joint source-channel coding. Discrete-time additive Gaussian channels, the covering lemma, continuous-time additive Gaussian channels, parallel additive Gaussian channels, and waterfilling. Additional topics: narrow-band time-varying channels, fading channels, side information, wideband channels, network coding, and information theory in relation to statistics and geometry.
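As a small taste of the first measures listed above, the sketch below computes the entropy of a discrete distribution and the mutual information of a joint distribution (in bits, i.e. base-2 logarithms); the function names are illustrative, not part of any course material.

```python
from math import log2

def entropy(p):
    """Shannon entropy H(X) in bits of a discrete pmf given as a list."""
    # Terms with probability 0 contribute nothing (0 log 0 := 0).
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), from a joint pmf as a 2-D list."""
    px = [sum(row) for row in joint]            # marginal of X
    py = [sum(col) for col in zip(*joint)]      # marginal of Y
    hxy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - hxy

# A fair coin carries exactly one bit of entropy.
print(entropy([0.5, 0.5]))                               # 1.0
# Independent uniform bits share no information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

Note the convention that `0 log 0 = 0`, which the generator expression enforces by skipping zero-probability outcomes.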

Prerequisite: a strong grasp of basic probability theory.

Previous Offerings

  • Spring 2012 (La Berge)
  • Spring 2011 (Morris)
  • Spring 2010 (Morris) (Syllabus)
  • Spring 2009 (Chang)
  • Fall 2007 (Chang)
  • Spring 2007 (Chang)
  • Spring 2006 (La Berge)
  • Spring 2005 (Chang)
  • Spring 2004 (Morris)
  • Spring 2003 (Thomas)