PHYS 4058 - Information Physics

Course Description

This course explores the connections between information theory and physics. Information theory was developed by Shannon in the 1940s as a tool for optimizing communication systems in telephone networks. But how is the concept of entropy used by communications engineers related to that introduced a century earlier in thermodynamics and statistical mechanics? And what does information theory tell us about the physical limits of computation? Topics studied include communication systems, probability and random variables, discrete information sources, information and entropy, joint and conditional entropy, relative entropy and mutual information, capacity of a noiseless channel, source coding, capacity of a noisy channel, Bayesian probability, maximum entropy and thermodynamics, and Maxwell’s demon.
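
To make the question concrete, here is a minimal Python sketch (illustrative only; the distribution and function name are hypothetical, not course material) that computes the Shannon entropy of a two-state system in bits and converts it to a Gibbs-style entropy; the two differ only by the factor k_B ln 2 that underlies Landauer's bound on the energy cost of erasing a bit.

    import math

    def shannon_entropy_bits(p):
        """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits."""
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    # A biased two-state system, e.g. an unfair coin or a two-level spin.
    p = [0.25, 0.75]
    H = shannon_entropy_bits(p)

    # Gibbs entropy of the same distribution: S = -k_B * sum_i p_i * ln(p_i)
    # = k_B * ln(2) * H, so the two entropies differ only in units.
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    S = k_B * math.log(2) * H

    print(f"H = {H:.4f} bits")   # prints H = 0.8113 bits
    print(f"S = {S:.3e} J/K")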

Teaching Pattern

  • Duration of course: about 13 weeks

  • Lecture hour(s) / tutorial hour(s) per week: 3 / 1

Content

  • Probability theory
  • Entropy in information theory
  • Relative entropy and mutual information
  • Second law of thermodynamics
  • Instantaneous codes and block codes
  • Data compression: Huffman codes; portfolio management (see the Huffman sketch after this list)
  • Introduction to mathematical finance: options and binomial trees (see the pricing sketch after this list)
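
As a sketch of the Huffman construction referenced above (illustrative only; the input string and function name are hypothetical), the following Python code repeatedly merges the two least-frequent subtrees, the greedy step that yields an optimal prefix-free code:

    import heapq
    from collections import Counter

    def huffman_code(freqs):
        """Map each symbol to a bit string via Huffman's greedy merging."""
        # Heap entries: (subtree weight, tiebreaker, [(symbol, code so far), ...]).
        heap = [(w, i, [(sym, "")]) for i, (sym, w) in enumerate(freqs.items())]
        heapq.heapify(heap)
        counter = len(heap)
        while len(heap) > 1:
            w1, _, lo = heapq.heappop(heap)   # lightest subtree
            w2, _, hi = heapq.heappop(heap)   # second-lightest subtree
            # Each merge prepends one more bit: 0 in one subtree, 1 in the other.
            merged = [(s, "0" + c) for s, c in lo] + [(s, "1" + c) for s, c in hi]
            heapq.heappush(heap, (w1 + w2, counter, merged))
            counter += 1
        return dict(heap[0][2])

    text = "abracadabra"
    code = huffman_code(Counter(text))
    encoded = "".join(code[ch] for ch in text)
    print(code)
    print(f"{8 * len(text)} bits as ASCII -> {len(encoded)} bits Huffman-coded")

Frequent symbols sit near the root of the tree and receive short codewords, so the expected codeword length approaches the entropy of the source.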

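A similarly minimal sketch of the binomial tree referenced above (parameters are hypothetical: spot 100, strike 100, 5% rate, 20% volatility, one year) prices a European call by backward induction on a Cox-Ross-Rubinstein recombining tree under the risk-neutral probability:

    import math

    def binomial_call_price(S0, K, r, sigma, T, n):
        """European call price on an n-step Cox-Ross-Rubinstein tree."""
        dt = T / n
        u = math.exp(sigma * math.sqrt(dt))   # up factor per step
        d = 1.0 / u                           # down factor per step
        q = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
        disc = math.exp(-r * dt)              # one-step discount factor
        # Payoffs at maturity; node j has seen j up-moves out of n steps.
        values = [max(S0 * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
        # Backward induction: discounted risk-neutral expectation at each node.
        for _ in range(n):
            values = [disc * (q * values[j + 1] + (1 - q) * values[j])
                      for j in range(len(values) - 1)]
        return values[0]

    print(f"call price = {binomial_call_price(100, 100, 0.05, 0.20, 1.0, 200):.2f}")

As the number of steps grows, the tree price converges to the Black-Scholes value, roughly 10.45 for these parameters.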

Remarks

  • Prerequisite: PHYS 3031 or PHYS 4050

Sample Course Outline