This book presents a succinct and mathematically rigorous treatment of the main pillars of Shannon's information theory, covering its fundamental concepts and indispensable results. It comprises five meticulously written core chapters (with accompanying problems) on the key topics of information measures; lossless and lossy data compression; channel coding; and joint source-channel coding for single-user (point-to-point) communication systems. It also features two appendices covering the necessary background material in real analysis and in probability theory and stochastic processes.
The book is ideal for a one-semester foundational course on information theory for senior undergraduate and entry-level graduate students in mathematics, statistics, engineering, and computing and information sciences. A comprehensive instructor's solutions manual is available.
An Introduction to Single-User Information Theory
Contents: Introduction; Information Measures for Discrete Systems; Lossless Data Compression; Data Transmission and Channel Capacity; Differential Entropy and Gaussian Channels; Lossy Data Compression and Transmission.