Welcome to P K Kelkar Library, Online Public Access Catalogue (OPAC)

Information theory : from coding to learning

By: Contributor(s):
Publication details: Cambridge : Cambridge University Press, 2025
Description: xxiv, 724 p.
ISBN:
  • 9781108832908
Subject(s):
DDC classification:
  • 003.54 P768i
Summary: This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning, equipping students with a uniquely well-rounded and rigorous foundation for further study. It introduces core topics such as data compression, channel coding, and rate-distortion theory using a unique finite block-length approach. With over 210 end-of-part exercises and numerous examples, students are introduced to contemporary applications in statistics, machine learning, and modern communication theory. The textbook presents information-theoretic methods with applications in statistical learning and computer science, such as f-divergences, PAC-Bayes and the variational principle, Kolmogorov's metric entropy, strong data processing inequalities, and entropic upper bounds for statistical estimation. Accompanied by a solutions manual for instructors and additional standalone chapters on more specialized topics in information theory, this is the ideal introductory textbook for senior undergraduate and graduate students in electrical engineering, statistics, and computer science.
List(s) this item appears in: New Arrival Aug 04 to 17, 2025
Holdings
Item type | Current library | Collection | Call number | Status | Date due | Barcode | Item holds
Books | PK Kelkar Library, IIT Kanpur | On Display | 003.54 P768i | Available | | A186941 |
Total holds: 0

• Provides a systematic treatment of information-theoretic techniques in statistical learning and high-dimensional statistics
• Develops information theory for both continuous and discrete variables, providing examples relevant to statistical and machine-learning applications
• Focuses on finite block-length (non-asymptotic) results, equipping students with the information theory knowledge required for modern applications such as 6G and future network design
• Advanced material suitable for skipping on a first reading is clearly indicated, enabling a fast introduction to the fundamental concepts, which can then be enhanced with the additional material on re-reading
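
As an illustrative aside (not drawn from the book itself), the standard textbook definitions of the core quantities behind the topics named in the summary, written here in LaTeX and assuming a discrete memoryless source and channel, are:

  H(X) = -\sum_{x} P_X(x) \log P_X(x)
  I(X;Y) = \sum_{x,y} P_{XY}(x,y) \log \frac{P_{XY}(x,y)}{P_X(x) P_Y(y)}
  C = \max_{P_X} I(X;Y)
  R(D) = \min_{P_{\hat{X}|X} :\, \mathbb{E}[d(X,\hat{X})] \le D} I(X;\hat{X})

Here H is entropy, I is mutual information, C is the capacity of a channel P_{Y|X}, and R(D) is the rate-distortion function at average distortion D; the finite block-length approach mentioned in the summary studies non-asymptotic refinements of the operational meaning of these quantities.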

There are no comments on this title.

