Information Theory : Three Theorems by Claude Shannon (Record no. 34020)

000 - LEADER
fixed length control field a
008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION
fixed length control field 250711b xxu||||| |||| 00| 0 eng d
020 ## - INTERNATIONAL STANDARD BOOK NUMBER
International Standard Book Number 9783031215605
082 ## - DEWEY DECIMAL CLASSIFICATION NUMBER
Classification number 004.0151
Item number CHA
100 ## - MAIN ENTRY--PERSONAL NAME
Personal name Chambert-Loir, Antoine
245 ## - TITLE STATEMENT
Title Information Theory : Three Theorems by Claude Shannon
260 ## - PUBLICATION, DISTRIBUTION, ETC. (IMPRINT)
Place of publication, distribution, etc Cham :
Name of publisher, distributor, etc Springer,
Date of publication, distribution, etc 2022.
300 ## - PHYSICAL DESCRIPTION
Extent xii, 209 p. ;
Other physical details ill. (b & w) ;
Dimensions 24 cm.
365 ## - TRADE PRICE
Price amount 64.99
Price type code
Unit of pricing 100.40
504 ## - BIBLIOGRAPHY, ETC. NOTE
Bibliography, etc Includes bibliographical references and index.
520 ## - SUMMARY, ETC.
Summary, etc This book provides an introduction to information theory, focussing on Shannon’s three foundational theorems of 1948–1949. Shannon’s first two theorems, based on the notion of entropy in probability theory, specify the extent to which a message can be compressed for fast transmission and how to erase errors associated with poor transmission. The third theorem, using Fourier theory, ensures that a signal can be reconstructed from a sufficiently fine sampling of it. These three theorems constitute the roadmap of the book. The first chapter studies the entropy of a discrete random variable and related notions. The second chapter, on compression and error correction, introduces the concept of coding, proves the existence of optimal codes and good codes (Shannon's first theorem), and shows how information can be transmitted in the presence of noise (Shannon's second theorem). The third chapter proves the sampling theorem (Shannon's third theorem) and looks at its connections with other results, such as the Poisson summation formula. Finally, there is a discussion of the uncertainty principle in information theory. Featuring a good supply of exercises (with solutions), and an introductory chapter covering the prerequisites, this text stems from lectures given to mathematics/computer science students at the beginning graduate level.
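For reference, and not part of the catalogue record itself: the entropy named in the summary is the standard quantity below for a discrete random variable X with probability mass function p, shown together with the standard bound of Shannon's first (source coding) theorem on the expected length L of an optimal binary prefix code, in bits per symbol.
\[
  H(X) = -\sum_{x} p(x)\,\log_2 p(x),
  \qquad
  H(X) \le L < H(X) + 1
\]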
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name as entry element Coding
Topical term or geographic name as entry element Information Theory
Topical term or geographic name as entry element Mathematics of Computing
942 ## - ADDED ENTRY ELEMENTS (KOHA)
Source of classification or shelving scheme
Item type Books
Holdings
Withdrawn status
Lost status
Source of classification or shelving scheme
Damaged status
Not for loan
Permanent location DAU
Current location DAU
Date acquired 2025-05-26
Source of acquisition KB
Cost, normal purchase price 6525.00
Full call number 004.0151 CHA
Barcode 035559
Date last seen 2025-07-11
Koha item type Books