000 a
999 _c33057
_d33057
008 240319b si ||||| |||| 00| 0 eng d
020 _a9789811084317
082 _a621.382
_bGAZ
100 _aGazi, Orhan
245 _aInformation theory for electrical engineers /
_cOrhan Gazi.
260 _aSingapore :
_bSpringer,
_c2018.
300 _aix, 276 p. :
_bill. ;
_c25 cm
365 _b79.99
_d93.50
490 _aSignals and Communication Technology,
_x1860-4862
504 _aIncludes bibliographical references and index.
520 _aThis book explains the fundamental concepts of information theory to help students better understand modern communication technologies. It is written primarily for electrical and communication engineers working on communication subjects. The emphasis throughout is on understandability: the mathematics is kept simple and detailed, and a wealth of solved examples is provided. The book consists of four chapters. Chapter 1 explains the concepts of entropy and mutual information for discrete random variables. Chapter 2 introduces entropy and mutual information for continuous random variables, along with channel capacity. Chapter 3 is devoted to typical sequences and data compression. The channel coding theorem is one of Shannon's most important discoveries, and it is critical for electrical and communication engineers to fully comprehend it; Chapter 4 therefore focuses solely on this theorem. To gain the most from the book, readers should have a fundamental grasp of probability and random variables, without which the topics will be very difficult to follow.
650 _aCoding theory
650 _aElectrical engineering
650 _aInformation theory
942 _2ddc
_cBK