
Information measures : information and its description in science and engineering

By: Arndt, Christoph.
Material type: Book
Publisher: Berlin: Springer-Verlag, 2004
Description: xvi, 547 p.; index; 24 cm.
ISBN: 9783540408550
Subject(s): Information measurement | Engineering | Telecommunication | Coding theory
DDC classification: 003.5421
Summary: This book is an introduction to the mathematical description of information in science and engineering. The necessary mathematical theory is treated in a more vivid way than in the usual theorem-proof structure, which enables the reader to develop an idea of the connections between different information measures and to understand the trains of thought behind their derivation. As there exist many different ways to describe information, these measures are presented in a coherent manner. Some examples of the information measures examined are: Shannon information, applied in coding theory; the Akaike information criterion, used in system identification to determine auto-regressive models and in neural networks to identify the number of neurons; and the Cramér-Rao bound, or Fisher information, describing the minimal variances achieved by unbiased estimators. This softcover edition addresses researchers and students in electrical engineering, particularly in control and communications, physics, and applied mathematics.
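For orientation, the three measures highlighted in the summary have the following standard textbook forms; the notation below is the conventional one and is not taken from the book itself:

\[ H(X) = -\sum_{x} p(x)\,\log p(x) \qquad \text{(Shannon information, i.e. the entropy of a discrete source)} \]
\[ \mathrm{AIC} = 2k - 2\ln\hat{L} \qquad \text{(Akaike information criterion for a model with } k \text{ parameters and maximized likelihood } \hat{L}) \]
\[ \operatorname{Var}(\hat{\theta}) \ge \frac{1}{I(\theta)}, \quad I(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial\theta}\ln f(X;\theta)\right)^{2}\right] \qquad \text{(Cramér-Rao bound for an unbiased estimator, with Fisher information } I(\theta)\text{)} \]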


