Information and Complexity in Statistical Modeling (Electronic book text, 2007 ed.)


No statistical model is "true" or "false," "right" or "wrong"; models simply perform with varying success, and that performance can be assessed. The main theme of this book is to teach modeling based on the principle that the objective is to extract from the data the information that can be learned with the suggested classes of probability models. The intuitive and fundamental concepts of complexity, learnable information, and noise are formalized, which provides a firm information-theoretic foundation for statistical modeling. Inspired by Kolmogorov's structure function in the algorithmic theory of complexity, this is accomplished by finding the shortest code length, called the stochastic complexity, with which the data can be encoded when advantage is taken of the models in the suggested class; this amounts to the MDL (Minimum Description Length) principle. The stochastic complexity, in turn, splits into the shortest code length for the optimal model, chosen from a set of models that can be optimally distinguished given the data, and the remainder, which defines "noise" as the incompressible part of the data carrying no useful information.
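To make the decomposition above concrete, one standard way to write the stochastic complexity is the normalized maximum-likelihood (NML) form sketched below in LaTeX. The notation is assumed here rather than quoted from this description: x^n is the data sequence, M_gamma the suggested model class, and \hat{\theta}(x^n) the maximum-likelihood estimate.

\[
\underbrace{-\log \hat{P}\bigl(x^n \mid \mathcal{M}_\gamma\bigr)}_{\text{stochastic complexity}}
=
\underbrace{-\log P\bigl(x^n \mid \hat{\theta}(x^n), \mathcal{M}_\gamma\bigr)}_{\text{fit: the incompressible ``noise''}}
+
\underbrace{\log \sum_{y^n} P\bigl(y^n \mid \hat{\theta}(y^n), \mathcal{M}_\gamma\bigr)}_{\text{model cost (parametric complexity)}}
\]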

Such a view of the modeling problem permits a unified treatment of any type of parameters, their number, and even their structure. Since only optimally distinguishable models are worthy of testing, we also obtain a logically sound and straightforward treatment of hypothesis testing, in which, for the first time, the confidence in the test result can be assessed. Although the prerequisites include only basic probability calculus and statistics, a moderate level of mathematical proficiency would be beneficial. This different and logically unassailable view of statistical modeling should provide excellent grounds for further research and suggest topics for graduate students in all fields of modern engineering, including, but not restricted to, signal and image processing, bioinformatics, pattern recognition, and machine learning.
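The description carries no code, but a minimal sketch may make the model-selection idea tangible. The function two_part_code_length and the polynomial-degree example below are hypothetical illustrations: they rank models by the common asymptotic approximation (n/2)*log(RSS/n) + (k/2)*log(n) to a two-part code length, not by the exact stochastic complexity developed in the book.

import numpy as np

def two_part_code_length(y, X):
    """Approximate two-part MDL code length for a Gaussian linear model.

    The first term plays the role of the incompressible "noise" (fit) part,
    the second the cost of describing the k fitted parameters.  This is an
    illustrative asymptotic approximation, not exact stochastic complexity.
    """
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    fit_cost = 0.5 * n * np.log(rss / n)    # "noise": residual code length
    model_cost = 0.5 * k * np.log(n)        # cost of the fitted parameters
    return fit_cost + model_cost

# Example: choose a polynomial degree by minimizing the approximate code length.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(scale=0.1, size=x.size)

best = min(range(1, 8),
           key=lambda d: two_part_code_length(y, np.vander(x, d + 1)))
print("selected polynomial degree:", best)

On such data the criterion typically settles on degree 2, the degree of the generating polynomial, because additional parameters raise the model-cost term faster than they shrink the fit term.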



Product Details

General

Imprint: Springer-Verlag New York
Country of origin: United States
Release date: 2007


Authors: Jorma Rissanen
Format: Electronic book text
Pages: 142
Edition: 2007 ed.
ISBN-13: 978-0-387-68812-1
Barcode: 9780387688121
LSN: 0-387-68812-9


