Computing Reviews
Introduction to Bayesian scientific computing: ten lectures on subjective computing
Calvetti D., Somersalo E., Springer-Verlag New York, Inc., Secaucus, NJ, 2007. Type: Book (9780387733937)
Date Reviewed: Feb 4 2008

This witty, erudite, and surprisingly practical book is made up of ten chapters. The first five address fundamentals (from the ground up, but without proofs), and the second five chapters provide elaborations or, in my words, advanced topics (though the entire book belongs in the advanced category). One off-the-bat surprise was that I found the intentional omission of proofs, in favor of what the authors convincingly explain as exactness, to be a crucial component of the book’s effectiveness—an effectiveness also supported by pithy content, excellent writing, and smooth flow. The Springer series, of which this book is the second volume, “provides an outlet for material less formally presented ... than finished texts or monographs, yet of immediate interest.” This book fits that characterization very well.

A central topic of the book is the relationship between statistical inference and the inverse problems that define Bayesian (subjective) statistics. I urge potential readers not to skip this book on the basis that the subtitle has the word “subjective”; your scientific education and enrichment will suffer if you do so. (The authors do not use this off-putting word in, for example, the same way that today’s dilettantes claim to incorporate consciousness into quantum mechanics.)

As is the case with the vast majority of students and professionals in science and engineering, my exposure to probability and statistics was through Feller's [1] treatment in the case of probability, and through that body of conventional statistical techniques whose mathematics is summarized in, for example, Hogg and Craig [2]. "Frequentist" is the name given to this majority view. A serendipitous discovery, later in life than I care to make public, was a powerful eye-opener regarding the philosophies of Laplace [3], Bayes, H. Jeffreys [4], J.M. Keynes [5] as probabilist, R. Cox [6], and B. De Finetti [7], to whose legacy this book is dedicated. This enumeration is not meant to suggest total agreement among these leading lights, but only a degree of commonality based on the Laplace-Bayes view of statistical inference, with which the highly charged word "subjective" has, for better or worse, been associated in modern times.

Chapter 1, “Inverse Problems and Subjective Computing,” introduces one (preparatory) characterization of randomness (elaborated in chapter 3): lack of information. The authors also complement this characterization with a good but gentle exposition of Kolmogorov’s axiomatic basis of probability, and a clear explication of the notions of random variable, probability space, and state space.

Chapter 2, “Basic Problems of Statistical Inference,” states the fundamental problem of statistical inference, mainly from the frequentist perspective, and gives a clear distinction between the parametric and nonparametric approaches. Two roles of maximum likelihood (ML) estimation are introduced: first, as a basic problem in linear algebra, and, second, as the “key demarcation line between the frequentist and the Bayesian interpretation” of the parameter estimates. “Credibility intervals” are introduced as the Bayesian counterpart of confidence intervals.
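The ML-as-linear-algebra point can be illustrated with a minimal sketch (my own, not an example from the book): for a linear model with Gaussian noise, the maximum likelihood estimate of the unknown parameters is exactly the least-squares solution, a pure linear-algebra computation.

```python
import numpy as np

# Hypothetical illustration (not from the book): for the linear model
# b = A x + noise, with independent Gaussian noise, maximizing the
# likelihood over x is equivalent to minimizing ||A x - b||^2,
# so the ML estimate is the least-squares solution.
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 3))          # design matrix
x_true = np.array([1.0, -2.0, 0.5])   # "unknown" parameters
b = A @ x_true + 0.01 * rng.normal(size=50)  # noisy observations

# Least-squares solve = ML estimate under the Gaussian noise model.
x_ml, *_ = np.linalg.lstsq(A, b, rcond=None)
```

With low noise, `x_ml` recovers `x_true` closely; the Bayesian reading of the same computation treats `x` itself as a random variable, which is the demarcation line the chapter describes.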

Chapter 3, “The Praise of Ignorance: Randomness as Lack of Information,” is the Bayesian answer to the previous chapter: “[R]andomness simply means a lack of information.” Any quantity that is not known exactly is a random variable: this includes parameters (such as the mean or variance), which are seen as “realizations of random variables.”

Chapter 4 is an excellent summary of numerical linear algebra, and includes approaches to ill-conditioned problems. The cadence will give people like me renewed courage to attack the more troublesome numerical analysis problems that so often end in frustration and defeat. This chapter lays the groundwork for Bayesian conditioning of equations that involve error-laden data.

Chapter 5, “Sampling: First Encounter,” is a sophisticated treatment of sampling proper from a given distribution, starting with a prior belief regarding that distribution, and (somewhat separately) addressing the use of sampling in multi-dimensional numerical integration. The principal idea of the subsequent chapter 6, “Statistically Inspired Preconditioners,” is how Bayes helps linear algebra through the modification of the linear system using prior information.
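The idea of prior information rescuing an ill-conditioned linear system can be sketched as follows. This is my own minimal illustration, not the authors' construction: a Gaussian prior on the unknown turns an ill-posed least-squares problem into a well-posed (Tikhonov-regularized) one.

```python
import numpy as np

# Minimal sketch (an assumption-laden illustration, not the book's method):
# a Gaussian prior x ~ N(0, (1/alpha) I) on the unknown turns the
# ill-conditioned system A x = b into the well-posed MAP problem
#     min_x ||A x - b||^2 + alpha ||x||^2,
# whose solution is x = (A^T A + alpha I)^{-1} A^T b.
def map_estimate(A, b, alpha):
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

# An ill-conditioned matrix: nearly linearly dependent columns.
A = np.array([[1.0, 1.0],
              [1.0, 1.0001],
              [1.0, 0.9999]])
b = np.array([2.0, 2.0001, 1.9999])

x_map = map_estimate(A, b, alpha=1e-6)
```

The prior term `alpha * np.eye(n)` is exactly the kind of statistically motivated modification of the system matrix that the chapter's preconditioners generalize.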

My favorite, Section 7.3, “Envelopes, White Swans and Dark Matter,” is in the chapter titled “Conditional Gaussian Densities and Predictive Envelopes.” The belief envelope (Figure 7.4) associated with the question of extrapolating, to very small and to very large masses, gravity’s proportionality to mass is “amazingly wide.” Missing dark matter may indeed be the proverbial white swan.

The final three chapters cover more Gaussian conditioning, “Sampling: The Real Thing,” hypermodels, and Bayesian learning, as these topics were, dare I say, preconditioned by the previous chapters.

This excellent book will be valuable to scientists of various stripes, statisticians, numerical analysts, those who work in image processing, and those who implement Bayesian belief nets.

Reviewer: George Hacken. Review #: CR135218 (0811-1048)
1) Feller, W. An introduction to probability theory and its applications (2nd ed.). Wiley, New York, NY, 1957.
2) Hogg, R.V.; Craig, A.T. Introduction to mathematical statistics (5th ed.). Prentice Hall, Englewood Cliffs, NJ, 1995.
3) Jaynes, E.T.; Bretthorst, G.L. (Ed.) Probability theory: the logic of science. Cambridge University Press, New York, NY, 2003.
4) Jeffreys, H. Theory of probability (3rd ed.). Oxford University Press, New York, NY, 1998.
5) Keynes, J.M. A treatise on probability. Dover Publications, Mineola, NY, 2004.
6) Cox, R.T. The algebra of probable inference. Johns Hopkins University Press, Baltimore, MD, 2001.
7) De Finetti, B. Theory of probability: a critical introductory treatment. Wiley, New York, NY, 1974.
Categories: Probability And Statistics (G.3); General (F.2.0); Modes Of Computation (F.1.2)