Probability is the basis not only of statistics but of almost every quantitative discipline, including mathematics, physics, chemistry, biology, engineering, and marketing.

In general, we rely on randomness behaving predictably in the aggregate. Entropy is a measure of disorder or randomness. An everyday example is our reliance on air molecules spreading themselves evenly throughout a room, which is critically important to our ability to breathe: if air typically arranged itself in a pattern that excluded part of the room, some people would suffocate. Fortunately, nature is reliably random.

This foundational subject is covered in a nice introduction by Samuel Goldberg.

__Probability: An Introduction__ was originally published in 1960 but has been republished by Dover since 1986. The current cover price is only $16.95, so you get a lot of bang for your buck with this book.

Although the style is stodgy, the information is solid. The book contains five chapters, covering sets, probability in finite sample spaces, sophisticated counting (combinations and permutations), random variables (which, formally, are functions on the sample space), and binomial distributions.
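To give a flavor of the "sophisticated counting" chapter, here is a minimal sketch of permutations and combinations using Python's standard library (the examples and numbers below are my own illustrations, not taken from the book):

```python
from math import comb, perm

# Permutations: ordered arrangements of r items chosen from n.
# Example: ways to award gold, silver, and bronze among 10 runners.
print(perm(10, 3))  # 10 * 9 * 8 = 720

# Combinations: unordered selections of r items from n.
# Example: number of distinct 5-card hands from a 52-card deck.
print(comb(52, 5))  # 2598960
```

The distinction the book hammers home is exactly this one: permutations count order, combinations do not, so `perm(n, r)` is always `comb(n, r)` times `r!`.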

After reading this book, you will understand the theoretical basis of an introductory statistics course. The book covers the theory behind expected values (means), variance and standard deviation, z values in the normal distribution, and much more. It is the best explanation of combinations and permutations I have ever encountered. Overall, reading the book was very satisfying.
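As a small illustration of the kind of result the book derives, the mean and variance of a binomial random variable can be computed directly from the definition of expected value and then checked against the closed forms E[X] = np and Var(X) = np(1 − p). This sketch is my own, not code from the book:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.3

# Expected value and variance computed from first principles,
# summing over the whole sample space {0, 1, ..., n}.
mean = sum(k * binomial_pmf(k, n, p) for k in range(n + 1))
var = sum((k - mean)**2 * binomial_pmf(k, n, p) for k in range(n + 1))

# These should match the closed forms n*p and n*p*(1-p).
print(round(mean, 6), round(var, 6))  # 3.0 2.1
```

Working through sums like these by hand is essentially what the book's later chapters ask of the reader.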

If you can get past the style, this book is worth the time and money.

Happy reading!
