David MacKay’s “Information Theory, Inference, and Learning Algorithms”


This is an easy book for me to recommend. David J.C. MacKay is a professor in the physics department of Cambridge University, and he is a polymath who has made important contributions in a wide variety of fields. This textbook is an excellent introduction to modern error-correcting codes, compression, statistical physics, and neural networks. It is tied together by a recurring appeal to the power of Bayesian methods.

David wrote this book over the course of many years, publishing his drafts on the web. You can still view the entire book for free on MacKay's website. But the book is very inexpensive; unless you're very poor, you'll really want to buy a copy.

As Bob McEliece (a professor at Caltech and Shannon medalist) wrote, “you’ll want two copies of this astonishing book, one for the office and one for the fireside at home.” I know this is true because I actually have two copies; I bought my own copy as soon as the book was published, and then found that David had kindly sent me a copy.


One Response to “David MacKay’s “Information Theory, Inference, and Learning Algorithms””

  1. Gallager’s LDPC error-correcting codes « Nerd Wisdom Says:

    […] I’ll only give the briefest of introductions. [Some excellent textbooks I recommend include MacKay’s textbook which I’ve already reviewed, McEliece’s “Theory of Information and Coding”, and Lin and Costello’s […]
