Monday, April 14, 2008

Books from my Shelf, Part 1

Commenters have told me they want some book recommendations, so here I go... Since there are a number of books, I'll be breaking this post up.

But first, a caveat: I'm not a big collector of academic books. I was spoiled by Berkeley's libraries as a graduate student, and am spoiled by Harvard and MIT's libraries now. But here are some of the books on my shelf that I find useful (besides, of course, the obvious Mitzenmacher and Upfal, which everyone must have on their shelf). Some are useful for theory, some for practice, some for... well, hopefully the descriptions will give some idea. In no particular order, here are the first 5, with more to come...

Handbook of Algorithms and Data Structures, Gonnet and Baeza-Yates: Apparently out of print, but well worth getting used. This is the state of the art in hashing, search algorithms, sorting algorithms, priority queues, text search, etc. as of 1990. And one of the best sources if you need to look up, say, the formulas for the expectation and variance for the number of accesses required under linear probing hashing to insert the (n+1)st element into a table, or pretty much anything else in that vein. Chock-full of formulas and references, this is my go-to book for looking up the basics. Someone should write an updated version of this classic.
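
To give a flavor of the kind of formula the book catalogues (the notation here is mine, and these are the standard asymptotic approximations rather than the exact expressions and variances the book tabulates): for linear probing at load factor \(\alpha = n/m\), the expected number of probes is roughly

\[
E[\text{unsuccessful search / insertion}] \approx \frac{1}{2}\left(1 + \frac{1}{(1-\alpha)^2}\right),
\qquad
E[\text{successful search}] \approx \frac{1}{2}\left(1 + \frac{1}{1-\alpha}\right).
\]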

Elements of Information Theory, Cover and Thomas: This book is the standard introductory text for information theory. And it's just a darn fine book, well written and covering all the basics, so you could, for example, skim it over and then go to information theory conferences and act as though you sort of belong.
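
As a taste of the basics the book covers, here is a minimal Python sketch (my own illustration, not from the book) of two of its central quantities, entropy and relative entropy:

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum p log2 p, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits; assumes q > 0 wherever p > 0."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

# A fair coin has one bit of entropy; a biased coin has less.
print(entropy([0.5, 0.5]))                    # 1.0
print(entropy([0.9, 0.1]))                    # ~0.469
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # ~0.737
```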

Modern Coding Theory, Richardson and Urbanke: OK, it's not yet on my shelf; I still have to get a copy. This book covers everything you've wanted to know about low-density parity-check codes, written by two of the leading experts in the field. I'm not sure I'd recommend it to everyone, but I would to everyone interested in coding theory in general (and LDPCs in particular).
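
For those who haven't run into LDPC codes: the central object is a sparse binary parity-check matrix H, and a vector c is a codeword exactly when Hc = 0 over GF(2). Below is a toy sketch of that membership test (my illustration, not from the book; it uses the small (7,4) Hamming check matrix rather than a genuinely large, sparse LDPC matrix, and the belief-propagation decoding that is the book's real subject is omitted):

```python
import numpy as np

# Binary parity-check matrix: each row is one parity constraint.
# This is the (7,4) Hamming code's check matrix, standing in for
# the much larger, sparser H of a real LDPC code.
H = np.array([
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
])

def is_codeword(c):
    """c is a codeword iff every parity check is satisfied: H c = 0 mod 2."""
    return not (H @ c % 2).any()

print(is_codeword(np.array([0, 0, 0, 0, 0, 0, 0])))  # True: all-zeros word
print(is_codeword(np.array([1, 0, 0, 0, 1, 1, 0])))  # True: satisfies all checks
print(is_codeword(np.array([1, 0, 0, 0, 0, 0, 0])))  # False: violates two checks
```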

Information Theory, Inference, and Learning Algorithms, David MacKay: This book, which I've enjoyed since it came out, now sits interestingly between the two information theory books above. It's a bit more basic and broad, covering the fundamentals of compression, coding, and Bayesian inference, while also taking on topics like LDPC codes and neural networks. The book is a bit idiosyncratic, and naturally, since I enjoy David's work, I enjoy the book.

Complex Graphs and Networks, Chung and Lu: I like having this book around so I can pull up Chapter 2 for reference -- all about tail bounds, with an emphasis on variants of martingale bounds (Azuma's inequality) that cover the more obscure situations that arise in practice and are hard to find elsewhere.
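
For reference, the baseline result that those variants generalize is Azuma's inequality (standard statement, in my notation, not copied from the book): if \(X_0, X_1, \ldots, X_n\) is a martingale with \(|X_i - X_{i-1}| \le c_i\) for each \(i\), then for any \(t > 0\),

\[
\Pr\big[\,|X_n - X_0| \ge t\,\big] \le 2\exp\!\left(-\frac{t^2}{2\sum_{i=1}^{n} c_i^2}\right).
\]

The variants collected in Chapter 2 relax hypotheses like the bounded-differences condition for settings where it fails or gives bounds that are too weak.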

1 comment:

Joe Blitzstein said...

Thanks for the great list. MacKay is one of my favorite books of all time and I love his page helping one decide whether to buy that or Harry Potter: http://www.inference.phy.cam.ac.uk/mackay/itila/Potter.html

In the idiosyncratic and packed with insight category I'd also mention David Williams' books (Probability with Martingales and Weighing the Odds).