Shortly after my recent post ragging on Dembski, new reader and mathematician coheleth
wrote me asking for some pointers on learning information theory. I also got an anonymous comment asking for the same thing (hello, anonymous reader!), and who am I to say no when someone wants to talk about math?
However, I know for a fact that I have a lot of really sharp readers, many of whom are way better at this kind of thing than I am. So, you are all cordially invited to participate in a paper-reading and discussion group, right here on this very LJ.
Why information theory? Well, because it's the unsung hero of the modern age. Every form of communication we take for granted today -- telephones, cellphones, radio, the Internet, wifi, Bluetooth, you name it -- has its feet firmly planted in information theory. Information theory also helps us to make sense of the world around us, from DNA to black holes. Understanding information theory will help you to be a better scientist, even if you're not one already. On a very fundamental level, information theory has a lot to say about what we can know, and what we can do. It's theoretical math for realists.
I figure we'll start with one of the classics -- Claude Shannon's "A Mathematical Theory of Communication", available in several formats from the nice people at Bell Labs -- and work our way forward from there, letting the pace set itself. When I was doing this in grad school, we did a paper a week, reading on our own and meeting to discuss for a couple of hours once weekly. Since this is the internet, I figure discussions might go on for a couple of days, so my initial thinking is a paper every two weeks -- that's a week to read, a couple of days for discussion, then a breather before picking up the next one. But that's all assumption; I don't want to drag anyone away from a good discussion, nor do I want to rush anyone.
If you're curious about the subject but think that you're bad at math, then rejoice -- information theory is, to my mind at least, one of the easiest mathematical disciplines for laypeople to understand. It will help if you understand binary numbers (or, better yet, how bases work generally); a grasp of basic probability (i.e., how to compute the likelihood of a certain number coming up on a dice roll) will also be useful, as will ninth-grade algebra. You will also need to understand that the integral of a function is the area under the curve in the graph of that function (or, for a solid of revolution, the volume swept out by rotating that curve around an axis), and the general idea of summation (including the notion of a convergent series, which is an infinite series whose partial sums approach a single finite limit rather than growing without bound). But that's it. Seriously. (You don't have to actually understand how to compute an integral. Hell, calculus was fifteen years ago; I barely remember how to do one myself. I am way overspecialised in discrete math, and underequipped for continuous math.)
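If you want to sanity-check those prerequisites, here's a quick sketch in Python (my own toy examples, nothing from the paper) working two of them: the probability of a particular dice outcome, and the partial sums of a convergent series creeping up on their limit.

```python
from fractions import Fraction

# Basic probability: the chance of rolling a 7 with two fair six-sided
# dice is the number of favourable outcomes over all 36 equally likely pairs.
outcomes = [(a, b) for a in range(1, 7) for b in range(1, 7)]
p_seven = Fraction(sum(1 for a, b in outcomes if a + b == 7), len(outcomes))
print(p_seven)  # 1/6

# A convergent series: 1 + 1/2 + 1/4 + 1/8 + ... has partial sums that
# approach the finite limit 2, getting as close as you like.
partial = sum(Fraction(1, 2**n) for n in range(20))
print(float(partial))  # 1.9999980926513672 -- closing in on 2
```

Twenty terms already land within two millionths of the limit, which is the whole point: the sum settles down instead of running off to infinity.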
If anyone's interested but doesn't feel like they have the prerequisites down, I can post something in the next couple of days to get you up to speed; don't be shy.
Where we'll go from Shannon is anyone's guess, and depends mostly on where the discussion goes. We'll likely end up talking about coding theory and compression (as in, how ZIP files work, and how your cellphone is able to hold a reliable connexion without being clobbered by the thousands of other conversations going over the cell network). But we might also get into cryptography and cryptanalysis, information-theoretic security (as in, cryptosystems that can't be broken even if the attacker has all the computational power in the universe -- a favourite subject of enochsmiles' and mine), astronomy (radio telescopes are completely dependent on information theory), and computability theory, the latter by way of Gregory Chaitin and algorithmic information theory.
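As a teaser for why compression works at all, here's a small Python sketch (my own, not Shannon's notation) computing the entropy of a string in bits per symbol. Under the simplifying assumption that symbols are independent, entropy is a lower bound on how far any lossless compressor can squeeze the data.

```python
import math
from collections import Counter

def entropy_bits_per_symbol(text):
    """Shannon entropy: H = sum over symbols of -p * log2(p)."""
    counts = Counter(text)
    total = len(text)
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

print(entropy_bits_per_symbol("aaaa"))  # 0.0 -- no surprise, nothing to transmit
print(entropy_bits_per_symbol("abab"))  # 1.0 -- exactly one bit per symbol
print(entropy_bits_per_symbol("hello, world"))  # roughly 3 bits per symbol
```

A string of all one character carries zero information; a fair coin's worth of alternation costs one bit per symbol; English-ish text falls somewhere in between. That gap between entropy and the eight bits ZIP starts with is exactly what compression exploits.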
My goal here is to deepen and broaden understanding on all levels. In a meatspace paper-reading group that's difficult, but since the net is distributed, I'm hoping that we can address the curiosity of newbies, experts and self-proclaimed "non-math people" alike. Feel free to invite your non-LJ friends, too. (They might want to get OpenID accounts, to make discussion threads easier, but that's certainly not a requirement.)
So! You've got the link, up there in the fourth paragraph; go forth and read. It's 55 pages, so I'm thinking we might want to start with just the first part (pages 1-19 inclusive). We should definitely follow up with part two; I'm less sanguine about part three, but if there's interest, we'll do it.
I'll kick off discussion next Wednesday with some questions and maybe an observation or three. I'm looking forward to having you join us!