Information Theory

Computation allows us to solve problems, and binary allows us to use machines to do it. But then what? How do we actually use binary values and computational theory to get anything done?

In the 1930s and '40s one man figured it all out and kicked off the Information Age with a single paper. His name was Claude Shannon, and he showed how we can use binary to efficiently store, encode and transmit information, and even correct the errors that creep in along the way. He is one of the most important figures in history, yet few people have heard of him.

Let’s change that!


Entropy and Quantifying Information

Now that we know how to use binary to create switches and digitally represent information, we need to ask the obvious question: is this actually worthwhile? Are we improving things, and if so, by how much? (A tiny code sketch of the idea follows this entry.)

16 minutes
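Shannon's answer is his entropy formula, H = -Σ p·log₂(p), which measures information in bits. As a quick taste of the idea (not a substitute for the video), here is a minimal Python sketch; the helper name `shannon_entropy` is just illustrative:

```python
import math

def shannon_entropy(probabilities):
    """Average bits of information per symbol drawn from this distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit per flip
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits per flip
```

The biased coin is more predictable, so each flip tells us less. That gap between "bits sent" and "bits of actual information" is exactly what compression exploits.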


Encoding and Lossless Compression

Claude Shannon showed us how to change the way we encode things in order to increase efficiency and speed up information transmission. We see how in this video. (A small encoding sketch follows this entry.)

27 minutes
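The core trick is giving short codes to frequent symbols and long codes to rare ones. Here is a minimal Python sketch of that idea; the example message and code tables are mine, not from the video:

```python
message = "AAAAAABBCD"  # 'A' is common, 'D' is rare

fixed = {"A": "00", "B": "01", "C": "10", "D": "11"}      # 2 bits per symbol
variable = {"A": "0", "B": "10", "C": "110", "D": "111"}  # prefix-free code

def encode(msg, code):
    return "".join(code[ch] for ch in msg)

def decode(bits, code):
    """Read bits until they match a codeword; prefix-free codes make this unambiguous."""
    inverse = {v: k for k, v in code.items()}
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in inverse:
            out.append(inverse[buf])
            buf = ""
    return "".join(out)

print(len(encode(message, fixed)))     # 20 bits
print(len(encode(message, variable)))  # 16 bits, same information

assert decode(encode(message, variable), variable) == message  # lossless
```

Because no codeword is a prefix of another, the shorter encoding still decodes back to the exact original: fewer bits, nothing lost.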


Correcting Errors in a Digital Transmission, Part 1

There are *always* errors during the transmission of information, digital or otherwise. Whether it's written (typos, illegible handwriting), spoken (mumbling, background noise) or digital (flipped bits), we have to account for and fix these problems. (A small error-correcting sketch follows this entry.)

15 minutes
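As a taste of the idea, here is a sketch of the simplest possible error-correcting scheme, a triple-repetition code with majority voting. This is an illustrative toy, not necessarily the scheme the video covers:

```python
def encode_repetition(bits):
    """Send every bit three times."""
    return [b for b in bits for _ in range(3)]

def decode_repetition(received):
    """Majority vote over each group of three; survives one flip per group."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
sent = encode_repetition(message)
sent[4] ^= 1  # a noisy channel flips one bit in transit
assert decode_repetition(sent) == message  # the flip is corrected
```

The cost is steep (three bits sent per bit of message), which is why smarter codes matter: they buy the same protection for far less redundancy.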