Hacker News

Exactly - the Twitter thread would be much improved if everyone read an introduction to information theory. There are 1024 bits in a kibibit (an SI kilobit is 1000), and that's enough for 2^1024 distinct messages. The interpreter of a message judges what the message means, but the Kolmogorov complexity of a message is inescapable: to squeeze out more messages you would have to modify the interpreter, which is nothing more than pre-transferring information to the receiver.
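The counting argument in miniature (a throwaway Python sketch, my own illustration, not from the thread): n bits can label exactly 2^n distinct messages, however clever the interpreter is.

```python
from itertools import product

# n bits index exactly 2**n distinct messages. An interpreter decides
# what each message *means*, but it cannot increase the count without
# information having been pre-loaded into the interpreter itself.
n = 10
messages = list(product([0, 1], repeat=n))
print(len(messages))  # 1024 == 2**10
```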


Shannon’s insight into the hyperspheres of transmission blew my mind 15 years ago, and I’m still not over it.

The idea of an ideal communication channel rivals most ideas anyone has ever had. The original paper is surprisingly accessible. Highly recommended if anyone hasn’t had a chance to give it a tour.

And here’s the paper, just to not be a tease: https://pure.mpg.de/rest/items/item_2383164/component/file_2...


This is a fantastic paper. Any tips on how to digest some of the more math-y parts of the paper? For example in understanding Theorem 1.


I guess I did take for granted that it includes a bit of calculus and discrete math. I have a little tip.

Once you refresh your knowledge of the greek alphabet, it’s not that scary. Math just needs a lot of symbols.

A good starting point would be infinite series, as this is the basis on which we imagine things bigger than a human lifetime. And it reminds us why we had to define a “limit” before an infinite summation, the one written with that strange ∑, could mean anything at all.

It took me years to see the significance of arithmetic series, but as with any ∑, you have to start somewhere! (Often at -∞ and +∞)
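Here’s the limit idea in miniature (a quick Python sketch of my own, not anything from the paper): the partial sums of the geometric series ∑ (1/2)^k creep toward 1, and that limit is exactly the value we assign to the infinite summation.

```python
# Partial sums of sum_{k=1..inf} (1/2)^k: each term halves the gap to 1.
# The "limit" (here, 1) is the value we define the infinite sum to be.
partial = 0.0
for k in range(1, 30):
    partial += 0.5 ** k
print(partial)  # after 29 terms the gap to 1 is exactly 2**-29
```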

The trick to all of it is that there are patterns declared by the underlying structures. There is not really a way to understand them without playing around with equalities and diagrams until you reach a certain zenlike moksha. Trust me, it is quite fulfilling!
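For Theorem 1 specifically, which (if I’m remembering the paper right) defines the capacity of a discrete noiseless channel as C = lim_{T→∞} log₂ N(T)/T, with N(T) the number of allowed signals of duration T, it can help to just compute the limit numerically. A sketch with a made-up two-symbol channel (symbol durations 1 and 2, so N(t) = N(t−1) + N(t−2), the Fibonacci recurrence), where the capacity works out to log₂ of the golden ratio:

```python
import math

# Toy discrete noiseless channel: two symbols, durations 1 and 2.
# N(t) = N(t-1) + N(t-2) counts the allowed signals of total duration t,
# and Theorem 1's capacity is C = lim_{T->inf} log2(N(T)) / T.
def capacity_estimate(T):
    N = [0] * (T + 1)
    N[0] = 1  # one empty signal of duration 0
    for t in range(1, T + 1):
        N[t] = N[t - 1] + (N[t - 2] if t >= 2 else 0)
    return math.log2(N[T]) / T

# Closed form: C = log2(phi), phi the golden ratio, since phi is the
# largest real root of X**-1 + X**-2 = 1.
phi = (1 + 5 ** 0.5) / 2
print(capacity_estimate(40))   # ~0.683, creeping toward...
print(math.log2(phi))          # ~0.694
```

Watching the estimate converge as T grows is, for me at least, the fastest way to believe the limit in the theorem actually exists.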



