I find entropy fascinating, especially its relationship to information. I just read a book about Claude Shannon, who invented information theory, and it unified everything I'd previously known about the subject. Here's how I'd capture my current understanding:

Entropy is basically a measure of disorder. For the universe, the ultimate disorder is heat death, meaning there are…
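
Shannon made "disorder" quantitative: for a discrete distribution with probabilities \(p_i\), the entropy is \(H = -\sum_i p_i \log_2 p_i\) bits. As a minimal sketch (the function name is my own, not from the book):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    # Terms with p = 0 contribute nothing (the limit of p*log p is 0).
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit per flip.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# A certain outcome carries no information: 0 bits.
print(shannon_entropy([1.0]))       # 0.0
```

The same quantity is a lower bound on how many bits, on average, you need to encode samples from the distribution, which is the bridge between thermodynamic "disorder" and information.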