The Transformability of Information

The Great Courses

Related videos

The Science of Information - From Language to Black Holes
The science of information is the most influential, yet perhaps least appreciated field in science today. Never before in history have we been able to acquire, record, communicate, and use information in so many different forms. Never before have we had access to such vast quantities of data of every…
Horse Races and Stock Markets
One of Claude Shannon's colleagues at Bell Labs was the brilliant scientist and brash Texan John Kelly. Explore Kelly's insight that information is the advantage we have in betting on possible alternatives. Apply his celebrated log-optimal strategy to horse racing and stock trading.
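As a supplement, here is a minimal sketch of the Kelly criterion for a single binary bet; the 3-to-1 odds and the 30% win estimate are illustrative numbers, not figures from the lecture.

```python
# A minimal sketch of the Kelly (log-optimal) criterion for one binary
# bet. The scenario and numbers are illustrative, not from the course.

def kelly_fraction(p_win: float, net_odds: float) -> float:
    """Fraction of bankroll to wager on a bet paying `net_odds`-to-1
    with win probability `p_win`; a negative result means don't bet."""
    return p_win - (1.0 - p_win) / net_odds

# Example: a horse paying 3-to-1 that we believe wins 30% of the time.
f = kelly_fraction(p_win=0.30, net_odds=3.0)
print(f"Bet {f:.1%} of bankroll")  # ~6.7% of bankroll
```

Betting this fraction maximizes the expected logarithm of wealth over repeated bets, which is why the strategy is called log-optimal.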
Cryptanalysis and Unraveling the Enigma
Unravel the analysis that broke the super-secure Enigma code system used by the Germans during World War II. Led by British mathematician Alan Turing, the code breakers had to repeat their feat every day throughout the war. Also examine Claude Shannon's revolutionary views on the nature of secrecy.
Entropy and the Average Surprise
Intuition says we measure information by looking at the length of a message. But Shannon's information theory starts with something more fundamental: how surprising is the message? Through illuminating examples, discover that entropy provides a measure of the average surprise.
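A short sketch may make "average surprise" concrete: the surprise of an outcome with probability p is -log2(p), and entropy is the expected surprise. The coin examples below are illustrative.

```python
# Shannon entropy as "average surprise": each outcome contributes its
# surprise -log2(p), weighted by how often it occurs.
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally surprising per flip (1 bit); a heavily
# biased coin carries far less surprise on average.
print(entropy([0.5, 0.5]))    # 1.0 bit
print(entropy([0.99, 0.01]))  # ~0.081 bits
```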
The Meaning of Information
Survey the phenomenon of information from prehistory to the projected far future, focusing on the special problem of anti-cryptography: designing an understandable message for future humans or alien civilizations. Close by revisiting Shannon's original definition of information and asking, "What does the theory of information leave out?"
It from Bit: Physics from Information
Physicist John A. Wheeler's phrase "It from bit" makes a profound point about the connection between reality and information. Follow this idea into a black hole to investigate the status of information in a place of unlimited density. Also explore the information content of the entire universe!
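As a hedged supplement: the information content of a black hole is conventionally quantified by the Bekenstein-Hawking entropy, S = k_B c^3 A / (4 G hbar), where A is the horizon area; dividing by k_B ln 2 converts it to bits. The solar-mass example below is illustrative and not taken from the lecture.

```python
# Bekenstein-Hawking entropy of a Schwarzschild black hole, in bits.
# For horizon area A = 16*pi*G^2*M^2/c^4, S/k_B = 4*pi*G*M^2/(hbar*c).
import math

G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34   # reduced Planck constant, J s
c    = 2.998e8     # speed of light, m/s

def black_hole_bits(mass_kg: float) -> float:
    """Entropy of a non-rotating black hole of the given mass, in bits."""
    s_over_k = 4 * math.pi * G * mass_kg**2 / (hbar * c)  # S / k_B
    return s_over_k / math.log(2)

print(f"{black_hole_bits(1.989e30):.2e} bits")  # ~1.5e77 bits for one solar mass
```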
Intimate Relationships
Part of the Series: Middle Adulthood
Researchers have concluded that throughout adulthood, throughout the world, marriage is the single familial relationship most closely linked to personal happiness, health and companionship. It's no wonder, then, that the majority of middle-aged adults consider their spouse or partner their closest friend. We examine the current research on these intimate…
Computation and Logic Gates
Accompany the young Claude Shannon to the Massachusetts Institute of Technology, where in 1937 he submitted a master's thesis proving that Boolean algebra could be used to simplify the unwieldy analog computing devices of the day. Drawing on Shannon's ideas, learn how to design a simple electronic circuit that performs…
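In the spirit of Shannon's thesis, Boolean algebra can be written directly as gate functions and composed into circuits. The half adder below is a standard illustrative example, not necessarily the circuit designed in the lecture.

```python
# Boolean algebra as logic gates, composed into a one-bit half adder.

def AND(a, b): return a & b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    """Add two one-bit inputs: XOR gives the sum bit, AND the carry."""
    return XOR(a, b), AND(a, b)

# Truth table for the circuit.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```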
How Context Influences Choice
Uncover the many unexpected ways in which information context affects our decisions. What happens when our mind acts like a big shovel, scooping up all the data in its path and processing that information together, whether relevant or not? Scientific studies reveal how the choice context can lead to some…
P.S. I Can't Breathe (Black Lives Matter)
"P.S. I Can't Breathe" welcomes dialogue around racial inequality, policing, and the Criminal Justice System by focusing on Eric Garners case. We hope viewers will increase their understanding of issues plaguing Black and Brown Communities by witnessing a massive group of protesters unite for the purpose of justice. Eric Garners…
Data Compression and Prefix-Free Codes
Probe the link between entropy and coding. In the process, encounter Shannon's first fundamental theorem, which specifies how far information can be squeezed in a binary code, serving as the basis for data compression. See how this works with a text such as Conan Doyle's The Return of Sherlock Holmes.
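One concrete way to see the entropy bound at work is Huffman coding, a classic prefix-free code (a standard illustration, not necessarily the construction used in the lecture); the sample phrase below is illustrative.

```python
# A compact Huffman-coding sketch: build a prefix-free code from symbol
# frequencies. Average codeword length approaches the entropy bound of
# Shannon's first (source coding) theorem.
import heapq
from collections import Counter

def huffman_code(text):
    """Return a dict mapping each symbol to its prefix-free codeword."""
    heap = [(freq, i, {sym: ""})
            for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tiebreak = len(heap)  # unique key so tuple comparison never reaches the dict
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

text = "the return of sherlock holmes"
code = huffman_code(text)
encoded = "".join(code[ch] for ch in text)
print(f"{len(encoded)} bits vs {8 * len(text)} bits in 8-bit ASCII")
```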
Noise and Channel Capacity
One of the key issues in information theory is noise: the message received may not convey everything about the message sent. Discover Shannon's second fundamental theorem, which proves that error correction is possible and can be built into a message with only a modest slowdown in transmission rate.
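A small sketch of the simplest noisy channel, the binary symmetric channel, shows how noise caps the rate: its capacity is C = 1 - H(p) bits per channel use, where p is the flip probability and H is the binary entropy. Shannon's second theorem says reliable communication is possible at any rate below C.

```python
# Capacity of a binary symmetric channel: each transmitted bit is
# flipped with probability p, and C = 1 - H(p) bits per channel use.
import math

def binary_entropy(p: float) -> float:
    """Entropy in bits of a {p, 1-p} coin."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    return 1.0 - binary_entropy(p)

# A channel that flips 1 bit in 10 still carries ~0.53 bits per use.
print(f"{bsc_capacity(0.1):.3f}")
```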