The Science of Information
From Language to Black Holes

The Great Courses

24 episodes in this series

Episode 1 The Transformability of Information
What is information? Explore the surprising answer of American mathematician Claude Shannon, who concluded that information is the ability to distinguish reliably among possible alternatives. Consider why this idea was so revolutionary, and see how it led to the concept of the bit--the basic unit of information.
Episode 2 Computation and Logic Gates
Accompany the young Claude Shannon to the Massachusetts Institute of Technology, where in 1937 he submitted a master's thesis proving that Boolean algebra could be used to simplify the unwieldy…
Episode 3 Measuring Information
How is information measured and how is it encoded most efficiently? Get acquainted with a subtle but powerful quantity that is vital to the science of information: entropy. Measuring information…
Episode 4 Entropy and the Average Surprise
Intuition says we measure information by looking at the length of a message. But Shannon's information theory starts with something more fundamental: how surprising is the message? Through illuminating examples, discover that entropy provides a measure of the average surprise.
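To make "average surprise" concrete, here is a minimal Python sketch (my illustration, not course material): Shannon entropy is the sum of -p log2 p over the symbol probabilities, and a fair coin maximizes it at one bit per toss.

    import math

    def shannon_entropy(probs):
        """Average surprise in bits: H = -sum(p * log2 p) over nonzero probabilities."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit per toss
    print(shannon_entropy([0.9, 0.1]))   # biased coin: ~0.47 bits, far less surprising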
Episode 5 Data Compression and Prefix-Free Codes
Probe the link between entropy and coding. In the process, encounter Shannon's first fundamental theorem, which specifies how far information can be squeezed in a binary code, serving as the basis for data compression. See how this works with a text such as Conan Doyle's The Return of Sherlock Holmes.
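As a companion sketch (assuming Huffman's classic construction as a stand-in for the prefix-free codes discussed here, not necessarily the lecture's own example), the short Python below builds a prefix-free code whose average length approaches the source entropy, as Shannon's first theorem promises.

    import heapq
    from collections import Counter

    def huffman_code(text):
        """Build a prefix-free (Huffman) code table for the symbols in text."""
        freq = Counter(text)
        if len(freq) == 1:                       # degenerate one-symbol source
            return {next(iter(freq)): "0"}
        # Heap entries: (weight, tie-breaker, partial code table for the subtree)
        heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freq.items())]
        heapq.heapify(heap)
        tie = len(heap)
        while len(heap) > 1:
            w1, _, left = heapq.heappop(heap)
            w2, _, right = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in left.items()}
            merged.update({s: "1" + c for s, c in right.items()})
            heapq.heappush(heap, (w1 + w2, tie, merged))
            tie += 1
        return heap[0][2]

    codes = huffman_code("abracadabra")
    print(codes)
    print(sum(len(codes[ch]) for ch in "abracadabra"), "bits total")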
Episode 6 Encoding Images and Sounds
Learn how some data can be compressed beyond the minimum amount of information required by the entropy of the source. Typically used for images, music, and video, these techniques drastically…
Episode 7 Noise and Channel Capacity
One of the key issues in information theory is noise: the message received may not convey everything about the message sent. Discover Shannon's second fundamental theorem, which proves that error…
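One concrete instance worth keeping in mind (a sketch of my own, not the lecture's example): for the binary symmetric channel, which flips each transmitted bit with probability p, the capacity works out to C = 1 - H(p) bits per use, where H is the binary entropy.

    import math

    def binary_entropy(p):
        """Entropy of a biased coin, in bits."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_capacity(p):
        """Capacity of a binary symmetric channel with flip probability p."""
        return 1.0 - binary_entropy(p)

    for p in (0.0, 0.01, 0.1, 0.5):
        print(f"flip probability {p}: capacity {bsc_capacity(p):.3f} bits per use")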
Episode 8 Error-Correcting Codes
Dig into different techniques for error correction. Start with a game called word golf, which demonstrates the perils of mistaking one letter for another and how to guard against it.…
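As the barest illustration of the idea (not one of the episode's own techniques), a three-fold repetition code sends every bit three times and decodes by majority vote, correcting any single flipped bit per block at the cost of tripling the message length.

    def encode(bits):
        """Repetition code: send every bit three times."""
        return [b for b in bits for _ in range(3)]

    def decode(received):
        """Majority vote over each block of three received bits."""
        return [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

    message = [1, 0, 1, 1]
    sent = encode(message)
    sent[4] ^= 1                     # noise flips one transmitted bit
    print(decode(sent) == message)   # True: the single error is corrected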
Episode 9 Signals and Bandwidth
Twelve billion miles from Earth, the Voyager spacecraft is sending back data with just a 20-watt transmitter. Make sense of this amazing feat by delving into the details of the…
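The formula governing such link budgets is the Shannon-Hartley capacity C = B log2(1 + S/N). The numbers below are purely illustrative and are not Voyager's actual figures.

    import math

    def shannon_hartley(bandwidth_hz, snr_linear):
        """Channel capacity in bits per second: C = B * log2(1 + S/N)."""
        return bandwidth_hz * math.log2(1 + snr_linear)

    # Illustrative only: a 1 kHz channel whose signal is a tenth as strong as the noise
    # can still, in principle, carry on the order of a hundred bits per second.
    print(shannon_hartley(1_000, 0.1))   # ~137.5 bits per second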
Episode 10 Cryptography and Key Entropy
The science of information is also the science of secrets. Investigate the history of cryptography starting with the simple cipher used by Julius Caesar. See how entropy is a useful…
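For reference, the Caesar cipher mentioned here simply shifts each letter a fixed number of places around the alphabet; a minimal Python sketch (my own, not course material):

    def caesar(text, shift):
        """Shift each letter by `shift` places; non-letters pass through unchanged."""
        out = []
        for ch in text:
            if ch.isalpha():
                base = ord("A") if ch.isupper() else ord("a")
                out.append(chr((ord(ch) - base + shift) % 26 + base))
            else:
                out.append(ch)
        return "".join(out)

    secret = caesar("VENI VIDI VICI", 3)
    print(secret)               # YHQL YLGL YLFL
    print(caesar(secret, -3))   # shifting back recovers the plaintext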
Episode 11 Cryptanalysis and Unraveling the Enigma
Unravel the analysis that broke the super-secure Enigma code system used by the Germans during World War II. Led by British mathematician Alan Turing, the code breakers had to repeat…
Episode 12 Unbreakable Codes and Public Keys
The one-time pad may be in principle unbreakable, but consider the common mistakes that make this code system vulnerable. Focus on the Venona project that deciphered Soviet intelligence messages encrypted…
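A minimal sketch of the one-time pad itself (assuming XOR over bytes as the combining operation): with a truly random key as long as the message and never reused, the same operation both encrypts and decrypts, and it is reuse mistakes of the kind discussed here that make it breakable.

    import secrets

    def one_time_pad(data: bytes, key: bytes) -> bytes:
        """XOR the message with an equal-length key; applying it twice decrypts."""
        assert len(key) == len(data), "key must be exactly as long as the message"
        return bytes(m ^ k for m, k in zip(data, key))

    message = b"ATTACK AT DAWN"
    key = secrets.token_bytes(len(message))   # fresh random key, used only once
    ciphertext = one_time_pad(message, key)
    print(one_time_pad(ciphertext, key))      # b'ATTACK AT DAWN'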
Episode 13 What Genetic Information Can Do
Learn how DNA and RNA serve as the digital medium for genetic information. Also see how shared features of different life forms allow us to trace our origins back to…
Episode 14 Life’s Origins and DNA Computing
DNA, RNA, and the protein molecules they assemble are so interdependent that it's hard to picture how life got started in the first place. Survey a selection of intriguing theories,…
Episode 15 Neural Codes in the Brain
Study the workings of our innermost information system: the brain. Take both top-down and bottom-up approaches, focusing on the world of perception, experience, and external behavior on the one hand…
Episode 16 Entropy and Microstate Information
Return to the concept of entropy, tracing its origin to thermodynamics, the branch of science dealing with heat. Discover that here the laws of nature and information meet. Understand the…
Episode 17 Erasure Cost and Reversible Computing
Maxwell's demon has startling implications for the push toward ever-faster computers. Probe the connection between the second law of thermodynamics and the erasure of information, which turns out to be…
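The quantitative result usually cited in this connection is Landauer's bound: erasing one bit of information dissipates at least kT ln 2 of energy at temperature T. A back-of-envelope calculation (illustrative only):

    import math

    BOLTZMANN = 1.380649e-23   # joules per kelvin

    def landauer_limit(temperature_kelvin):
        """Minimum energy in joules to erase one bit: k * T * ln 2."""
        return BOLTZMANN * temperature_kelvin * math.log(2)

    print(landauer_limit(300))   # ~2.9e-21 J per bit erased at room temperature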
Episode 18 Horse Races and Stock Markets
One of Claude Shannon's colleagues at Bell Labs was the brilliant scientist and brash Texan John Kelly. Explore Kelly's insight that information is the advantage we have in betting on…
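The result behind Kelly's insight is the Kelly criterion. In its simplest form (my notation, not necessarily the lecture's), a bet paying b-to-1 that you win with probability p should receive the fraction f* = p - (1 - p)/b of your bankroll, and with no informational edge the optimal stake is zero.

    def kelly_fraction(p, b):
        """Growth-optimal fraction of bankroll to stake: win probability p, odds b-to-1."""
        return p - (1 - p) / b

    print(kelly_fraction(0.60, 1.0))   # 0.20: a 60% edge at even odds -> bet 20%
    print(kelly_fraction(0.50, 1.0))   # 0.00: no edge -> do not bet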
Episode 19 Turing Machines and Algorithmic Information
Contrast Shannon's code- and communication-based approach to information with a new, algorithmic way of thinking about the problem in terms of descriptions and computations. See how this idea relates to…
Episode 20 Uncomputable Functions and Incompleteness
Algorithmic information is plagued by a strange impossibility that shakes the very foundations of logic and mathematics. Investigate this drama in four acts, starting with a famous conundrum called the…
Episode 21 Qubits and Quantum Information
Enter the quantum realm to see how this revolutionary branch of physics is transforming the science of information. Begin with the double-slit experiment, which pinpoints the bizarre behavior that makes…
Episode 22 Quantum Cryptography via Entanglement
Learn how a feature of the quantum world called entanglement is the key to an unbreakable code. Review the counterintuitive rules of entanglement. Then play a game based on The…
Episode 23 It from Bit: Physics from Information
Physicist John A. Wheeler's phrase "It from bit" makes a profound point about the connection between reality and information. Follow this idea into a black hole to investigate the status of information in a place of unlimited density. Also explore the information content of the entire universe!
Episode 24 The Meaning of Information
Survey the phenomenon of information from pre-history to the projected far future, focusing on the special problem of anti-cryptography--designing an understandable message for future humans or alien civilizations. Close by revisiting Shannon's original definition of information and ask, "What does the theory of information leave out?"
