Computation and Logic Gates

The Great Courses
Related videos

Turing Machines and Algorithmic Information
Contrast Shannon's code- and communication-based approach to information with a new, algorithmic way of thinking about the problem in terms of descriptions and computations. See how this idea relates to Alan Turing's theoretical universal computing machine, which underlies the operation of all digital computers.
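Turing's universal machine can be conveyed with a minimal simulator sketch. The state names, tape encoding, and the bit-inverting example machine below are illustrative choices, not taken from the lecture:

```python
# Minimal one-tape Turing machine simulator (illustrative sketch).
# A machine is just a transition table: (state, symbol) -> (new_state, write, move).

def run_tm(tape, transitions, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in transitions:
            break  # halt when no rule applies
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    positions = sorted(cells)
    return "".join(cells[i] for i in range(positions[0], positions[-1] + 1)).strip(blank)

# Example machine: invert every bit, moving right until the first blank.
INVERT = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
}

print(run_tm("1011", INVERT))  # -> 0100
```

Because the transition table is ordinary data, the same simulator runs any machine you feed it; that interchangeability is the essence of universality.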
The Science of Information - From Language to Black Holes
The science of information is the most influential, yet perhaps least appreciated field in science today. Never before in history have we been able to acquire, record, communicate, and use information in so many different forms. Never before have we had access to such vast quantities of data of every…
Horse Races and Stock Markets
One of Claude Shannon's colleagues at Bell Labs was the brilliant scientist and brash Texan John Kelly. Explore Kelly's insight that information is the advantage we have in betting on possible alternatives. Apply his celebrated log-optimal strategy to horse racing and stock trading.
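For a single bet, Kelly's log-optimal stake has a closed form. The sketch below, with illustrative odds and win probabilities, computes that fraction and the expected log growth it maximizes:

```python
import math

def kelly_fraction(p, b):
    """Kelly-optimal fraction of bankroll to stake on a bet that pays
    b-to-1 and wins with probability p (0 if the edge is negative)."""
    q = 1.0 - p
    return max(0.0, (b * p - q) / b)

def expected_log_growth(f, p, b):
    """Expected log wealth growth per bet when staking fraction f."""
    q = 1.0 - p
    return p * math.log(1 + f * b) + q * math.log(1 - f)

# A horse paying even money (b = 1) that we believe wins 55% of the time:
f = kelly_fraction(0.55, 1.0)
print(round(f, 2))  # -> 0.1
```

Staking more or less than this fraction lowers the long-run growth rate, which is why Kelly bettors neither go all-in on a small edge nor ignore it.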
Cryptanalysis and Unraveling the Enigma
Unravel the analysis that broke the super-secure Enigma code system used by the Germans during World War II. Led by British mathematician Alan Turing, the code breakers had to repeat their feat every day throughout the war. Also examine Claude Shannon's revolutionary views on the nature of secrecy.
The Transformability of Information
What is information? Explore the surprising answer of American mathematician Claude Shannon, who concluded that information is the ability to distinguish reliably among possible alternatives. Consider why this idea was so revolutionary, and see how it led to the concept of the bit--the basic unit of information.
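Shannon's definition leads directly to the bit: distinguishing reliably among N equally likely alternatives takes log2(N) bits. A small sketch (the helper name is ours):

```python
import math

def bits_to_distinguish(n):
    """Whole bits needed to distinguish n equally likely alternatives."""
    return math.ceil(math.log2(n))

print(bits_to_distinguish(2))    # -> 1  (one yes/no question)
print(bits_to_distinguish(26))   # -> 5  (a letter of the alphabet)
print(bits_to_distinguish(256))  # -> 8  (one byte's worth of alternatives)
```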
Life’s Origins and DNA Computing
DNA, RNA, and the protein molecules they assemble are so interdependent that it's hard to picture how life got started in the first place. Survey a selection of intriguing theories, including the view that genetic information in living cells results from eons of natural computation.
Entropy and the Average Surprise
Intuition says we measure information by looking at the length of a message. But Shannon's information theory starts with something more fundamental: how surprising is the message? Through illuminating examples, discover that entropy provides a measure of the average surprise.
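The "average surprise" idea is Shannon entropy: each outcome with probability p carries surprise -log2(p), and entropy is the probability-weighted average. A minimal sketch with illustrative distributions:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the average surprise -log2(p) over outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally surprising; a heavily biased one barely surprises us.
print(entropy([0.5, 0.5]))            # -> 1.0
print(round(entropy([0.9, 0.1]), 3))  # -> 0.469
```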
Conditionals and Boolean Expressions
Episode 3 of How to Program
Any time a computer takes different paths depending on your response, there is usually a conditional statement involved. Delve into these widely used tools, looking at branching points, comparisons, if/then statements, nesting conditionals, and Boolean (true/false) expressions.
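The branching tools the episode covers look like this in practice. The weather-themed example is our own; the lecture's examples may differ:

```python
def describe_weather(celsius, humid):
    """Branch on comparisons and Boolean expressions to pick a label."""
    if celsius < 0:
        label = "freezing"
    elif celsius < 20:
        label = "cool"
    else:
        # Nested conditional combined with a Boolean (true/false) expression:
        if humid and celsius > 30:
            label = "sweltering"
        else:
            label = "warm"
    return label

print(describe_weather(-5, False))  # -> freezing
print(describe_weather(35, True))   # -> sweltering
```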
The Meaning of Information
Survey the phenomenon of information from pre-history to the projected far future, focusing on the special problem of anti-cryptography--designing an understandable message for future humans or alien civilizations. Close by revisiting Shannon's original definition of information and ask, "What does the theory of information leave out?"
Signals and Bandwidth
Twelve billion miles from Earth, the Voyager spacecraft is sending back data with just a 20-watt transmitter. Make sense of this amazing feat by delving into the details of the Nyquist-Shannon sampling theorem, signal-to-noise ratio, and bandwidth--concepts that apply to many types of communication.
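Two of the episode's concepts reduce to one-line formulas: the Shannon-Hartley capacity C = B log2(1 + S/N), and the Nyquist-Shannon rule that a signal must be sampled at least twice its highest frequency. The channel numbers below are illustrative, not Voyager's actual link budget:

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity in bits/second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

def nyquist_rate(max_freq_hz):
    """Minimum sampling rate: twice the highest frequency present."""
    return 2 * max_freq_hz

# A hypothetical 3 kHz channel with a linear SNR of 1000 (30 dB):
print(round(channel_capacity(3000, 1000)))  # -> 29902 bits/second
print(nyquist_rate(4000))                   # -> 8000 samples/second
```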
Noise and Channel Capacity
One of the key issues in information theory is noise: the message received may not convey everything about the message sent. Discover Shannon's second fundamental theorem, which proves that error correction is possible and can be built into a message with only a modest slowdown in transmission rate.
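Shannon's noisy-channel theorem quantifies that "modest slowdown": a binary symmetric channel that flips each bit with probability p has capacity 1 - H(p) bits per use, where H is the binary entropy function. A minimal sketch:

```python
import math

def binary_entropy(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel flipping each bit with prob p."""
    return 1.0 - binary_entropy(p)

# Even a channel that corrupts 10% of bits keeps over half its capacity,
# so suitable error-correcting codes can still transmit reliably.
print(round(bsc_capacity(0.1), 3))  # -> 0.531
```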