Not all data are created equal. But how much information is any piece of data likely to contain? This question is central to medical testing, designing scientific experiments, and even to everyday human learning and thinking. MIT researchers have developed a new way to solve this problem, opening up new applications in medicine, scientific discovery, cognitive science, and artificial intelligence.
In theory, the 1948 paper, "A Mathematical Theory of Communication," by the late MIT Professor Emeritus Claude Shannon answered this question definitively. One of Shannon's breakthrough results is the idea of entropy, which lets us quantify the amount of information inherent in any random object, including random variables that model observed data. Shannon's results created the foundations of information theory and modern telecommunications. The concept of entropy has also proven central to computer science and machine learning.
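The article does not give the formula, but Shannon's entropy for a discrete random variable is the standard H = -Σ p(x) log₂ p(x), measured in bits. A minimal sketch (the function name and example distributions here are illustrative, not from the article):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip carries exactly 1 bit of information; a heavily
# biased coin carries less, and a certain outcome carries none.
print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47
print(shannon_entropy([1.0]))       # certain outcome: 0.0
```

The intuition matches the article's framing: the more unpredictable the data, the more information each observation can contain.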