Counting wrong

by Y. William Yu
MIT Mathematics graduate student
Posted on September 7th, 2014

Have you ever had to count an absolutely ginormous number of things and thought to yourself: eh, who cares if I’m off by a couple? If so, you’re not the only one. Because numbers can get really big, even computers, as accurate and precise as they are, have trouble too. This is an important problem in analyzing Big Data, the explosion of information on everything in the world from Internet traffic patterns to human genome sequences.

Luckily, although counting exactly right is hard, counting slightly wrong is much, much easier. With clever algorithms, computers can approximately count large numbers of items, getting a pretty good answer that’s guaranteed to probably be off by only a little. Although it sounds a bit weird to describe something with the phrase “guaranteed to probably”, many of these algorithms rely heavily on random coin flips. Much like flipping a coin a million times will probably give you around half heads, but not exactly, these algorithms get answers that are pretty close most of the time. By harnessing the power of randomness, we can accurately and precisely count things wrong.
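To make this concrete: the post doesn’t name a particular algorithm, but one classic example of counting with coin flips is Morris’s approximate counter from the late 1970s, which can count a stream of n items while storing only a tiny exponent (roughly log log n bits). The Python sketch below is an illustrative choice, not necessarily what the author has in mind:

```python
import random

def morris_count(stream):
    """Approximately count the items in a stream, Morris-style.

    Instead of keeping a full counter, we keep only a small exponent c
    and estimate the true count as 2**c - 1. Each arriving item bumps c
    with probability 1 / 2**c, i.e., by flipping an increasingly
    biased coin.
    """
    c = 0
    for _ in stream:
        if random.random() < 1.0 / (1 << c):  # heads with probability 2**-c
            c += 1
    return (1 << c) - 1  # this estimator is right on average

# Count a million things wrong, ten times over.
true_count = 1_000_000
estimates = [morris_count(range(true_count)) for _ in range(10)]
print("true count:", true_count)
print("estimates: ", estimates)
```

Run it a few times: any single estimate can be off by quite a lot (it’s always one less than a power of two), but it’s correct on average, and averaging many independent counters tightens things up. That’s exactly the “guaranteed to probably be off by only a little” flavor of guarantee.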
