Processors are very accurate. An MIT computer scientist says making them more error-prone could mean faster, more powerful computers.
Source: Bloomberg Business Week
By Drake Bennett
What's the square root of 10? If you had to do it in your head, you might say "a little more than three." Computers, unlike humans, don't do back-of-the-envelope calculations. They just crunch the numbers to the last requested decimal place.
Joseph Bates, however, thinks we'd be better off if we let computers make some mistakes. Bates, 55, a computer scientist at Carnegie Mellon and the MIT Media Lab, has designed a chip that does what computer engineers call "sloppy arithmetic," or guesstimating. Slightly inaccurate chips would be "much, much littler and much, much more efficient" than current chips, he says. Accurate calculation is a series of discrete tasks, such as carrying numbers when summing figures, that take up valuable processing power. By skipping some of those tasks, Bates predicts, his chip would have something like 100,000 times the computing power of a traditional processor.
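What does skipping carries look like in practice? Here is a minimal software sketch in the spirit of a "lower-part OR" adder, a standard trick from the approximate-computing literature; it is a stand-in for illustration, since Bates has not published his design. The low bits of each operand are merged without any carry logic, and only the high bits are added exactly.

    import random

    def sloppy_add(a, b, k=4):
        """Add two non-negative ints, skipping carry propagation in the
        low k bits: they are merged with a bitwise OR, which in hardware
        needs no carry chain. The result falls short of the true sum by
        at most 2**k - 1."""
        low_mask = (1 << k) - 1
        high = (a & ~low_mask) + (b & ~low_mask)  # exact add on high bits
        low = (a | b) & low_mask                  # carry-free guess for low bits
        return high + low

    # Rough check on 8-bit operands: the mean relative error is on the
    # order of 1 percent.
    pairs = [(random.randint(1, 255), random.randint(1, 255)) for _ in range(10_000)]
    errs = [(a + b - sloppy_add(a, b)) / (a + b) for a, b in pairs]
    print(f"mean relative error: {sum(errs) / len(errs):.3%}")

In hardware, the carry chain is what forces an adder's bits to wait on one another; cutting it off below some position is exactly the kind of task-skipping that trades a small, bounded error for speed and circuit area.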
With an error range around 1 percent, Bates's chip wouldn't be wildly inaccurate: One plus one might equal 2.02. In many applications, the resulting errors would either be imperceptible or automatically corrected. In digital photography and medical imaging, for instance, errors in the range of 1 percent would be invisible to the human eye. With other tasks, such as needle-in-a-haystack searches for particular images or sound files, Bates's chip could rifle through enormous databases, winnowing the list down to a few candidates for a more deliberate processor—or human being—to pick from. Bates foresees his chips being paired with traditional Intel (INTC)-style chips for this purpose. The result: smartphones with the computing power of desktops, and desktops with the power of supercomputers.
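To make the winnowing workflow concrete, here is a toy Python model of that two-pass search. The names and the uniform 1 percent noise are this sketch's assumptions, not anything Bates has described: a sloppy pass scores every item inexactly, and an exact pass re-ranks only the survivors.

    import random

    def approx(x, err=0.01):
        """Stand-in for a sloppy-chip result: the true value, off by up to 1%."""
        return x * (1.0 + random.uniform(-err, err))

    def winnow_then_verify(query, database, keep=5):
        # Pass 1 (the sloppy chip): score every item cheaply but inexactly.
        scored = sorted((approx(abs(x - query)), x) for x in database)
        shortlist = [x for _, x in scored[:keep]]
        # Pass 2 (the exact processor, or a person): re-rank only the survivors.
        return min(shortlist, key=lambda x: abs(x - query))

    database = [random.uniform(0.0, 1_000.0) for _ in range(100_000)]
    print(winnow_then_verify(123.456, database))  # nearest value to the query

The design bet is that a 1 percent error can only reshuffle near-ties, so a small shortlist almost always still contains the true best match, and the expensive exact hardware touches only a handful of items instead of the whole database.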
While he hasn't fabricated a sloppy chip yet, Bates sees the engineering as fairly basic. There's a consensus among chip engineers that, as Bob Colwell, formerly the chief designer of Intel's Pentium chips, puts it, "whatever challenges are down the hardware path are probably overcome-able." Bates says several companies are looking at the technology, though nondisclosure agreements prevent him from naming them.
Bates's central research interest has always been artificial intelligence—like many researchers, he came to the topic by reading Isaac Asimov as a boy. Growing up in Baltimore, he skipped high school and went to Johns Hopkins University at 13. In some of his earliest research, he tried to get computers to think like creative human mathematicians, reasoning through the equivalent of word problems rather than in the abstract language of sets and equations. His turn to sloppy arithmetic follows in this vein: Part of its promise is that it could help computers act more like the human brain, which takes all sorts of shortcuts to answer problems. "By allowing things to be approximate, you're a lot closer" to achieving true artificial intelligence, says Bates.