While it might sound like a dumb idea, designing a computer processor that can make mistakes could be a good thing—especially where energy use is a concern.
Researchers from Rice University and Berkeley, along with collaborators in Europe and Singapore, have developed a new type of processor that is allowed to make occasional errors. That flies in the face of 50 years of conventional wisdom in chip manufacturing, but the energy savings could make the mistakes worth it, reports Laboratory Equipment.
The idea is simple: cut power use by letting processing components, like the hardware used to multiply numbers, make occasional mistakes. That makes it possible to trim away rarely used portions of the chip, circuitry that only comes into play in infrequent corner cases, which dramatically reduces power consumption.
By carefully managing the probability of errors, and restricting which types of operation are allowed to make mistakes, it's possible to build a chip that is still usable yet up to 15 times more efficient than a normal processor. Avinash Lingamneni, one of the researchers, explains to Laboratory Equipment:
"In the latest tests, we showed that pruning could cut energy demands 3.5 times with chips that deviated from the correct value by an average of 0.25 percent. When we factored in size and speed gains, these chips were 7.5 times more efficient than regular chips. Chips that got wrong answers with a larger deviation of about 8 percent were up to 15 times more efficient."
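The "pruning" idea can be sketched in software. Below is a toy model, not the researchers' actual hardware design: a hypothetical adder whose low-order carry logic has been removed, so the bottom bits are combined with a cheap bitwise OR (which can never carry) while the upper bits are added exactly. Averaged over random inputs, the relative deviation from the true sum stays tiny, which is the trade the chip makes in silicon.

```python
import random

def inexact_add(a, b, pruned_bits=8, width=32):
    """Toy model of a pruned adder: the carry logic for the low-order
    bits is removed, so those bits are combined with a bitwise OR
    (which never carries) while the upper bits are added exactly."""
    mask = (1 << pruned_bits) - 1
    low = (a & mask) | (b & mask)                     # cheap, sometimes wrong
    high = ((a >> pruned_bits) + (b >> pruned_bits)) << pruned_bits
    return (high | low) & ((1 << width) - 1)

random.seed(0)
errs = []
for _ in range(100_000):
    a = random.getrandbits(24)
    b = random.getrandbits(24)
    exact = a + b
    approx = inexact_add(a, b)
    errs.append(abs(exact - approx) / max(exact, 1))

print(f"average relative error: {100 * sum(errs) / len(errs):.5f}%")
```

With `pruned_bits=0` the adder is exact, so the accuracy-versus-hardware trade-off is a single knob, which mirrors how the researchers describe tuning deviation against efficiency.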
The biggest question, though, is how the introduction of errors would affect the user experience. Christian Enz, another researcher, explains:
"Particular types of applications can tolerate quite a bit of error. For example, the human eye has a built-in mechanism for error correction. We used inexact adders to process images and found that relative errors up to 0.54 percent were almost indiscernible, and relative errors as high as 7.5 percent still produced discernible images."
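That image experiment can be mimicked in a few lines. The sketch below is an assumption-laden software stand-in, not the team's hardware: a hypothetical 8-bit adder with its low carry chain pruned brightens a randomly generated stand-in "image", and the mean relative pixel error comes out small, in the same spirit as the deviations Enz describes as hard to discern.

```python
import random

def inexact_add8(a, b, pruned_bits=3):
    """Toy 8-bit inexact adder: the low carry chain is pruned away,
    so the bottom bits are ORed instead of summed."""
    mask = (1 << pruned_bits) - 1
    low = (a & mask) | (b & mask)
    high = ((a >> pruned_bits) + (b >> pruned_bits)) << pruned_bits
    return min(high | low, 255)        # saturate, as 8-bit image math usually does

random.seed(1)
pixels = [random.randrange(256) for _ in range(64 * 64)]   # stand-in grayscale image
exact = [min(p + 37, 255) for p in pixels]                 # brighten exactly
approx = [inexact_add8(p, 37) for p in pixels]             # brighten inexactly

rel_err = sum(abs(x - y) / max(x, 1) for x, y in zip(exact, approx)) / len(pixels)
print(f"mean relative pixel error: {100 * rel_err:.2f}%")
```

Every pixel the inexact adder gets wrong is only off by a few low-order counts, which is exactly the kind of error the eye tends to smooth over.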
While it's almost certainly the case that such technology would never appear in computers or tablets, it's easy to believe that small, low-power devices could benefit from chips like these. Just be prepared for a slightly more erratic user experience. [ACM International Conference on Computing Frontiers via Laboratory Equipment]
Image by Avinash Lingamneni, Rice University, CSEM