Using a computer-like system made from engineered DNA, scientists have computed the square root of 900.

Biologists have proposed using genetic material to perform computations since as early as 1994. Since then, they've found ways to store bits of information in DNA and manipulate those bits via the same rules of logic that computers use. But, according to a recent paper in the journal *Small*, it has been difficult to integrate this logic into a circuit that can perform demanding mathematical operations. The researchers think their platform is a step toward a future of DNA-based computers that might even supplant silicon.

“DNA computing is still in its infancy, but holds great promises for solving problems that are too difficult or even impossible to handle by current silicon-based computers,” Chunlei Guo, one of the study’s authors from the University of Rochester, told Gizmodo in an email.

The computer is basically a vial of custom DNA strands designed to bind with further custom DNA strands that serve as the input, and then fluoresce with a combination of up to five different wavelengths of light, depending on which input strands are present. Unlike your computer, which represents each bit as the presence or absence of voltage in a transistor, this system represents each bit as the presence or absence of an entire corresponding strand of DNA. To calculate the square root of 1, you just put in strand A; to calculate the square root of 484, which is 0111100100 in binary, you would input strands C, F, G, H, and I to represent the 1s and leave out strands A, B, D, E, and J to represent the 0s.
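The input encoding above can be sketched in a few lines of Python. This is a hypothetical illustration, not code from the paper: each of the ten binary digits of the input corresponds to one strand label (A through J), with strand A standing for the least significant bit, which is consistent with both examples in the article (1 maps to strand A alone, and 484 maps to C, F, G, H, and I).

```python
STRANDS = "ABCDEFGHIJ"  # strand A = least significant bit (assumed ordering)

def strands_for(n):
    """Return the set of strand labels to add to the vial for input n (0 <= n < 1024)."""
    return {STRANDS[i] for i in range(10) if (n >> i) & 1}

print(sorted(strands_for(484)))  # ['C', 'F', 'G', 'H', 'I']
print(sorted(strands_for(1)))    # ['A']
```

Strands left out of the returned set represent the 0 bits, mirroring how the physical system encodes a 0 by simply omitting a strand.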

Based on these inputs, the platform fluoresces with one or more of five possible wavelengths of light (blue, orange, mustard, red, and green), which encode the five-digit output. The presence or absence of each wavelength represents a binary 1 or 0, respectively. So in the case of 484, inputting strands C, F, G, H, and I results in an output of blue, mustard, and red light, but no orange or green light, representing the five-bit binary number 10110 (aka 22, the square root of 484).
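The output decoding works the same way in reverse. As a hypothetical sketch (again, not code from the paper), treat each of the five colors as one binary digit; the ordering below, with blue as the most significant bit, is an assumption, but it is consistent with the 484 example, where blue, mustard, and red light yield 10110:

```python
COLORS = ["blue", "orange", "mustard", "red", "green"]  # MSB first (assumed order)

def decode(lit_colors):
    """Convert the set of observed wavelengths into the five-bit output value."""
    bits = "".join("1" if color in lit_colors else "0" for color in COLORS)
    return int(bits, 2)

print(decode({"blue", "mustard", "red"}))  # 22, i.e. binary 10110
```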


Ten binary digits can represent numbers up to 1,023. The researchers were able to compute square roots of perfect squares up to 900.

This isn’t a calculator and cannot do general math; it’s a single-purpose system that uses lookup tables to translate a selection of DNA strands into a corresponding light pattern. It’s also just one of several ways to turn DNA into a computer; other approaches incorporate enzymes or self-assembling DNA strands.
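The lookup-table idea can be illustrated with a toy sketch: rather than computing anything, the system simply maps each valid input to a pre-defined output. Here the table is built in advance for every perfect square up to 900, mirroring the range demonstrated in the paper (the table construction itself is just an illustration, not the paper's method):

```python
# Precomputed mapping from perfect square to its root, for 1^2 through 30^2.
SQRT_TABLE = {n * n: n for n in range(1, 31)}  # {1: 1, 4: 2, ..., 900: 30}

print(SQRT_TABLE[484])  # 22
print(SQRT_TABLE[900])  # 30
```

An input outside the table (say, 485) has no entry at all, just as the physical system only responds meaningfully to the strand combinations it was designed for.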

Still, such a system is difficult to create: it requires that each input be specially encoded so as not to react with the other inputs or produce an erroneous result. The researchers hope that one day, based on this design concept, they can perform more complex math.