For the First Time an AI Machine Identified Galaxies All on Its Own


Researchers in the UK have developed a computer system that can scan images of outer space and classify galaxy types on its own, without any human help. This image-recognition AI could help develop robots that “see” better on their own, possibly helping doctors spot tumors or airport security spot firearms.


For example, in the image above, the computer can figure out which galaxies are the “elliptical” type (the yellow specks) and which are spiral, star-forming ones (the blue specks). This is a big deal, because the machine intuitively does what humans do, except far faster and without supervision. Using gizmos to classify images in space isn’t anything new, but this is the first time a machine has done so entirely on its own.

The process is called, appropriately enough, “unsupervised machine learning.” This particular project was done by astronomers and computer scientists at the University of Hertfordshire. The AI pulls from Hubble Space Telescope images and classifies the galaxies automatically using an algorithm that took about a year to develop.
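To get a feel for what “unsupervised” means here, consider a toy sketch: a clustering algorithm is handed unlabeled measurements and finds the groups on its own, with nobody ever telling it which galaxy is which. The snippet below is a minimal, hypothetical illustration using a tiny k-means on made-up “color” values (redder for ellipticals, bluer for spirals); it is not the Hertfordshire team’s actual algorithm, and the numbers are invented for the demo.

```python
import random

random.seed(0)

# Hypothetical toy data: one "color" number per galaxy. Elliptical
# galaxies tend to be redder (higher value), spiral star-forming
# galaxies bluer (lower value). These values are illustrative only.
ellipticals = [random.gauss(2.0, 0.2) for _ in range(50)]
spirals = [random.gauss(0.8, 0.2) for _ in range(50)]
colors = ellipticals + spirals  # the algorithm never sees the labels

def kmeans_1d(data, k=2, iters=20):
    """Minimal 1-D k-means: group data without any labels."""
    # Deterministic start: spread initial centers across the data range.
    centers = [min(data), max(data)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in data:
            nearest = min(range(k), key=lambda i: abs(x - centers[i]))
            clusters[nearest].append(x)
        # Move each center to the mean of the points assigned to it.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

centers, clusters = kmeans_1d(colors)
print(sorted(round(c, 2) for c in centers))  # two groups emerge on their own
```

The point of the sketch is only that the two populations fall out of the data itself; the real system works on full telescope images rather than a single number per object.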

Using this technique, the machine “separates” individual objects within a larger scene, and then gets better at identifying those objects over time. Last week, we reported on Microsoft doing something similar with augmented reality: a robot could eventually learn to immediately distinguish a coffee mug from other objects, just as a human would.

“The key novel aspect is that it is ‘unsupervised’, where we have taught the machine the basic principles of how to ‘look’ at the image,” Jim Geach, one of the researchers, told Gizmodo. But the algorithm could be even sharper than humans, because it could detect abnormalities people might not.

Down the road, the team wants to get collaborators on board to apply the technique to other uses: helping self-driving cars navigate their surroundings, security personnel find suspect items in scans, and doctors locate tumors.

“That could include ultrasound, micrographs, CAT scans, MRI—really any imaging data set where one might be looking for patterns. Again, the key thing is that this algorithm could search for very subtle features buried in the data that a human might miss,” Geach said. “All the time the machine could also be learning, so it continuously improves ‘what it knows’, thus making it better at discovering abnormalities.”


Image via Royal Astronomical Society



