
University Loses Valuable Supercomputer Research After Backup Error Wipes 77 Terabytes of Data

Kyoto University in Japan recently suffered a backup malfunction that wiped out a whole lot of valuable research data.

Photo: STR/JIJI PRESS/AFP (Getty Images)

Kyoto University, a top research institute in Japan, recently lost a whole bunch of research after its supercomputer system accidentally wiped out a whopping 77 terabytes of data during what was supposed to be a routine backup procedure.

That malfunction, which occurred sometime between Dec. 14 and Dec. 16, erased approximately 34 million files belonging to 14 different research groups that had been using the school’s supercomputing system. The university operates Hewlett Packard Enterprise (HPE) Cray computing systems and a DataDirect ExaScaler storage system, which research teams use for large-scale computation and data storage.

It’s unclear exactly which files were deleted or what caused the malfunction, though the school has said that the work of at least four research groups cannot be restored.


BleepingComputer, which originally reported on this incident, helpfully points out that supercomputing research is, uh, not super cheap, either, costing somewhere in the neighborhood of hundreds of dollars per hour to operate.

Kyoto, which is one of the most highly regarded schools in Japan and receives significant grants and funding, originally published details about its unfortunate incident in mid-December.


“Dear Supercomputing Service Users,” the post begins (translated to English via Google). “Today, a bug in the backup program of the storage system caused an accident in which some files in /LARGE0 were lost. We have stopped processing the problem, but we may have lost nearly 100TB of files, and we are investigating the extent of the impact.”
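The university hasn’t said what the bug actually was, but cleanup and backup scripts have a well-known failure mode that illustrates how this kind of accident happens: a script that prunes old files gets run with an empty or unset target path and quietly widens its deletion scope. The sketch below is a hypothetical illustration of that pitfall and the guards against it, not the actual Kyoto University bug; the function name and behavior are assumptions for the example.

```python
import time
from pathlib import Path

def purge_old_files(target_dir, max_age_days=10):
    """Delete files older than max_age_days under target_dir.

    Hypothetical example of a defensive cleanup routine. The guards
    below address the classic failure mode: if target_dir arrives
    empty or as a filesystem root (say, because a config variable was
    never set), an unguarded script would happily delete far more than
    intended.
    """
    if not target_dir:
        # An unset variable must abort the run, not default to "everything."
        raise ValueError("refusing to purge: target_dir is empty")
    root = Path(target_dir).resolve()
    if root == Path(root.anchor):
        raise ValueError("refusing to purge a filesystem root")
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for path in root.rglob("*"):
        # Only remove regular files whose modification time is past the cutoff.
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
            removed.append(path)
    return removed
```

Without the first guard, an empty `target_dir` would resolve to the script’s working directory (or worse, in shell equivalents, to `/`), which is exactly how a “routine” cleanup turns into a mass deletion.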

Supercomputing differs from normal computing largely in its speed and its ability to harness many processors in parallel to churn through complex mathematical calculations. Those advantages make it a valuable tool for research across a whole range of areas, including climate and atmospheric modeling, physics, vaccine science, and everything in between. Unfortunately, all of that is meaningless if your machine fails to work properly.