University performs routine backup, loses 77TB worth of research data instead

A supercomputer system at Kyoto University just lost 77 terabytes of research data due to an “accident” during a backup. Below is the statement posted by the Supercomputing Division of Kyoto University’s IT department, translated into English.

Dear Supercomputing Service Users

Today, a bug in the backup program of the storage system caused an accident in which some files in /LARGE0 were lost. We have stopped the process responsible, but we may have lost nearly 100TB of files, and we are investigating the extent of the impact.

We will contact those affected individually. We apologize for the inconvenience caused to all users.

Regarding this matter, we contacted the applicants of the affected groups by e-mail between 17:50 and 19:00 on Thursday, December 16, 2021. The extent of the impact of this file-loss incident is as follows.

- Target file system: /LARGE0
- File deletion period: 17:32, December 14, 2021 – 12:43, December 16, 2021
- Files lost: files that had not been updated since 17:32 on December 3, 2021
- Lost file capacity: approximately 77TB
- Number of lost files: approximately 34 million
- Number of affected groups: 14 (of which 4 cannot be restored from backup)

Kyoto University’s supercomputers are Hewlett Packard Enterprise (HPE) Cray systems, and their storage system is from DataDirect Networks, a California-based company specialising in supercomputing storage solutions.

Kyoto University’s supercomputer configuration. Source: Kyoto University

The university said that approximately 34 million files belonging to 14 research groups were affected. The specific files were not mentioned and the cause of the bug is still unknown, but they did confirm that four of the groups’ data cannot be restored from backup. Hopefully, this means that the other 10 groups are in the process of recovery.

Supercomputing is not cheap to run either. Compute time is reported to cost hundreds of US dollars per hour, so this incident has probably set the university’s researchers back considerably.

To get a sense of how much data 77 terabytes is, let’s put it in more familiar terms. 1TB, or one terabyte, is equal to 1,024 gigabytes, which is already many times larger than the average phone’s storage capacity. The average song takes up about 3MB, which means that 77 terabytes could hold more than 25 million songs.
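For the curious, here’s a quick back-of-the-envelope check of that figure. It’s only a rough sketch: the 3MB-per-song average and the use of binary units (1TB = 1,024GB) are assumptions, not exact values.

```python
# Rough back-of-the-envelope check: how many ~3MB songs fit in 77TB?
# Assumes binary units (1TB = 1,024GB, 1GB = 1,024MB) and an average
# song size of about 3MB, both of which are approximations.

TB_IN_MB = 1024 * 1024          # MB per terabyte (binary units)
lost_data_mb = 77 * TB_IN_MB    # 77TB expressed in MB
avg_song_mb = 3                 # assumed average song size

songs = lost_data_mb / avg_song_mb
print(f"Approximately {songs:,.0f} songs")  # roughly 26.9 million songs
```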

I think now is as good a time as any to say that backing up your data is extremely important. Whether you run a supercomputer or just a small laptop, it never hurts to keep your data safe on a separate hard drive or in the cloud (preferably both).
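If you want to put that advice into practice, below is a minimal sketch of a dated local backup in Python. The source and destination paths are hypothetical placeholders; point them at your own folders and external drive.

```python
import shutil
from datetime import datetime
from pathlib import Path

# Minimal local backup sketch: copy a folder to an external drive,
# keeping each run in its own timestamped directory so a bad run
# never overwrites an older backup.
# The paths below are placeholders -- adjust them for your own machine.

source = Path.home() / "Documents"                    # folder to protect
backup_root = Path("/Volumes/ExternalDrive/backups")  # hypothetical external drive

def backup(source: Path, backup_root: Path) -> Path:
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    destination = backup_root / f"{source.name}-{stamp}"
    shutil.copytree(source, destination)              # recursive copy
    return destination

if __name__ == "__main__":
    print(f"Backed up to {backup(source, backup_root)}")
```

A timestamped copy like this is the simplest way to keep multiple restore points; for anything larger you’d reach for a dedicated backup tool with incremental snapshots instead.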

[ SOURCE, IMAGE SOURCE ]
