University performs routine backup, loses 77TB worth of research data instead

A supercomputer system at Kyoto University just lost 77 terabytes of research data due to an “accident” during a backup. Below is the statement posted by the Supercomputing Division of Kyoto University’s IT department, translated into English.

Dear Supercomputing Service Users

Today, a bug in the backup program of the storage system caused an accident in which some files in /LARGE0 were lost. We have stopped the process, but we may have lost nearly 100TB of files, and we are investigating the extent of the impact.

We will contact those affected individually. We apologize for the inconvenience caused to all users.

Regarding this matter, we contacted the applicants of the affected groups by e-mail between 17:50 and 19:00 on Thursday, December 16, 2021. The extent of the impact of this data-loss incident turned out to be as follows.

- Target file system: /LARGE0
- File deletion period: 17:32, December 14, 2021 to 12:43, December 16, 2021
- Files lost: files that had not been updated since 17:32 on December 3, 2021
- Lost file capacity: approximately 77TB
- Number of lost files: approximately 34 million
- Number of affected groups: 14 (of which 4 cannot be restored from backup)
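The deletion criterion above — files not updated since a cutoff date — is consistent with a cleanup routine that selects files by modification time. As a purely illustrative sketch (Kyoto University’s actual backup program has not been published, and all names here are hypothetical), this is the general pattern of such a step and why a wrong root path or cutoff can silently delete the wrong data:

```python
# Illustrative only -- NOT the actual backup program from the incident.
# A cleanup step that deletes files older than a retention window.
import os
import time
import tempfile

CUTOFF_DAYS = 10  # assumed retention window, for illustration

demo = tempfile.mkdtemp()                     # stand-in for a log directory
old = os.path.join(demo, "old.log")
fresh = os.path.join(demo, "fresh.log")
for path in (old, fresh):
    open(path, "w").close()

# Backdate old.log's modification time by 20 days:
stale = time.time() - 20 * 86400
os.utime(old, (stale, stale))

# Delete regular files not modified within the retention window:
cutoff = time.time() - CUTOFF_DAYS * 86400
for name in os.listdir(demo):
    path = os.path.join(demo, name)
    if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
        os.remove(path)                       # removes only stale files

print(sorted(os.listdir(demo)))               # ['fresh.log']
```

The danger in this class of script is that if the target directory or the cutoff is wrong (for example, pointed at live research data instead of expendable logs), the same loop quietly deletes everything older than the window — which matches the shape of the damage reported above.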

Kyoto University’s supercomputers come from Hewlett Packard Enterprise (HPE) Cray, and their storage system is from DataDirect Networks, a Californian company specialising in supercomputing storage solutions.

Kyoto University’s supercomputer configuration. Source: Kyoto University

The university said the incident affected approximately 34 million files across 14 research groups. The specific files were not named and the cause of the bug still appears to be unknown, but four of the groups’ data cannot be restored from backup. Hopefully, this means the other ten groups are in the process of recovery.

Supercomputing is not cheap to run either: research time on such systems is reported to cost hundreds of US dollars per hour, so this incident has probably set the university back considerably.

To get a sense of how much data 77 terabytes is, let’s put it in more familiar terms. One terabyte (1TB) equals 1,024 gigabytes, already many times the storage capacity of the average phone. With the average song taking up about 3MB, 77 terabytes could hold more than 25 million songs.
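The song estimate above is easy to check. Using the article’s own assumptions (binary units where 1TB = 1,024GB, and a rough 3MB per song):

```python
# Back-of-the-envelope check of the article's figures.
TB_LOST = 77
MB_PER_TB = 1024 * 1024   # 1 TB = 1,024 GB = 1,048,576 MB (binary units)
SONG_MB = 3               # assumed average song size

songs = TB_LOST * MB_PER_TB // SONG_MB
print(f"{songs:,} songs")  # 26,913,450 songs
```

At roughly 26.9 million songs, the “more than 25 million” figure holds up.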

I think now is as good a time as any to say that backing up your data is extremely important. Whether you run a supercomputer or just a small laptop, it never hurts to keep your data safe on a separate hard drive or in the cloud (preferably both).
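For a small personal setup, even a simple one-way copy to a second drive is a start. A minimal sketch (using throwaway demo directories here; in practice the destination would be a separate drive or remote machine):

```python
# A minimal one-way backup sketch -- not a complete backup strategy.
# Keep versioned and offsite copies too (the 3-2-1 rule: 3 copies,
# 2 kinds of media, 1 offsite), so one bad sync can't wipe everything.
import os
import shutil
import tempfile

src = tempfile.mkdtemp()   # stand-in for the data you care about
dst = tempfile.mkdtemp()   # stand-in for the backup drive

with open(os.path.join(src, "notes.txt"), "w") as f:
    f.write("thesis draft")

# Mirror src into dst, keeping file metadata where the platform allows:
shutil.copytree(src, dst, dirs_exist_ok=True)

with open(os.path.join(dst, "notes.txt")) as f:
    print(f.read())        # prints: thesis draft
```

As the Kyoto incident shows, the backup process itself can be the thing that destroys data, which is exactly why independent, versioned copies matter.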

