University performs routine backup, loses 77TB worth of research data instead

A supercomputer system at Kyoto University just lost 77 terabytes of research data due to an "accident" during a backup routine. Below is the statement posted by the Supercomputing Division of Kyoto University's IT department, translated into English.

Dear Supercomputing Service Users

Today, a bug in the backup program of the storage system caused an accident in which some files in /LARGE0 were lost. We have stopped the faulty process, but we may have lost nearly 100TB of files, and we are investigating the extent of the impact.

We will contact those affected individually. We apologize for the inconvenience caused to all users.

Regarding this matter, we contacted the applicants of the affected groups by e-mail between 17:50 and 19:00 on Thursday, December 16, 2021. The extent of the impact of this data-loss incident turned out to be as follows.

- Target file system: /LARGE0
- File deletion period: December 14, 2021 17:32 - December 16, 2021 12:43
- Lost files: files that had not been updated since December 3, 2021 17:32
- Lost file capacity: approximately 77TB
- Number of lost files: approximately 34 million
- Number of affected groups: 14 (of which 4 groups cannot be restored from backup)

Kyoto University's supercomputers come from Hewlett Packard Enterprise (HPE Cray), and their storage system is from DataDirect Networks, a Californian company specialising in supercomputing storage solutions.

Kyoto University’s supercomputer configuration. Source: Kyoto University

The university said the incident affected approximately 34 million files across 14 research groups. The specific files were not named and the cause of the bug apparently remains unknown, but they did say that data for four of the groups cannot be restored from backup. Hopefully, this means the other 10 groups are in the process of recovery.

Supercomputing is not cheap to run either. It reportedly costs hundreds of US dollars per hour to run this kind of research, so the incident has likely set the university back considerably.

To get a sense of how much data 77 terabytes is, let's put it in more familiar terms. 1TB, or one terabyte, is equal to 1,024 gigabytes, which is already many times the storage capacity of the average phone. At roughly 3MB per song, 77 terabytes could hold more than 25 million songs.
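The back-of-the-envelope numbers above can be checked directly, assuming 1TB = 1,024GB (binary units, as counted here) and roughly 3MB per song:

```python
TB_IN_MB = 1024 * 1024       # 1TB = 1,024GB = 1,048,576MB in binary units
lost_mb = 77 * TB_IN_MB      # total megabytes lost in the incident
songs = lost_mb // 3         # at roughly 3MB per song
print(f"{songs:,} songs")    # prints "26,913,450 songs"
```

So the figure works out to just under 27 million songs, comfortably "more than 25 million".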

I think now is as good a time as any to point out that backing up your data is extremely important. Whether you run a supercomputer or just a small laptop, it never hurts to keep your data safe on a separate hard drive or in the cloud (preferably both).

[ SOURCE, IMAGE SOURCE ]
