- cross-posted to:
- selfhosted
- technology
- Google Cloud accidentally deleted UniSuper’s account and backups, causing a major data loss and downtime for the company.
- UniSuper was able to recover data from backups with a different provider after the incident.
- The incident highlighted the importance of having safeguards in place for cloud service providers to prevent such catastrophic events from occurring.
As the saying goes: if you only have one backup you have zero backups.
How the fuck does Google of all companies manage to accidentally delete that‽
If this is the thing I heard of a few days ago, then Google had multiple backups on different sites, but they managed to delete all of them.
I guess they weren’t paying quite enough to have offline backups? I believe financial institutions can keep stuff stored in caves (think records of all the mortgages a bank wants to be repaid for - data loss isn’t an option).
From the sounds of it, they did, since they were able to recover the data from elsewhere.
They just lost the data they kept and stored with Google.
I’m betting job cuts and someone was in a hurry
Backups all tied to the same Google account that got mistakenly terminated, and automation did the rest?
It didn’t matter that they might have had backups on different services; since it was all centralised through Google, it was all blown away simultaneously.
It’s weird that backups got deleted immediately. I would imagine they get marked for deletion but really deleted something like a month later to prevent this kind of issue.
That’s when accounts are closed or payments are missed. I think in this case they just deleted the sub itself, which bypassed all of that for instant deletion.
I don’t see why it matters that it was a subscription. Anything which deletes data should be a soft delete.
Sometimes it has to be a hard delete to comply with a user’s request to remove data.
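Roughly the pattern being described, as a minimal sketch (the class, method names, and 30-day window are all made up for illustration): a delete request only flags the data, and a separate cleanup job purges it once the retention window has passed.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention window before a "deleted" backup is actually purged.
RETENTION = timedelta(days=30)

class Backup:
    def __init__(self, name: str):
        self.name = name
        self.deleted_at = None  # None means the backup is still live

    def soft_delete(self) -> None:
        """A delete request only flags the backup; nothing is removed yet."""
        self.deleted_at = datetime.now(timezone.utc)

    def purge_if_expired(self) -> bool:
        """A separate cleanup job hard-deletes only after the retention window."""
        if self.deleted_at is None:
            return False
        if datetime.now(timezone.utc) - self.deleted_at >= RETENTION:
            # ...actually remove the underlying data here...
            return True
        return False
```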
My first job was in a Big Iron shop in the late 80s, where I was in charge of backups. We kept three sets of backups on two different media: one on hand, one in a different location in the main building in a water- and fireproof safe, and one offsite. We had a major failure one day and had to do a restore.
Both in-house copies failed to restore. Thankfully the offsite copy worked. We were in a panic. That taught me to keep all my important data on three sets. As the old saying goes: data loss is not an if question, but a when question. Also, remember that “the cloud” simply means someone else’s remote servers over which you have no control.
And had you ever tested the restore process?
In a big iron shop? Everything gets tested, dry runs, etc., but shit happens, hence backups.
Everything is tied to the subscriptions, they deleted the sub and that automatically deleted all backups.
That sounds like a pretty trashy backup scheme. I don’t care what your subscription status is, I’m keeping those backups until retention’s over.
Very stupid.
AWS has a holding period after account deletion where nothing is actually deleted, just inaccessible and access can be regained without data loss.
Since first hearing about this I’m wondering how TF Google Cloud doesn’t have a similar SOP.
Second week with this story.
Yeah it’s getting old.
reading about google fucking up and bringing other corporations down with it never gets old.
Sundar the creep will keep your nudes around for the rest of eternity but can’t provide a proper enterprise product…
Cheers
Reading about Google fucking up got old ten years ago.
I was gonna ask if this happened again lol
“an unprecedented sequence of events”
Yeah? It was, what’s your point?
It sounds similar to “an unscheduled pen test” and other corporate speak.
These situations are almost always self-inflicted. If someone else had hacked Google Cloud this badly, you’d likely have heard it from them first. And they probably would have done something significantly more destructive if their goal was harming Google’s reputation.
I just found the phrasing to be kind of funny
“rapid unscheduled disassembly” 🙄
Boeing is opening the doors to the ~~future~~ plane mid-flight
Sounds really dramatic in a news item though. Click bait. :)
But yeah, I recently moved away from these cloud services and have a NAS at home now. Only encrypted backups in the cloud. Because fuck Google.
“Unprecedented” is kinda hot right now. Tries to mitigate too much blame being heaped on: “obviously we prepare for the usual and even the unexpected, but this has literally never happened before (give us another shot pls)”.
So it’s interesting for the news that it takes on a different context when said breathlessly: “UNPRECEDENTED failure!”
They’re talking like it’s some global celestial event
Unprecedented only means there’s no precedent. This just hasn’t happened before at this scale.
I was only commenting on the phrasing. ‘unprecedented cloud event’ sounds like some global scale meteorological event.
The headline does say “Customer account” as in singular.
Follow the 3-2-1 rule for your important data, ideally 4-3-2 or better. Remember, if you only have one copy of your data, you actually have zero copies of your data.
what are these rules? i genuinely am not aware of them.
3 separate backups on 2 different media (e.g. 2 backups on 2 separate HDDs plus one on DVDs), with at least 1 offsite (e.g. a satellite office, or your parents’ house for personal stuff)
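For the personal-stuff case, a rough sketch of what that can look like in practice (all paths and the offsite host are made up; assumes rsync and SSH access to the offsite machine):

```python
import shutil
import subprocess
from pathlib import Path

SOURCE = Path("/data/important")            # what you want to protect
LOCAL_COPIES = [                            # two copies on different physical media
    Path("/mnt/hdd_a/backups/important"),
    Path("/mnt/hdd_b/backups/important"),
]
OFFSITE = "backup@parents-house.example:/backups/important"  # one copy offsite

def run_backup() -> None:
    for dest in LOCAL_COPIES:
        dest.parent.mkdir(parents=True, exist_ok=True)
        # dirs_exist_ok lets repeated runs refresh an existing copy
        shutil.copytree(SOURCE, dest, dirs_exist_ok=True)
    # Push the third copy offsite (rsync over SSH is one common choice)
    subprocess.run(["rsync", "-a", "--delete", f"{SOURCE}/", OFFSITE], check=True)

if __name__ == "__main__":
    run_backup()
```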
cheers mate.
If you didn’t put Google’s name in there I would’ve assumed a different company facepalming. Hint: it’s the one whose name sounds like ‘unsure’.
Always follow the 3-2-1 rule, Google. Always!
At the end of the day… Cloud storage is just using someone else’s computer.
Imagine if YouTube lost all its videos