Dylan16807 3 days ago

That's only true if your typical loss event loses one record. If you have a one in a million chance of an array failure taking out 10% of your production database, and otherwise have zero possibility of data loss, you also get an expected 10^-7 losses per record per year.
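
Concretely, the arithmetic behind that hypothetical (numbers made up, obviously):

    # Made-up numbers from the hypothetical above.
    p_array_failure = 1e-6   # one-in-a-million annual array failure
    fraction_lost = 0.10     # failure takes out 10% of the database

    # Expected annual losses per record: 1e-6 * 0.1 = 1e-7.
    per_record_rate = p_array_failure * fraction_lost
    print(per_record_rate)   # ~1e-07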

And I wouldn't assume they meant that number to be per record in the first place.

asdfasgasdgasdg 3 days ago

I don't think anyone in history has ever achieved a true 10^-7 annual probability of any data loss incident. So they must have been making some kind of per-record or per-operation claim.
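
To make the scale argument concrete, a quick sketch assuming the per-record reading and an invented fleet of a billion records:

    per_record_afr = 1e-7   # the claim, read as per record per year
    num_records = 10**9     # invented fleet size

    # Expected records lost per year at that scale: about 100.
    expected_losses = per_record_afr * num_records

    # Chance that at least one record is lost somewhere during the year:
    # essentially 1, nowhere near 10^-7.
    p_any_loss = 1 - (1 - per_record_afr) ** num_records
    print(expected_losses, p_any_loss)   # ~100, essentially 1.0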

klodolph 3 days ago

I like to think that the true AFR for data is bounded below by something like 10^-3, because maybe that’s close to the rate at which civilizations collapse. You have to use a kind of subtle definition to support 10^-7 or 10^-9 or 10^-11. Or maybe instead of “subtle definition”, you can call it a “whimsical, imaginary definition”. Depends on how cynical you are.

The way I would go is by saying that you multiply the number of objects by the AFR, and that’s close to the actual losses in most years. You can then exclude WW3 and the late Holocene extinction event from your consideration. Or simple bankruptcy, for that matter. If your employer is gone, you don’t care about its data any more.
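
Rough sketch of that bookkeeping, with every number invented for illustration:

    num_objects = 10**12   # invented object count
    mundane_afr = 1e-11    # per-object rate from ordinary failures
    tail_rate = 1e-3       # assumed annual rate of WW3-scale events,
                           # which lose everything (fraction = 1.0)

    # What the multiplication predicts, and what you see in most years:
    typical_losses = num_objects * mundane_afr   # ~10 objects/year

    # The "true" per-object AFR, tail events included, is floored near 1e-3.
    true_afr = mundane_afr + tail_rate * 1.0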