▲ benoau 3 days ago
I don't understand the data on ArchiveTeam's page, but it seems like they have 35 terabytes of data (286.56 TiB)? It's a lot larger than I'd have thought.
▲ wtallis 3 days ago | parent
FYI, "TiB" means tebibytes, i.e. a base-1024 unit — the kind you'd typically use for measuring memory rather than the base-1000 units drive vendors use. The factor of 8 you divided by only applies to units based on bits rather than bytes; those units use "b" rather than "B", and are only used for capacity measurements when talking about individual memory dies (though they're normal for talking about interconnect speeds). So 286.56 TiB is about 315 TB — larger, not smaller. Either way, we're talking about a dataset that fits easily in a 1U server with at most half of its SSD slots filled.
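A quick sketch of the conversion being described, using the 286.56 TiB figure from the parent comment:

```python
TIB = 1024 ** 4   # tebibyte: base-1024 byte unit (memory-style)
TB = 1000 ** 4    # terabyte: base-1000 byte unit (drive-vendor style)

size_tib = 286.56               # figure quoted from ArchiveTeam's page
size_bytes = size_tib * TIB
size_tb = size_bytes / TB       # ~315 TB: converting TiB -> TB makes
                                # the number bigger, not smaller
print(round(size_tb, 2))

# Dividing by 8 would only be correct for a bit unit ("Tib"/"Tb");
# "TiB" is already a byte unit, so no factor of 8 applies here.
```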