nullbyte808 a day ago

Man, I need to get around to downloading the Anna's Archive torrents before the site is taken down. If I eliminate large PDFs and non-English books, I think I can fit it on two 32 TB drives with Btrfs zstd compression at its maximum setting. https://annas-archive.org/torrents
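A minimal sketch of that filter, assuming a metadata dump with per-book records. The field names (`language`, `extension`, `filesize`) and the 100 MB cutoff are assumptions for illustration, not the actual Anna's Archive schema:

```python
# Hypothetical record fields: "language", "extension", "filesize" (bytes).
MAX_PDF_BYTES = 100 * 1024 * 1024  # assumed cutoff for "large PDFs"

def keep(record):
    """Keep English books, dropping oversized PDFs."""
    if record.get("language") != "en":
        return False
    if record.get("extension") == "pdf" and record.get("filesize", 0) > MAX_PDF_BYTES:
        return False
    return True

def estimated_size(records):
    """Total bytes of everything that passes the filter."""
    return sum(r["filesize"] for r in records if keep(r))
```

Running `estimated_size` over the metadata before downloading anything would tell you whether the filtered set actually fits in 64 TB (pre-compression).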

mmooss a day ago | parent | next [-]

> eliminate large PDFs

How large? Isn't that going to result in an arbitrary filter of books? In other domains, large PDFs are due to PDF production errors, such as unnecessary color or needlessly high resolution, rather than the volume of content, at least for text.

Llamamoe a day ago | parent | prev | next [-]

Depending on how important it is to you to preserve original quality, I have in the past had good luck with a combination of prerendering complex content, reducing the DPI and colour depth of images, and recombining the pages back into PDFs.
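One common way to do this kind of downsampling in bulk is Ghostscript's `pdfwrite` device with a quality preset; a sketch that just builds the command (assuming `gs` is installed, with `/ebook` targeting roughly 150 DPI images):

```python
import subprocess  # for actually running the command, see the note below

def gs_shrink_cmd(src, dst, preset="/ebook"):
    """Build a Ghostscript command that rewrites src into a smaller dst."""
    return [
        "gs", "-sDEVICE=pdfwrite",
        "-dCompatibilityLevel=1.4",
        f"-dPDFSETTINGS={preset}",   # /screen, /ebook, /printer, /prepress
        "-dNOPAUSE", "-dBATCH", "-dQUIET",
        f"-sOutputFile={dst}", src,
    ]

# To run it for real:
# subprocess.run(gs_shrink_cmd("in.pdf", "out.pdf"), check=True)
```

Whether `/screen` or `/ebook` is acceptable depends entirely on the book; scanned pages degrade much faster than born-digital text.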

You could probably automate identifying different editions of the same work, and e.g. keep only one epub with small images rather than six other epubs and three more PDFs as well.
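A crude sketch of that dedup step, assuming you group on a normalized (title, author) pair and prefer the smallest epub; real matching would need to be fuzzier than this:

```python
import re
from collections import defaultdict

FORMAT_RANK = {"epub": 0, "mobi": 1, "pdf": 2}  # prefer epub over pdf

def norm(s):
    """Lowercase and strip punctuation so near-identical titles collide."""
    return re.sub(r"[^a-z0-9]+", " ", s.lower()).strip()

def dedupe(records):
    """Keep one record per (title, author): best format, then smallest file."""
    groups = defaultdict(list)
    for r in records:
        groups[(norm(r["title"]), norm(r["author"]))].append(r)
    return [
        min(g, key=lambda r: (FORMAT_RANK.get(r["extension"], 9), r["filesize"]))
        for g in groups.values()
    ]
```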

cookiengineer a day ago | parent | prev | next [-]

Let me know how those efforts go; I want an English/German/French backup of the archive too. But as you said, HDDs and filesystems are the real problem.

Maybe I'll have to build a torrent splitter or something, because the UIs of all torrent clients just aren't built for that.

h4ck_th3_pl4n3t 19 hours ago | parent [-]

Sneed

brador a day ago | parent | prev [-]

Invert the list, start with the smallest, continue until full.
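That suggestion is a greedy smallest-first pack, which maximizes the number of books kept for a fixed drive budget; a minimal sketch:

```python
def pack_smallest_first(sizes, capacity):
    """Return indices of files kept: sort ascending by size, take until full."""
    kept, used = [], 0
    for i, size in sorted(enumerate(sizes), key=lambda p: p[1]):
        if used + size <= capacity:
            kept.append(i)
            used += size
    return kept
```

The trade-off is the one mmooss raised above: you keep the most titles, but systematically drop whatever happens to be large, regardless of why.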