nativeit 10 hours ago

I assume it’s simply the lack of the inbuilt “universal client” that http enjoys, or that devs tend to have with ssh/scp. Not that such a client (even an automated/scripted CLI client) would be so difficult to set up, but then trackers are also necessary, and then the tooling for maintaining it all. Intuitively, none of this sounds impossible, or even necessarily that difficult apart from a few tricky spots.
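To put the “automated/scripted CLI client” part in concrete terms, here is a rough sketch using the python-libtorrent bindings (assuming libtorrent-rasterbar is installed; the magnet link below is just a placeholder):

    import time
    import libtorrent as lt  # python bindings for libtorrent-rasterbar

    # Placeholder magnet link; a plain .torrent file works the same way.
    MAGNET = "magnet:?xt=urn:btih:0123456789abcdef0123456789abcdef01234567"

    ses = lt.session()                    # DHT and peer exchange are on by default
    params = lt.parse_magnet_uri(MAGNET)  # returns add_torrent_params
    params.save_path = "."                # download into the current directory
    handle = ses.add_torrent(params)

    while not handle.status().is_seeding:
        s = handle.status()
        print("%.1f%% (down %.0f kB/s, %d peers)"
              % (s.progress * 100, s.download_rate / 1000, s.num_peers))
        time.sleep(5)

    print("done:", handle.status().name)

Wrapping something like that in a small CLI that keeps seeding for a while afterwards is the straightforward part; the tracking, tooling, and integration work is where the effort goes.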

I think it’s more a matter of how large the demand is for frequent downloads of very large files/sets, which leads to questions of reliability and seeding volume, all versus the effort involved in developing the tooling and integrating it with various RCSs and file-syncing services.

Would something like Git LFS help here? I’m at the limit of my understanding for this.

nativeit 10 hours ago | parent | next [-]

I certainly take advantage of BitTorrent mirrors for downloading Debian ISOs, as they are generally MUCH faster.

nopurpose 9 hours ago | parent | next [-]

All Linux ISO collectors in the world wholeheartedly agree.

Sesse__ 7 hours ago | parent | prev [-]

Are you serious? Most Debian ISO mirrors I've used have 10gig connectivity and usually push a gigabit or two fairly easily. BitTorrent is generally a lot slower than that (it's a pretty terrible protocol for connecting you to actually fast peers and getting stuff quickly from them).

queenkjuul 2 hours ago | parent [-]

I've definitely seen higher speeds with BitTorrent, pretty easily maxing out my GbE NICs, but I don't download Debian images specifically with much frequency.

mschuster91 10 hours ago | parent | prev [-]

Trackers haven't been necessary for well over a decade now thanks to DHT.
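For instance, a bare magnet link needs nothing beyond the info-hash (no tr= tracker parameter at all); peers are then found over the DHT. The values below are placeholders:

    magnet:?xt=urn:btih:<40-hex-info-hash>&dn=<optional-display-name>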