nativeit 10 hours ago
I assume it’s simply the lack of the built-in “universal client” that HTTP enjoys, or that devs tend to have with ssh/scp. Not that such a client (even an automated/scripted CLI client) would be so difficult to set up, but then trackers are also necessary, and then the tooling for maintaining it all. Intuitively, none of this sounds impossible, or even that difficult apart from a few tricky spots. I think it’s more a matter of how much demand there is for frequent downloads of very large files/sets, which leads to questions of reliability and seeding volume, all versus the effort involved to develop the tooling and integrate it with various RCS and file-syncing services. Would something like Git LFS help here? I’m at the limit of my understanding.
nativeit 10 hours ago
I certainly take advantage of BitTorrent mirrors for downloading Debian ISOs, as they are generally MUCH faster.
mschuster91 10 hours ago
Trackers haven't been necessary for well over a decade now thanks to DHT.