moebrowne 5 hours ago

This kind of thing can be mitigated by not publishing a page/download for every single branch, commit and diff in a repo.

Make only the HEAD of each branch available. Anyone who wants more detail has to clone it and view it with their favourite git client.

For example https://mitxela.com/projects/web-git-sum (https://git.mitxela.com/)
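The static-summary idea can be sketched roughly like this (a minimal illustration in Python, not how web-git-sum is actually implemented; the function names and page layout are made up). Each repo gets a single generated page listing only branch heads, so there are no per-commit or per-diff URLs for a scraper to enumerate:

```python
# Sketch: publish one static page per repo, listing only branch heads.
# Illustrative only; regenerate it from a post-receive hook or a cron job.
import subprocess

def branch_heads(repo_path):
    """Return (branch, short-hash, subject) for each branch head in the repo."""
    fmt = "%(refname:short)\t%(objectname:short)\t%(subject)"
    out = subprocess.run(
        ["git", "for-each-ref", "--format", fmt, "refs/heads/"],
        cwd=repo_path, capture_output=True, text=True, check=True,
    ).stdout
    return [tuple(line.split("\t", 2)) for line in out.splitlines()]

def render_summary(repo_name, heads):
    """Build the one static page that gets published; nothing else is served."""
    rows = "".join(f"<li>{b} {h} {s}</li>" for b, h, s in heads)
    return f"<h1>{repo_name}</h1><ul>{rows}</ul><p>clone for full history</p>"
```

Anyone who wants more than that list has to `git clone` and browse locally, which is exactly the point.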

PaulDavisThe1st an hour ago | parent | next [-]

Alternatively, from the nginx config file for git.ardour.org:

    location ~ commit/* {
        return 404;
    }
petre 6 minutes ago | parent [-]

Why not just redirect them to a honeypot that prints a character every second and wastes their time?
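Such a tarpit is easy to sketch (a minimal illustration, assuming the blocked locations are proxied to a tiny WSGI app instead of 404ing; the port and names are made up):

```python
# Tarpit sketch: trickle one byte per second to waste a scraper's connection.
import time
from wsgiref.simple_server import make_server

def drip(chunk=b".", delay=1.0, limit=600):
    """Yield one byte at a time, sleeping between bytes."""
    for _ in range(limit):
        yield chunk
        time.sleep(delay)

def tarpit_app(environ, start_response):
    # WSGI lets us return a generator, so the body streams out slowly.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return drip()

def main():
    # In nginx, proxy the blocked paths here instead of returning 404, e.g.
    #   location ~ commit/* { proxy_pass http://127.0.0.1:8099; }
    make_server("127.0.0.1", 8099, tarpit_app).serve_forever()
```

A crawler with a generous read timeout will sit on that connection for ten minutes per URL.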

Imustaskforhelp 5 hours ago | parent | prev [-]

This and another comment gave me an interesting idea: what if we combined this with SSH-based git clients/websites alongside the normal web UI?

Maybe something like https://ssheasy.com/ or similar could be used, or even a gotty/xterm instance that automatically opens an SSH session and presents a TUI-like interface.

I feel as if that would be enough to stop most scrapers?