TZubiri 4 hours ago

>and at the federal level, the legal liability is on the person who posts it, not who hosts the tool. this was a mistake that will likely be corrected over the next six years

[citation needed]

Historically, hosts have always absolutely been responsible for the materials they host; see DMCA law, CSAM case law...

parl_match 4 hours ago | parent

no offense, but you completely misinterpreted what i wrote. i didn't say who hosts the materials, i said who hosts the tool. i didn't mention anything about the platform, which is a very relevant but separate party.

if you think i said otherwise, please quote me, thank you.

> Historically hosts have always absolutely been responsible for the materials they host,

[citation needed] :) go read up on section 230.

for example, with the dmca, liability arises only if the host acts in bad faith, generates the infringing content itself, or fails to act on a takedown notice

that is quite some distance from "always absolutely". in fact, that safe harbor is the whole point of 230

bluGill an hour ago | parent

pedantically correct, but there is a good argument that if you host an AI tool that can easily be made to produce child porn, that protection no longer applies. a couple of years ago, when AI was new, you could argue that you never expected anyone to use your tool to create child porn. today, however, it is clear that some people are doing exactly that, and you need to prevent it.

Note that I'm not asking for perfection. However, if someone does manage to create child porn (or any of a number of currently unspecified things - the list is likely to grow over the next few years), you need to show that you had substantial protections in place and that the person did something difficult to bypass them.