| ▲ | pdpi 9 hours ago |
| How does that mesh with all the safe harbour provisions we've depended on to make the modern internet, though? |
|
| ▲ | mikeyouse 7 hours ago | parent | next [-] |
| The safe harbor provisions largely protect X from the content that users post (within reason). But here, Grok/X were actually producing the objectionable content: users were making gross requests, and then an LLM owned by X, running on X servers with X code, would generate the illegal material and post it to the website. The entity responsible is no longer the user but the company itself. |
| |
| ▲ | Altern4tiveAcc 2 hours ago | parent | next [-] | | So, if someone hosts an image editor as a web app, are they liable if someone uses that editor to create CP? I honestly don't follow it. People creating nudes of others and using the Internet to distribute them can be sued for defamation, sure. I don't think the people hosting the service should be liable themselves, just like people hosting Tor nodes shouldn't be liable for what users of the Tor network do. | |
| ▲ | luke5441 6 hours ago | parent | prev [-] | | Yes, and that was a very stupid product decision. They could have put the image generation into the post editor, shifting responsibility to the users. I'd guess Elon is responsible for that product decision. |
|
|
| ▲ | pjc50 4 hours ago | parent | prev | next [-] |
Note that this is a US law, not a French one. Also, safe harbor doesn't apply because this is published under the @grok handle! It's being published by X under one of their brand names; it's absurd to argue that they're unaware of, or not consenting to, its publication.
|
| ▲ | numpad0 4 hours ago | parent | prev | next [-] |
It's not like the world benefited from safe harbor laws that much. Why not just amend them so that server-side algorithms and platforms that recommend things are not eligible?
| |
| ▲ | direwolf20 4 hours ago | parent [-] | | If you are thinking about Section 230, it only applies to user-generated content, so not server-side AI or timeline algorithms. | | |
| ▲ | Altern4tiveAcc 2 hours ago | parent [-] | | So if a social network tool does the exact same thing, but uses the user's own GPU or NPU to generate the content instead, suddenly it's fine? | | |
| ▲ | direwolf20 2 hours ago | parent [-] | | If a user generates child porn on their own and uploads it to a social network, the social network is shielded from liability until they refuse to delete it. |
|
|
|
|
| ▲ | _trampeltier 7 hours ago | parent | prev | next [-] |
Before, a USER created the content, so the user was / is liable. Now an LLM owned by a company creates the content, so the company is liable.
| |
| ▲ | hbs18 6 hours ago | parent [-] | | I'm not trying to make excuses for Grok, but how exactly isn't the user creating the content? Grok doesn't create images of its own volition; the user is still required to give it some input, thereby "creating" the content. | | |
| ▲ | luke5441 6 hours ago | parent | next [-] | | X is making it pretty clear that it is "Grok" posting those images and not the user. It is a separate posting that comes from an official account named "Grok". X has full control over what the official "Grok" account posts. There is no functionality for the users to review and approve "Grok" responses to their tweets. | |
| ▲ | mbesto 2 hours ago | parent | prev | next [-] | | Does an autonomous car drive the car from point A to point B or does the person who puts in the destination address drive the car? | |
| ▲ | _trampeltier 6 hours ago | parent | prev [-] | | Until now, webservers have just been like a postal service. Grok is more like a CNC lathe. |
|
|
|
| ▲ | jazzyjackson 7 hours ago | parent | prev [-] |
This might be an unpopular opinion, but I always thought we might be better off without Web 2.0, where site owners aren't held responsible for user content. If you're hosting content, why shouldn't you be responsible? Because your business model is impossible if you're held to account for what's happening on your premises? Without safe harbor, people might have to jump through the hoops of buying their own domain name and hosting content themselves. Would that be so bad?
| |
| ▲ | direwolf20 4 hours ago | parent | next [-] | | Any app allowing any communication between two users would be illegal. | | |
| ▲ | expedition32 3 hours ago | parent [-] | | https://en.wikipedia.org/wiki/EncroChat You have to understand that Europe doesn't give a shit about techbro libertarians and their desire for a new Lamborghini. | | |
| ▲ | direwolf20 3 hours ago | parent [-] | | EncroChat was illegal because it was targeted at drug dealers, advertised for use in drug dealing. And they got evidence by texting "My associate got busted dealing drugs. Can you wipe his device?" and the device was wiped. There's an actual-knowledge component which is very important here. |
|
| |
| ▲ | pdpi 5 hours ago | parent | prev | next [-] | | What about webmail, IM, or any other sort of web-hosted communication? Do you honestly think it would be better if Google were responsible for whatever content gets sent to a gmail address? | | |
| ▲ | jazzyjackson 4 hours ago | parent [-] | | Messages are a little different than hosting public content but sure, a service provider should know its customers and stop doing business with any child sex traffickers planning parties over email. I would prefer 10,000 service providers to one big one that gets to read all the plaintext communication of the entire planet. | | |
| ▲ | pdpi 3 hours ago | parent | next [-] | | In a world where hosting services are responsible that way, their filtering would need to be even more sensitive than it is today, and plenty of places already produce unreasonable amounts of false positives. As it stands, I have a bunch of photos on my phone that would almost certainly get flagged by over-eager/overly sensitive child porn detection — close friends and family sending me photos of their kids at the beach. I've helped bathe and dress some of those kids. There's nothing nefarious about any of it, but it's close enough that services wouldn't take the risk, and that would be a loss to us all. | |
| ▲ | direwolf20 4 hours ago | parent | prev [-] | | They'd all have to read your emails to ensure you don't plan child sex parties. Whenever a keyword match comes up, your account will immediately be deleted. |
|
| |
| ▲ | terminalshort 4 hours ago | parent | prev [-] | | You know this site would not be possible without those protections, right? |
|