conartist6 6 hours ago

God forbid we protect people from the theft machine

__MatrixMan__ 6 hours ago | parent | next [-]

There are a lot of problems with AI that need some carefully thought-out regulation, but infringing on rights granted by IP law still isn't theft.

faidit 6 hours ago | parent | next [-]

It's theft. But not all IP theft, or theft in general, is morally equivalent. A poor person stealing a loaf of bread or pirating a movie they couldn't afford is just. A corrupt elite stealing poor farmers' food or stealing content from small struggling creators is not.

Workaccount2 44 minutes ago | parent | next [-]

>pirating a movie they couldn't afford is just

I wish this argument would die. It's so comically false, and it's just used to let people paper over their cognitive dissonance by invoking the real misfortunes of a small minority.

I am a millennial and rode the wave of piracy as much as the next 2006 computer nerd. It was never, ever, about not being able to afford these things, and always about how much you could get for free. For every one person who genuinely couldn't afford a movie, there were at least 1000 who just wanted it free.

__MatrixMan__ 24 minutes ago | parent [-]

Speak for yourself. For many more it's about being unwilling to support the development of tech that strips users of their ability to control the devices that they ostensibly own.

I happily pay for my media when there's a way to do so, without simultaneously supporting the emplacement of telescreens everywhere you look.

__MatrixMan__ 2 hours ago | parent | prev | next [-]

When you steal a loaf of bread, somebody's loaf of bread is missing. That's worlds apart from making an unauthorized copy of something.

jasonsb 6 hours ago | parent | prev [-]

Ask yourself: who owns the IP you're defending? It's not struggling artists, it's corporations and billionaires.

Stricter IP laws won't slow down closed-source models with armies of lawyers. They'll just kill open-source alternatives.

Cthulhu_ 5 hours ago | parent | next [-]

Under copyright law, unless HN's T&Cs override it, anything I write and have written on HN is my IP. And the AI data hoarders used it to train their stuff.

jasonsb 5 hours ago | parent | next [-]

Let's meet in the middle: only allow AI data hoarders to train their stuff on your content if the model is open source. I can stand behind that.

philipwhiuk 2 hours ago | parent [-]

Uh no.

a) The model and the data both, not just the model

b) Why are we meeting in the middle?

SpicyLemonZest an hour ago | parent | prev [-]

Calling a HN comment “intellectual property” is like calling a table saw in your garage “capital”. There are specific regulatory contexts where it might be somewhat accurate, but it’s so different from the normal case that none of our normal intuitions about it apply.

For example, copyright makes it illegal to take an entire book and republish it with minor tweaks. But for something short like an HN comment this doesn’t apply; copyright always permits you to copy someone’s ideas, even when that requires using many of the same words.

Workaccount2 38 minutes ago | parent [-]

People seem, either intentionally or unintentionally (largely from being taught by the intentional ones), not to know what training an AI involves.

I think most people think that AI training means copying vast troves of data onto ChatGPT hard drives for the model to actively reference.
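
Roughly speaking, and as a deliberately toy sketch rather than anything like a real lab's gradient-descent pipeline, training distills a corpus into model parameters, and generation then consults those parameters rather than looking up stored copies of the documents. Something like this (pure Python, invented example data):

    # Toy illustration only: "training" here means turning a corpus into a
    # small table of word-transition statistics. Generation consults those
    # learned parameters, not stored copies of the text.
    import random
    from collections import Counter, defaultdict

    def train(corpus: list[str]) -> dict[str, Counter]:
        """Count word -> next-word transitions (this toy model's 'weights')."""
        params: dict[str, Counter] = defaultdict(Counter)
        for doc in corpus:
            words = doc.split()
            for a, b in zip(words, words[1:]):
                params[a][b] += 1
        return params  # a handful of counts, not the documents themselves

    def generate(params: dict[str, Counter], start: str, length: int = 8) -> str:
        """Sample from the learned statistics; no document lookup happens here."""
        out = [start]
        for _ in range(length):
            nxt = params.get(out[-1])
            if not nxt:
                break
            words, counts = zip(*nxt.items())
            out.append(random.choices(words, weights=counts)[0])
        return " ".join(out)

    if __name__ == "__main__":
        corpus = [
            "the cat sat on the mat",
            "the dog sat on the rug",
            "the cat chased the dog",
        ]
        params = train(corpus)
        del corpus  # the raw text is gone; only the parameters remain
        print(generate(params, "the"))

A real model replaces the counting step with gradient updates to billions of floating-point weights, but the shape is the same: at inference time the model reads its weights, not a database of the training documents (memorization of frequently repeated text is a separate, real concern).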

fzeroracer 5 hours ago | parent | prev | next [-]

How do you expect open source alternatives to exist when they cannot enforce how you use their IP? Open source licenses exist and are enforced under IP law. This is part of the reason AI companies have been pushing so hard for IP reform: they want to decimate IP law for thee but not for me.

faidit 5 hours ago | parent | prev [-]

I never advocated "stricter IP laws". I would however point out the contradiction between current IP laws being enforced against kids using BitTorrent while unenforced against billionaires and their AI ventures, despite them committing IP theft on a far grander scale.

terminalshort 2 hours ago | parent | prev | next [-]

And it doesn't even infringe on IP rights.

jasonsb 6 hours ago | parent | prev [-]

Agreed. Regulate AI? Sure, though I have zero faith politicians will do it competently. But more IP protection? Hard pass. I'd rather abolish patents.

TheAceOfHearts 6 hours ago | parent [-]

I think one of the key issues is that most of these discussions happen at too high a level of abstraction. Could you give some specific examples of AI regulations that you think would be good? If we start elevating and refining key talking points that define the direction in which we want things to go, they'll actually have a chance to spread.

Speaking of IP, I'd like to see some major copyright reform. Maybe bring down the duration to the original 14 years, and expand fair use. When copyright lasts so long, one of the key components for cultural evolution and iteration is severely hampered and slowed down. The rate at which culture evolves is going to continue accelerating, and we need our laws to catch up and adapt.

jasonsb 5 hours ago | parent | next [-]

> Could you give some specific examples of AI regulations that you think would be good?

Sure, I can give you some examples:

- deceiving someone into thinking they're talking to a human should be a felony (prison time, no exceptions for corporations)

- ban government/law-enforcement use of AI for surveillance, predictive policing or automated sentencing

- no closed-source AI allowed in any public institution (schools, hospitals, courts...)

- no selling or renting paid AI products to anyone under 16 (free tools only)

__MatrixMan__ 2 hours ago | parent | next [-]

I like where you're going. How about we just ban closed source software of any kind from public institutions?

j16sdiz 4 hours ago | parent | prev [-]

> - deceiving someone into thinking they're talking to a human

This is gonna be about as enforceable as the CAN-SPAM Act (i.e. you'll get a few big cases, but it's nothing compared to the overall situation).

How do you prove it in court? Do we need to record all private conversations?

bcrosby95 an hour ago | parent [-]

If you think spam is bad now imagine if trillion dollar corporations could do it. Just because something isn't perfect doesn't mean it doesn't help.

patrick451 3 hours ago | parent | prev [-]

> Could you give some specific examples of AI regulations that you think would be good?

AI companies need to be held liable for the outputs of their models. Giving bad medical advice, producing buggy code, etc. should be something they can be sued for.
