nickpsecurity 5 hours ago

Some projects refuse for copyright reasons. Back when GPT4 was new, I dug into pretraining reports for nearly all models.

Every one (IIRC) was infringing copyright by redistributing third-party works in data sets without permission. Some were trained on patent filings, which makes patent infringement highly likely. Many were breaking EULAs (contract law) by scraping sites that prohibit it. Some outputs were verbatim reproductions of copyrighted works, too, which could get someone sued if they published them.

So, I warned people to stay away from AI until (a) training on copyrighted/patented works was legal in all those circumstances, (b) the outputs carried no liability, and (c) users of a model could verify this by inspecting the pretraining data. There are no GPT3- or Claude-level models produced that way.

On a personal level, I follow Jesus Christ, who paid for my sins with His life. We're to be obedient to God's law. One command is to submit to authority (i.e., don't break man's law). I don't know that I can use AI outputs if they were illegally trained; it would be like fencing stolen goods. That's another reason I want the pretraining to be legal, either by mandate or by using only permissible works.

Note: If your country is in the Berne Convention, it might apply to you, too.

hirako2000 4 hours ago | parent

Not sure we need to invoke Jesus to agree with the liability concerns.

nickpsecurity 36 minutes ago | parent

People's worldviews determine their morality. People often share them to motivate others to act morally. Laws like copyright are treated far less consistently across moral systems than physical theft is, so people might be confused about how to respond.

Our country's morality declined as it followed self-, money-, and pleasure-centered worldviews. Following Christ can reverse all of that. AI is currently steeped in destructive worldviews, especially among the top suppliers, but some readers might be open to other views or already hold them. If they adopt them, more good will happen.