| ▲ | vkou 20 hours ago |
| > I'm ok with the model knowing what Indiana Jones or the Predator looks like with well remembered details, ClosedAI doesn't seem to be OK with it, because they are explicitly censoring characters of more popular IPs. Presumably as a fig leaf against accusations of theft. |
|
| ▲ | red75prime 15 hours ago | parent | next [-] |
| If you define feeding copyrighted material into a non-human learning machine as theft, then sure: anything that mitigates the legal consequences will be a fig leaf. The question is "should we define it as such?"
| |
| ▲ | reginald78 10 hours ago | parent | next [-]
| The fact that they have guardrails to try to prevent it means OpenAI itself thinks it is at least shady or outright illegal in some way. Otherwise, why bother?
| ▲ | vkou 14 hours ago | parent | prev [-]
| If a graphic design company were using human artists to do the same thing OpenAI is doing, it would be sued out of existence. But because a computer, and not a human, does it, they get to launder their responsibility.
| ▲ | red75prime 13 hours ago | parent [-]
| Doing what? Telling their artists to create whatever they want regardless of copyright and then filtering the output? For humans that doesn't make sense, because we have generation and filtering in a single package.
| ▲ | vkou 6 hours ago | parent [-]
| In this case the output wasn't filtered. They are just producing images of Harrison Ford, and I don't think they are allowed to use his likeness that way.
|
| ▲ | Lerc 8 hours ago | parent | prev [-] |
| There is a difference between knowing what something looks like and generating an image of that thing. |