▲ fluidcruft 8 months ago
But... the prompt neither forbade Indiana Jones nor described something that excluded Indiana Jones. If we were playing Charades, just about anyone would have guessed you were describing Indiana Jones. If you gave a street artist the same prompt, you'd probably get something similar unless you specified something like "... but something different than Indiana Jones".
▲ 9dev 8 months ago | parent | next [-]
And… that is called overfitting. If you show the model values for y, but they are 2 in 99% of all cases, it's likely going to yield 2 when asked about the value of y, even if the prompt didn't specify or forbid 2 specifically.
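A toy sketch of that skew (pure Python, with made-up numbers for illustration): a "model" that just memorizes the most common label it saw will answer 2 no matter what the rare cases were.

```python
from collections import Counter

# Hypothetical training data: y is 2 in 99% of examples.
ys = [2] * 99 + [7] * 1

# A trivial "model" that predicts the most common label it saw --
# analogous to a generator defaulting to the dominant pattern in its data.
def fit_majority(labels):
    (most_common_value, _count), = Counter(labels).most_common(1)
    return lambda: most_common_value

predict = fit_majority(ys)
print(predict())  # prints 2, regardless of what the rare 1% looked like
```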
▲ darkwater 8 months ago | parent | prev [-]
The nice thing about humans is that not every single human being has read almost all the content on the Internet. So yeah, a certain group of people would draw or think of Indiana Jones given that prompt, but not everyone. Maybe we will have different models with different training/settings that permit this kind of freedom, although I doubt they will be the commercial ones.