silon42 | 5 days ago

Surely there are prompts on the "internet" that it will borrow from...
vineyardmike | 5 days ago | parent
Definitionally no. Each LLM responds to prompts differently, so the best prompts for model X will not be in model X's training data. Yes, older prompts for older models can still be useful. But if you asked ChatGPT before GPT-5, you were getting a response from GPT-4, which had a knowledge cutoff around 2022 — certainly not recent enough to find adequate prompts in the training data.

There are also plenty of terrible prompts on the internet, so I still question a recent model's ability to write meaningful prompts based on its training data. Prompts need to be tested for their use case, and the Medium posts from self-proclaimed gurus and similar training-data junk surely were not tested against yours. Of course, the model is also not testing the prompt for you.