| ▲ | dorianmariecom 13 hours ago |
| You can use ChatGPT to reverse the prompt. |
|
| ▲ | XCSme 13 hours ago | parent | next [-] |
| Not sure if it's a joke, but I don't think an LLM is a bijective function. |
| |
| ▲ | croemer 11 hours ago | parent [-] |
| If you had all the token probabilities it would be bijective. There was a post about this here some time back. |
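(A minimal sketch of the point above, not a claim about the post being referenced: it assumes the `transformers` and `torch` packages and the small "gpt2" checkpoint, and only shows that two prompts which sample into the same continuation still produce measurably different full next-token probability distributions, which is the extra information the "bijective if you had all the token probabilities" argument leans on.)

    # Sketch: the sampled token hides information that the full distribution keeps.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")
    model.eval()

    def next_token_distribution(prompt: str) -> torch.Tensor:
        """Return the full probability distribution over the next token."""
        ids = tok(prompt, return_tensors="pt").input_ids
        with torch.no_grad():
            logits = model(ids).logits[0, -1]   # logits for the next position
        return torch.softmax(logits, dim=-1)    # shape: (vocab_size,)

    p1 = next_token_distribution("The capital of France is")
    p2 = next_token_distribution("As everyone knows, the capital of France is")

    # Both prompts are almost certainly continued with " Paris", so the sampled
    # text alone cannot tell them apart -- but the distributions themselves differ.
    print("argmax token, prompt 1:", tok.decode([p1.argmax().item()]))
    print("argmax token, prompt 2:", tok.decode([p2.argmax().item()]))
    print("total variation distance:", 0.5 * (p1 - p2).abs().sum().item())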
| ▲ | XCSme 9 hours ago | parent [-] |
| Kind of, but LLMs still use randomness when selecting tokens, so the same input can lead to multiple different outputs. |
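(A minimal sketch of the sampling point, again assuming `transformers`, `torch`, and the "gpt2" checkpoint: with `do_sample=True` the same prompt can yield a different continuation on every run, while greedy decoding is deterministic, so the output is a function of the prompt only once the random sampling step is removed or seeded.)

    # Sketch: same prompt, different outputs under sampling; stable under greedy decoding.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")
    ids = tok("Once upon a time", return_tensors="pt").input_ids

    for i in range(3):
        sampled = model.generate(ids, max_new_tokens=10, do_sample=True,
                                 temperature=1.0, pad_token_id=tok.eos_token_id)
        print("sampled:", tok.decode(sampled[0][ids.shape[1]:]))  # usually varies per run

    greedy = model.generate(ids, max_new_tokens=10, do_sample=False,
                            pad_token_id=tok.eos_token_id)
    print("greedy :", tok.decode(greedy[0][ids.shape[1]:]))       # same on every run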
|
|
|
| ▲ | small_scombrus 13 hours ago | parent | prev [-] |
| ChatGPT can generate a sentence that plausibly looks like the prompt. |
| |
| ▲ | llmslave2 6 hours ago | parent [-] |
| Rather, it estimates a potential prompt. I could do the same, and it would be no more or less accurate. |
|