| ▲ | d00mB0t 5 days ago |
| People are crazy. |
|
| ▲ | dang 5 days ago | parent | next [-] |
| Maybe so, but please don't post unsubstantive comments to Hacker News. |
|
| ▲ | baal80spam 5 days ago | parent | prev | next [-] |
| In what sense? |
|
| ▲ | d00mB0t 5 days ago | parent [-] |
| Really? |
|
| ▲ | threatofrain 5 days ago | parent [-] |
| You've already seen the fruits of your prompt and how far your "isn't it super obvious, I don't need to explain myself" attitude is getting you. |
|
| ▲ | threatofrain 5 days ago | parent | prev | next [-] |
| This was performed on animals. What is a less crazy way to progress? Don't use animals, but humans instead? Only rely on pure theory up to the point of experimenting on humans? |
|
| ▲ | JaggerJo 5 days ago | parent | prev [-] |
| Yes, this is scary. |
|
| ▲ | wfhrto 5 days ago | parent [-] |
| Why? |
|
| ▲ | JaggerJo 5 days ago | parent [-] |
| Because an LLM architecture seems way too fuzzy and unpredictable for something that should be reproducible. |
|
| ▲ | ACCount36 5 days ago | parent | next [-] |
| The real world isn't "reproducible". If a robot can't handle weird and unexpected failures, it won't survive out there. |
|
| ▲ | SirMaster 5 days ago | parent | prev [-] |
| I thought that was what the temperature setting does? |
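| (Editor's note: the temperature point above is about sampling, not architecture. Temperature scales a model's output logits before the next token is drawn, so as it approaches zero, decoding becomes greedy and deterministic. A minimal sketch of that mechanism, using made-up logits rather than any particular model's API:) |

```python
# Sketch of temperature-scaled sampling from a language model's logits.
# The logits here are hypothetical; real models expose thousands of them.
import math
import random

def sample_with_temperature(logits, temperature, rng=None):
    """Sample a token index from logits scaled by temperature.

    temperature <= 0 is treated as greedy (argmax) decoding, which is
    deterministic; higher temperatures flatten the distribution and
    make outputs more varied run to run.
    """
    if temperature <= 0:
        # Greedy decoding: always pick the highest-scoring token.
        return max(range(len(logits)), key=lambda i: logits[i])
    rng = rng or random.Random()
    scaled = [x / temperature for x in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]  # subtract max for numeric stability
    total = sum(exps)
    probs = [e / total for e in exps]
    # Inverse-CDF sampling over the softmax distribution.
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1

logits = [2.0, 1.0, 0.1]
print(sample_with_temperature(logits, 0))  # greedy: index 0, every time
```

| (So temperature 0 makes a single model's sampling repeatable, but it doesn't address the broader unpredictability JaggerJo is pointing at: the same prompt on a retrained or updated model can still behave differently.) |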
|