| ▲ | theappsecguy 8 hours ago |
| It seems plenty obvious, but the scientific backing is also slowly catching up: https://www.media.mit.edu/publications/your-brain-on-chatgpt...
| ▲ | fc417fc802 7 hours ago | parent | next [-] |
| It's not at all obvious, because there's more than one way to go about it. Entirely outsourcing your thinking is obviously bad, whereas working cooperatively seems highly beneficial to me. Google Search has been getting progressively worse for technical topics for at least the past decade. Now, suddenly, they've started providing a free tutor capable of custom-tailoring graduate-level explanations of technical topics for me on demand. The difference is night and day.
| ▲ | multjoy 7 hours ago | parent | next [-] |
| How do you know that the explanations are free from error?
| ▲ | charcircuit 7 hours ago | parent [-] |
| You can still learn from sources that have errors. Many textbooks contain mistakes and false information, but that didn't stop them from providing educational value to people.
| ▲ | multjoy 7 hours ago | parent [-] |
| We're talking about LLMs, which are designed to be confidently incorrect. Accuracy is a side effect.
| ▲ | fc417fc802 7 hours ago | parent [-] |
| When textbooks are incorrect, it is also with great confidence. If you can't spot logical inconsistencies in the material, were you actually learning or merely memorizing?
| ▲ | kelnos 7 hours ago | parent | prev [-] |
| Sure, there's more than one way to go about it, but what matters is how people typically do go about it. Certainly, individuals can choose to engage with an LLM in positive ways that provoke their own thinking, but it's still useful to understand how people generally use them in the real world.
| ▲ | wilg 7 hours ago | parent | prev [-] |
| That's exclusively about essay writing.