| ▲ | linhns 11 hours ago |
| This is the way it should be. AI to speed up the understanding process, and one final evaluation without any help to cement the understanding. |
|
| ▲ | topherhunt 8 hours ago | parent | next [-] |
| I don't think the final evaluation is to "cement the understanding" so much as _verify_ that students have taken accountability for their own learning process. |
| ▲ | aanet 8 hours ago | parent [-] |
^ This. This is what a student who truly wants to learn, rather than simply complete a course / certification, would do: use AI tools to explain + learn, but not outsource the learning process itself to the tools.
|
|
| ▲ | andsoitis 4 hours ago | parent | prev | next [-] |
> AI to speed up the understanding process

What’s your hypothesis of how AI can accelerate how your brain understands something?
| ▲ | wrs 2 hours ago | parent | next [-] |
I have some success with this method: I try to write an explanation of something, then ask the LLM to find problems with the explanation. Sometimes its response leads me to shore up my understanding. Other times its answer doesn’t make sense to me and we dig into why. Whether or not the LLM is correct, it helps me clarify my own learning. It’s basically rubber duck debugging for my brain.
| ▲ | allthetime 4 hours ago | parent | prev [-] |
Quick, easy access to explanations and examples on complex topics. In my case, learning enough trig and linear algebra to be useful in game engine programming / rendering has been made a lot easier and more efficient. The same way Google or Wikipedia enables learning.
|
|
| ▲ | _345 5 hours ago | parent | prev | next [-] |
I disagree. I think we should treat AI tools in exams the way we treat calculators.
|