aanet 12 hours ago
> AI Policy for the AI Course
>
> “Students are permitted to use AI assistants for all homework and programming assignments (especially as a reference for understanding any topics that seem confusing), but we strongly encourage you to complete your final submitted version of your assignment without AI. You cannot use any such assistants, or any external materials, during in-class evaluations (both the homework quizzes and the midterms and final).
>
> The rationale behind this policy is a simple one: AI can be extremely helpful as a learning tool (and, to be clear, as an actual implementation tool), but over-reliance on these systems can currently be a detriment to learning in many cases. You absolutely need to learn how to code and do other tasks using AI tools, but turning in AI-generated solutions for the relatively short assignments we give you can (at least in our current experience) ultimately lead to substantially less understanding of the material.
>
> The choice is yours on assignments, but we believe that you will ultimately perform much better on the in-class quizzes and exams if you do work through your final submitted homework solutions yourself.”
ashertrockman 3 hours ago
It feels downstream of CMU's "reasonable person principle". They know that people are going to use AI on their homework, but they trust that students want to learn and improve their skills -- and this is good advice for doing so. I'm somewhat biased because I was involved in a previous, related course.

The important takeaways aren't really about gritty debugging of (possibly) large homework assignments, but the high-level overview you get in the process. AI assistance means you could cover more content and build larger, more realistic systems.

An issue in the first iteration of Deep Learning Systems was that every homework built on the previous one, and errors could accumulate in subtle ways that we didn't anticipate. I spent a lot of time bisecting code to find these errors in office hours. It would have been just as educational to diagnose those errors with an LLM. Then students could spend more time implementing cool stuff in CUDA instead of hunting down a subtle bug in their 2d conv backward pass under time pressure...

But I think the breadth and depth of the course was phenomenal, and if courses can go further with AI assistance then it's great. This new class looks really cool, and Zico is a great teacher.
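For the curious: the usual way to catch the kind of "subtle bug in a conv backward pass" described above is a finite-difference gradient check. This is a minimal NumPy sketch (not the course's actual assignment code; all names here are made up for illustration) comparing an analytic kernel gradient against central finite differences for the loss `out.sum()`:

```python
import numpy as np

def conv2d(x, w):
    # Valid-mode 2D cross-correlation of single-channel image x with kernel w.
    H, W = x.shape
    kH, kW = w.shape
    out = np.zeros((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kH, j:j + kW] * w)
    return out

def conv2d_backward_w(x, w, grad_out):
    # Analytic gradient of the loss w.r.t. the kernel, given upstream grad_out:
    # each output position contributes grad_out[i, j] times its input patch.
    grad_w = np.zeros_like(w)
    kH, kW = w.shape
    for i in range(grad_out.shape[0]):
        for j in range(grad_out.shape[1]):
            grad_w += grad_out[i, j] * x[i:i + kH, j:j + kW]
    return grad_w

def numeric_grad_w(x, w, eps=1e-6):
    # Central finite differences on loss = conv2d(x, w).sum().
    g = np.zeros_like(w)
    for idx in np.ndindex(*w.shape):
        wp, wm = w.copy(), w.copy()
        wp[idx] += eps
        wm[idx] -= eps
        g[idx] = (conv2d(x, wp).sum() - conv2d(x, wm).sum()) / (2 * eps)
    return g

rng = np.random.default_rng(0)
x = rng.standard_normal((6, 6))
w = rng.standard_normal((3, 3))
grad_out = np.ones((4, 4))  # d(loss)/d(out) for loss = out.sum()

analytic = conv2d_backward_w(x, w, grad_out)
numeric = numeric_grad_w(x, w)
print(np.max(np.abs(analytic - numeric)))  # tiny if the backward pass is right
```

A sign flip or off-by-one in the patch indexing shows up immediately as a large discrepancy here, which is exactly the class of error that's tedious to find by bisecting accumulated homework code by hand.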
piker 10 hours ago
My money is on extraordinarily poor final exam results and/or cheating.
linhns 10 hours ago
This is the way it should be: AI to speed up the understanding process, and one final evaluation without any help to cement the understanding.