pizzly 5 days ago
This time is different. Software engineers can now orchestrate LLMs and agents to write software. The role of engineers who do this is quality control, compliance, software architecture, and some out-of-the-box thinking for when LLMs don't cut it. What makes you think advances in AI won't take care of the tasks that LLMs currently do poorly? My point is that once those tasks are taken care of, a CS graduate won't be doing the tasks they learnt to do in their degree. What people need to learn is how to think about customers' needs in abstract ways, communicate this to AI, and judge the output the way someone judges a painting.
chii 5 days ago | parent
> CS graduate won't be doing tasks that they learnt to do in their degrees

How is that different from the previous decade(s)? How often do you invert a red-black tree in your daily programming/engineering job?

A CS degree is a degree for thinking computationally, using mathematics as a basis. It's got some science too (i.e., using evidence and falsifiability to work out truths rather than relying on pure intuition), and some critical thinking attached, if your university is any good at making undergraduate courses.

A CS degree is not a boot camp, nor is it meant to make you job-ready. While I did learn how to use git at uni, it was never required nor asked for - it was purely my own curiosity, which is exactly what a CS degree is meant to foster.