taylorallred 3 days ago

One thing that has always worried me about AI coding is the loss of practice. To me, writing the code by hand (including the boilerplate and things I've done hundreds of times) is the equivalent of Mr. Miyagi's paint-the-fence. Each iteration gets it deeper into your brain and having these patterns as a part of you makes you much more effective at making higher-level design decisions.

biophysboy 3 days ago | parent | next [-]

A retort you often hear is that prior technologies, like writing or the printing press, may have stunted our calligraphy or rhetorical skills, but they did not stunt our capacity to think. If anything, they magnified it! Basically, the whole Steve Jobs' bicycle-for-the-mind idea.

My issue with applying this reasoning to AI is that prior technologies addressed bottlenecks in distribution, whereas this more directly attacks the creative process itself. Stratechery has a great post on this, where he argues that AI is attempting to remove the "substantiation" bottleneck in idea generation.

Doing this for creative tasks is fine ONLY IF it does not inhibit your own creative development. Humans only have so much self-control/self-awareness.

arscan 3 days ago | parent | next [-]

I’ve been thinking of LLMs a bit like a credit-card-for-the-mind: it reduces the friction of accessing and applying your own expertise. But if you don’t have that expertise already, be careful; eventually it’ll catch up to you and a big bill will be due.

bluefirebrand 3 days ago | parent | next [-]

Unfortunately a lot of people are basically just hoping that by the time the big bill is due, they have cashed out and left the bill with someone else.

I also think that even with expertise, people relying too much on AI are going to erode their expertise

If you can lift heavy weights but start to use machines to lift instead, your muscles will shrink and you won't be able to lift as much.

The brain is a muscle too; it must be exercised to keep it strong.

danielbln 2 days ago | parent [-]

We are in the business of automation, and this is also automation. What good is doing the manual work if automation provides good-enough results? I increasingly consider the code an implementation detail and spend most of my thinking one abstraction level higher. It's not always there yet, but it's really often good enough to great, given the right oversight.

bluefirebrand 2 days ago | parent [-]

Code is not just an implementation detail; I wish people would knock it off with that idea.

It would be like saying "roofs are just an implementation detail of building a house". Fine, but if you build the roof wrong, your house is going to suck.

danielbln 2 days ago | parent [-]

I'm tasking a contractor with laying the roof tiles and just giving them my specifications. How they lay the tiles, I don't care, as long as it passes inspection afterwards and conforms to my spec.

saltcured 3 days ago | parent | prev [-]

I think this phrase is beautiful

assuming you were referencing "bicycle for the mind"

margalabargala 3 days ago | parent | prev | next [-]

I still don't think that's true. It's just the medium that changes here.

A better analogy than the printing press would be synthesizers. Did their existence kill classical music? Does modern electronic music have less creativity put into it than pre-synth music? Or did it simply open up a new world for more people to express their creativity in new and different ways?

"Code" isn't the form our thinking must take. To say that we all will stunt our thinking by using natural language to write code, is to say we already stunted our thinking by using code and compilers to write assembly.

miltonlost 3 days ago | parent | next [-]

AI for writing is not like a synthesizer. It's a player piano, and people act as if they're musicians now.

margalabargala 3 days ago | parent [-]

I totally disagree.

Importing an external library into your code is like using a player piano.

Heck, writing in a language you didn't personally invent is like using a player piano.

Using AI doesn't make someone "not a programmer" in any new way that hasn't already been goalpost-moved around before.

caconym_ 3 days ago | parent [-]

> Heck, writing in a language you didn't personally invent is like using a player piano.

Do you actually believe that any arbitrary act of writing is necessarily equivalent in creative terms to flipping a switch on a machine you didn't build and listening to it play music you didn't write? Because that's frankly insane.

margalabargala 3 days ago | parent [-]

Yes, the language comment was hyperbolic.

Importing a library someone else wrote basically is flipping a switch and getting software behavior you didn't write.

Frankly I don't see a difference in creative terms between writing an app that does <thing> that relies heavily on importing already-written libraries for a lot of the heavy lifting, and describing what you have in mind for <thing> to an LLM in sufficient detail that it is able to create a working version of whatever it is.

Actually, I can see an argument that both of those are also potentially equal, in creative terms, to writing the whole thing from scratch. If the author's goal was to write beautiful software, that's one thing, but if the author's goal is to create <thing>? Then the existence and characteristics of <thing> are the measure of their creativity, not the method of construction.

caconym_ 3 days ago | parent [-]

The real question is what you yourself are adding to the creative process. Importing libraries into a moderately complex piece of software you wrote yourself is analogous to including genai-produced elements in a collage assembled by hand, with additional elements added (e.g. painted) on top also by hand. But just passing off the output of some genai system as your own work is like forking somebody else's library on Github and claiming to be the author of it.

> If the author's goal was to write beautiful software, that's one thing, but if the author's goal is to create <thing>? Then the existence and characteristics of <thing> is the measure of their creativity, not the method of construction.

What you are missing is that the nature of a piece of art (for a very loose definition of 'art') made by humans is defined as much by the process of creating it (and by developing your skills as an artist to the point where that act of creation is possible) as by whatever ideas you had about it before you started working on it. Vastly more so, generally, if you go back to the beginning of your journey as an artist.

If you just use genai, you are not taking that journey, and the product of the creative process is not a product of your creative process. Therefore, said product is not descended from your initial idea in the same way it would have been if you'd done the work yourself.

biophysboy 3 days ago | parent | prev | next [-]

That's why I made a caveat that AI is only bad if it limits your creative development. Eno took synthesizers to places music never went. I'd love for people to do the same with LLMs. I do think they have more danger than synthesizers had for music, specifically because of their flexibility and competence.

leptons 3 days ago | parent | prev [-]

A synthesizer is just as useless as a violin without someone to play it.

You could hook both of those things up to servos and make a machine do it, but it's in the notes being played that the creativity comes in.

I've liked some AI generated music, and it even fooled me for a little while but only up to a point, because after a few minutes it just feels very "canned". I doubt that will change, because most good music is based on human emotion and experience, something an "AI" is not likely to understand in our lifetimes.

croes 3 days ago | parent | prev | next [-]

But AI also does the thinking.

So if the printing press stunted our writing, what will the thinking press stunt?

https://gizmodo.com/microsoft-study-finds-relying-on-ai-kill...

justlikereddit 3 days ago | parent [-]

The worst promise of AI isn't subverting the thinking of those who try to think.

It's being an executor for those who don't think but can make up rules and laws.

cess11 3 days ago | parent | prev | next [-]

Bad examples. Computer keyboards killed handwriting; the Internet killed rhetoric.

emehex 3 days ago | parent | prev [-]

Counter-counter-point: handwriting > typing for remembering things (https://www.glamour.com/story/typing-memory)

yoyohello13 3 days ago | parent | prev | next [-]

There are many times when I’ll mull over a problem in my head at night or in the shower. I kind of “write the code” in my head. I find it very useful sometimes. I don’t think it would be possible if I didn’t have the language constructs ingrained in my head.

Jonovono 3 days ago | parent [-]

I find I do this more now with AI than before.

yoyohello13 3 days ago | parent | next [-]

What do you mean? Are you working on more projects, or more engaged in ideation? Not sure how AI would cause you to write code in your head more while away from the computer. Most people seem to have a harder time writing code without AI the more they use it. The whole “copilot pause” phenomenon, etc.

Jonovono 3 days ago | parent [-]

Since my job now is primarily a reviewer of (AI) code I find:

1) I'm able to work on more projects

2) The things I am able to work on are much larger in scope and ambition

3) I like to mentally build the idea in my head so I have something to review the generated code against. Either to guide the model in the direction I am thinking or get surprised and learn about alternate approaches.

The process is also, like you say, a lot more iterative, and more ideation is able to happen with AI. So early on I'll ask it for examples in x language using y approach. I'll sit on that for a night, throw around tangentially related approaches in my head, and then riff on what I came up with the next day.

bluefirebrand 3 days ago | parent | prev [-]

Do you? Or do you spend more time thinking about how to write prompts?

Jonovono 3 days ago | parent [-]

My prompts are very lazy, off the cuff. Maybe I would see better gains if I spent some time on them, not sure.

donsupreme 3 days ago | parent | prev | next [-]

Many analogs to this IRL:

1) I can't remember the last time I wrote something meaningfully long with an actual pen/pencil. My handwriting is beyond horrible.

2) I can no longer find my way driving without a GPS. Reading a map? lol

lucianbr 3 days ago | parent | next [-]

If you were a professional writer or driver, it might make sense to be able to do those things. You could still do without them, but they might make you better in your trade. For example, I sometimes drive with GPS on in areas I know very well, and the computer provided guidance is not the best.

Zacharias030 3 days ago | parent [-]

I think the sweet spot is always keeping north up on the GPS. Yes it takes some getting used to, but you will learn the lay of the land.

0x457 3 days ago | parent | prev | next [-]

> I can't remember the last time I wrote something meaningfully long with an actual pen/pencil. My handwriting is beyond horrible.

That's a skill that depends on motor functions of your hands, so it makes sense that it degrades with lack of practice.

> I can no longer find my way driving without a GPS. Reading a map? lol

Pretty sure what that actually means in most cases is "I can go from A to B without GPS, but the route will be suboptimal, and I will have to pay more attention to street names"

If you ever had a joy of printing map quest or using a paper map, I'm sure you still these people skill can do, maybe it will take them longer. I'm good at reading mall maps tho.

yoyohello13 3 days ago | parent | next [-]

Mental skills (just like motor skills) also degrade with time. I can’t remember how to do an integral by hand anymore. Although re-learning would probably be faster if I looked it up.

0x457 3 days ago | parent [-]

Please don't think of this as moving the goal post, but back to maps and GPS: you're still doing the navigation (i.e. actual change in direction), just doing it with different tools.

The last time I dealt with integrals by hand or not was before node.js was announced (just a point in time).

Sure, you can probably forget a mental skill from lack of practicing it, but in my personal experience it takes A LOT longer than for a motor skill.

Again, you're still writing code, but with a different tool.

jazzyjackson 3 days ago | parent | prev [-]

> I'm sure you still these people skill can do,

I wonder if you’d make this kind of mistake writing by hand

0x457 2 days ago | parent [-]

I would, it's an ADHD thing for me.

danphilibin 3 days ago | parent | prev | next [-]

On 2) I've combatted this since long before AI by playing a game of "get home without using GPS" whenever I drive somewhere. I've definitely maintained a very good directional sense by doing this - it forces you to think about main roads, landmarks, and cardinal directions.

stronglikedan 3 days ago | parent | prev | next [-]

I couldn't imagine operating without a paper and pen. I've used just about every note taking app available, but nothing commits anything to memory like writing it down. Of course, important writings go into the note app, but I save time inputting now and searching later if I've written things down first.

goda90 3 days ago | parent | prev | next [-]

I don't like having location turned on on my phone, so it's a big motivator to see if I can look at the map and determine where I need to go in relation to familiar streets and landmarks. It's definitely not "figure out a road trip with just a paper map" level wayfinding, but it helps for learning local stuff.

eastbound 3 days ago | parent | prev [-]

> find my way driving without a GPS. Reading a map? lol

Most people would still be able to. But we fantasize about the usefulness of maps. I remember myself on the Paris circular highway (at the time 110 km/h, not 50 km/h like today) with the map on the steering wheel, super dangerous. You say you’d miss GPS features on a paper map, but back then we had the same problems: it didn’t speak, didn’t have the blinking position, didn’t tell you which lane to take, and it simplified details to the point of losing you…

You won’t become less clever with AI: you already have YouTube for that. You’ll just become augmented.

apetresc 3 days ago | parent [-]

Nobody is debating the usefulness of GPS versus a paper map. Obviously the paper map was worse. The point is precisely that because GPS is so much better than maps, we delegate all direction-finding to the GPS and completely lose our ability to navigate without it.

A 1990s driver without a map is probably a lot more capable of muddling their way to the destination than a 2020s driver without their GPS.

That's the right analogy. Whether you think it matters how well people can navigate without GPS in a world of ubiquitous phones (and, to bring the analogy back, how well people will be able to program without an LLM after a generation or two of ubiquitous AI) is, of course, a judgment call.

okr 3 days ago | parent | prev | next [-]

Soldering transistors by hand was a thing too, once. But these days, I am not sure if people want to keep up anymore. Many trillions of transistors later. :)

I like this zooming in and zooming out, mentally. At some point I can zoom out another level. I miss coding, even while I still code a lot.

cmiles74 3 days ago | parent | next [-]

I think this is a fundamentally different pursuit. The intellectual part was figuring out where the transistors would go; that's the part that took the thinking. Letting a machine do it just lets you test quicker and move on to the next step. Although, of course, if you only solder your transistors by hand once a year you aren't likely to be very good at it. ;-)

People say the same thing about code but there's been a big conflation between "writing code" and "thinking about the problem". Way too often people are trying to get AI to "think about the problem" instead of simply writing the code.

For me, personally, the writing the code part goes pretty quick. I'm not convinced that's my bottleneck.

bGl2YW5j 3 days ago | parent [-]

Great point about the conflation. This makes me realise: for me, writing code is often a big part of thinking through the problem. So it’s no wonder that I’ve found LLMs to be least effective when I cede control before having written a little code myself, i.e. having worked through the problem a bit.

lucianbr 3 days ago | parent | prev | next [-]

There are definitely people who solder transistors by hand still. Though most not for a living. I wonder how the Venn diagram looks together with the set of people designing circuits that eventually get built by machines. Maybe not as disjoint as you first imagine.

kevindamm 3 days ago | parent [-]

Depending on the scale of the run and innovation of the tech, it's not unusual to see a founder digging into test-run QA issues with a multimeter and soldering iron, or perhaps a serial port and software debugger. But more often in China than the US these days, or China-US partnerships. And the hobbyist Makers and home innovators still solder together one-offs a lot, that's worldwide. Speakerbox builders do a lot of projects with a little soldering.

I dare say there are more individuals who have soldered something today than there were 100 years ago.

Ekaros 3 days ago | parent | prev [-]

If you start designing circuits with an LLM (can they even do that yet?), will you ever learn to do it yourself, or to fix it when it goes wrong and the magic smoke comes out after the robot made it for you?

ozten 3 days ago | parent | prev | next [-]

"Every augmentation is an amputation" -- Marshall McLuhan

danielvaughn 3 days ago | parent | next [-]

Well there goes a quote that will be stuck in my head for the rest of my life.

jxf 3 days ago | parent | prev [-]

Q: Where did he say this? I think this may be apocryphal (or a paraphrasing?) as I couldn't find a direct quote.

ozten 3 days ago | parent | next [-]

True. It isn't literally present as that sentence in Understanding Media: The Extensions of Man (1964), but it is a summarization. Amputation is mentioned 15 times and augmentation twice.

The concept that "every augmentation is an amputation" is best captured in Chapter 4, "THE GADGET LOVER: Narcissus as Narcosis." The chapter explains that any extension of ourselves is a form of "autoamputation" that numbs our senses.

Technology as "Autoamputation": The text introduces research that regards all extensions of ourselves as attempts by the body to maintain equilibrium against irritation. This process is described as a kind of self-amputation. The central nervous system protects itself from overstimulation by isolating or "amputating" the offending function. This theory explains "why man is impelled to extend various parts of his body by a kind of autoamputation".

The Wheel as an Example: The book uses the wheel as an example of this process. The pressure of new burdens led to the extension, or "'amputation,'" of the foot from the body into the form of the wheel. This amplification of a single function is made bearable only through a "numbness or blocking of perception".

etc

akprasad 3 days ago | parent | prev [-]

I can't find an exact quote either, but AFAICT he wrote extensively on extensions and amputations, though perhaps less concisely.

add-sub-mul-div 3 days ago | parent | prev | next [-]

Generalize this to: what's it going to look like in ten years when the majority of our society has outsourced general thinking and creativity rather than practicing it?

sim7c00 3 days ago | parent | next [-]

I already see only electric bikes and ChatGPT answers from people perpetually glued to their phone screens... soon no one will be able to walk and everyone will have a red and green button on their toilet-TV-lounge-chair, watching the latest episode of Ow my b**! ;D

beefnugs 3 days ago | parent | prev | next [-]

They want you to become an expert at the new thing: knowing how to set up the context with perfect information. Which is arguably as much if not more work than just programming the damn thing.

Which theoretically could actually be a benefit someday: if your company does many similar customer deployments, you will eventually be more efficient. But if you are doing custom code meant just for your company... there may never be an efficiency increase.

loganmhb 3 days ago | parent | prev | next [-]

Especially concerning in light of that METR study in which developers overestimated the impact of AI on their own productivity (even if it doesn't turn out to be negative) https://metr.org/blog/2025-07-10-early-2025-ai-experienced-o...

dimal 3 days ago | parent | prev | next [-]

I still do a lot of refactoring by hand. With vim bindings it’s often quicker than trying to explain to a clumsy LLM how to do it.

For me, refactoring is really the essence of coding. Getting the initial version of a solution that barely works is necessary but less interesting to me. What’s interesting is the process of shaping that v1 into something that’s elegant and fits into the existing architecture. Sanding down the rough edges, reducing misfit, etc. It’s often too nitpicky for an LLM to get right.
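
To give a toy sketch of the kind of hand refactor I mean (the names and data shapes are made up for illustration, not from any real codebase): a barely-working first pass gets collapsed into something that reads like the rest of the code.

    # before: the v1 that barely works
    def get_active(users):
        result = []
        for u in users:
            if u["active"] == True:
                result.append(u["name"])
        return result

    # after: same behavior for boolean flags, shaped to fit the surrounding style
    def active_user_names(users):
        return [u["name"] for u in users if u["active"]]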

skydhash 3 days ago | parent [-]

There are lots of project templates and generators that will get you close to where you can start writing business code and not just boilerplate.

jgb1984 3 days ago | parent | prev | next [-]

What worries me more is the steep decline in code quality. The Python and JavaScript output I've seen the supposedly best LLMs generate is inefficient, overly verbose, and needlessly commented at best, and simply full of bugs at worst. In the best case they're glaringly obvious bugs; in the worst case they're subtle ones that will wreak havoc for a long time before they're eventually discovered, but by then the developers' grasp of the codebase will have slipped far enough to prevent them from being competent enough to solve the bugs.

There is no doubt in my mind that software quality has taken a nosedive everywhere AI has been introduced. Our entire industry is hallucinating its way into a bottomless pit.

rossant 2 days ago | parent [-]

I'm very cautious about using LLM-generated code in production, but for one-off throwaway scripts that generate output I can manually verify, LLMs are a huge time saver.
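
To sketch what I mean by a throwaway script (the file name and the "status" column here are hypothetical, purely for illustration): something whose output I can eyeball against the source data in a minute.

    # one-off: tally rows per status so the totals can be checked by hand
    import csv
    from collections import Counter

    counts = Counter()
    with open("export.csv", newline="") as f:  # hypothetical input file
        for row in csv.DictReader(f):
            counts[row["status"]] += 1  # assumes a "status" column exists
    for status, n in counts.most_common():
        print(f"{status}: {n}")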

cwnyth 3 days ago | parent | prev | next [-]

My LLM-generated code has so many bugs in it that I end up knowing it better, since I have to spend more time debugging/figuring out small errors. This might even be better: you learn something more thoroughly when you not only practice the right answers, but know how to fix the wrong answers.

bluefirebrand 3 days ago | parent | next [-]

That is absurd

If you write it by hand you don't need to "learn it thoroughly"; you wrote it.

There is no way you understand code better by reading it than by creating it. Creating it is how you prove you understand it!

cwnyth 11 hours ago | parent [-]

Or, you can copy and paste code from examples, StackExchange, open source code, etc. Or you can read about it once, use it, and forget why it worked.

Besides all that, though, it's really the fact that LLMs bring up interesting ways to tackle problems that I hadn't thought of before, or uncover neat libraries/packages (when I program in R) that I just am not aware of.

vlod 3 days ago | parent | prev [-]

For me the process of figuring out wtf I need to do and how I'm going to do it is my learning process.

For beginners, I think this is a very important step in learning how to break down problems (into smaller components) and iterate.

segmondy 2 days ago | parent | prev | next [-]

Doesn't worry me. I believed AI would replace developers and I still do to some degree. But AI is going to lack context, not just in the business domain but in how it intersects with the tech side. Experienced developers will be needed. The vibe coders are going to get worse and will need experienced developers to come fix the mess. So no worries; the only thing that would suck would be if the vibe coders earn more money and experienced hand-crafting devs are left to pick up the crumbs to survive.

ge96 3 days ago | parent | prev | next [-]

Tangent: there was this obnoxious effect for typing in editors where the characters would explode; it makes me think of a typewriter, as if you're banging away at every character of some piece of code.

I imagine people can start making code (probably already are) where functions/modules are just boxes in a UI and the code is not visible; you test it with inputs/outputs and join it to something else.

When I'm tasked to make some CRUD UI I plan out the chunks of work to be done in order and I already feel the rote-ness of it, doing it over and over. I guess that is where AI can come in.

But I do enjoy the process of making something, even like a POSh camera GUI/OS, by hand.

lupire 3 days ago | parent | prev | next [-]

Do you write a lot of assembler, to make you more effective at higher-level design?

taylorallred 3 days ago | parent [-]

Writing a lot of assembler would certainly make me more effective at designing systems such as compilers and operating systems. As it stands, I do not work on those things currently. They say you should become familiar with at least one layer of abstraction lower than where you are currently working.

tjr 3 days ago | parent | prev | next [-]

I'm concerned about this also. Even just reading about AI coding, I can almost feel my programming skills start to atrophy.

If AI tools continue to improve, there will be less and less need for humans to write code. But -- perhaps depending on the application -- I think there will still be need to review code, and thus still need to understand how to write code, even if you aren't doing the writing yourself.

I imagine the only way we will retain these skills is by deliberately choosing to do so. Perhaps not unlike choosing to read books even if not required to do so, or choosing to exercise even if not required to do so.

lucianbr 3 days ago | parent [-]

How could advances in programming languages still happen when nobody is writing code anymore? You think we will just ask the AI to propose improvements, then evaluate them, and if they are good ask the AI to make training samples for the next AI?

Maybe, but I don't think it's that easy.

tjr 3 days ago | parent [-]

If we were to reach a stage where humans don't write code any more, would there even be a need to have advances in programming languages? Maybe what we currently have would be good enough.

I don't know what future we're looking at. I work in aerospace, and being around more safety-critical software, I find it hard to fathom just giving up software development to non-deterministic AI tools. But who knows? I still foresee humans being involved, but in what capacity? Planning and testing, but not coding? Why? I've never really seen coding being the bottleneck in aerospace anyway; code is written more slowly here than in many other industries due to protocols, checks and balances. I can see AI-assisted programming being a potentially splendid idea, but I'm not sold on AI replacing humans. Some seem to be determined to get there, though.

sandeepkd 3 days ago | parent | prev | next [-]

Along the same lines, it's probably little more than that. When it comes to software development, every iteration of execution/design is supposedly either faster or better based on the prior learnings from things that you have done yourself or observed very carefully.

dfee 3 days ago | parent | prev | next [-]

I’m concerned about becoming over-reliant on GPT for code reviews for this reason (as I learn Rust).

marcosdumay 3 days ago | parent | prev | next [-]

My practice in writing assembly is so lost by now that it's not much different than if I never learned it. Yet, it's not really a problem.

What is different about LLM-created code is that compilers work. Reliably and universally. I can just outsource the job of writing the assembly to them and don't need to think about it again. (That is, unless you are in one of those niches that require hyper-optimized software. Compilers can't reliably give you that last 2x speed-up.)

LLMs, in turn, will never be reliable. Their entire goal is opposite to reliability. IMO, the losses are still way higher than the gains, and it's questionable if this is an architectural premise that will never change.

ethan_smith 3 days ago | parent | prev | next [-]

The "paint-the-fence" analogy is spot-on, but AI can be the spotter rather than replacement - use it for scaffolding while deliberately practicing core patterns that strengthen your mental models.

wussboy 3 days ago | parent | next [-]

I suspect when it comes to human mastery there is no clear dividing line between scaffolding and core, and that both are important.

giancarlostoro 3 days ago | parent | prev [-]

As long as you understand the scaffolding and its implications, I think this is fine. Using AI for scaffolding has been the key thing for me. If I have some obscure idea I want to build up using Django, I braindump to the model what I want to build, and it spits out models, and what not.
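
Roughly the kind of scaffolding it spits out, sketched here for a hypothetical note-taking idea (the model and field names are illustrative, not anything I'd ship untouched):

    # illustrative Django models of the sort an LLM drafts from a braindump
    from django.db import models

    class Notebook(models.Model):
        title = models.CharField(max_length=200)
        created_at = models.DateTimeField(auto_now_add=True)

    class Note(models.Model):
        notebook = models.ForeignKey(Notebook, on_delete=models.CASCADE, related_name="notes")
        body = models.TextField()
        updated_at = models.DateTimeField(auto_now=True)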

Course, then there's Lovable, which spits out the front-end I describe, which it is very impressively good at. I just want a starting point, then I get going; if I get stuck I'll ask clarifying questions. For side projects where I have limited time, LLMs are perfect for me.

Lerc 3 days ago | parent | prev [-]

I don't get this with boilerplate. To me boilerplate code is the code that you have to write to satisfy some predefined conditions that has little to do with the semantics of the code I am actually writing. I'm fine with AI writing this stuff for me if it does it reliably, or if the scale is small enough that I can easily spot and fix the errors. I don't see that aspect of coding to be much more than typing.

On the other hand I do a lot more fundamental coding than the median. I do quite a few game jams, and I am frequently the only one in the room who is not using a game engine.

Doing things like this I have written so many GUI toolkits from scratch now that it's easy enough for me to make something anew in the middle of a jam.

For example https://nws92.itch.io/dodgy-rocket In my experience it would have been much harder to figure out how to style scrollbars to be transparent with in-theme markings using an existing toolkit than writing a toolkit from scratch. This of course changes as soon as you need a text entry field. I have made those as well, but they are subtle and quick to anger.

I do physics engines the same way, predominantly 2d, (I did a 3d physics game in a jam once but it has since departed to the Flash afterlife). They are one of those things that seem magical until you've done it a few times, then seem remarkably simple. I believe John Carmack experienced that with writing 3d engines where he once mentioned quickly writing several engines from scratch to test out some speculative ideas.

I'm not sure if AI presents an inhibitor here any more than using an engine or a framework. They both put some distance between the programmer and the result, and as a consequence the programmer starts thinking in terms of the interface through which they communicate instead of how the result is achieved.

On the other hand I am currently using AI to help me write a DMA chaining process. I initially got the AI to write the entire thing. The final code will use none of that emitted output, but it was sufficient for me to see what actually needed to be done. I'm not sure if I could have done this on my own; AI certainly couldn't have done it on its own. Now that I have (almost (I hope)) done it once in collaboration with AI, I think I could now write it from scratch myself should I need to do it again.

I think AI, Game Engines, and Frameworks all work against you if you are trying to do something abnormal. I'm a little amazed that Monument Valley got made using an engine. I feel like they must have fought the geometry all the way.

I think this jam game I made https://lerc.itch.io/gyralight would be a nightmare to try and implement in an engine. Similarly I'm not sure if an AI would manage the idea of what is happening here.