sdesol 6 days ago

> Do people really try to one-shot their AI tasks?

Yes. I almost always end with "Do not generate any code unless it can help in our discussions as this is the design stage." I would say 95% of my code for https://github.com/gitsense/chat in the last 6 months was AI generated, and I would say 80% of that was one-shots.

It is important to note that I can easily get into 30+ messages of back and forth before any code is generated. For complex tasks, I will literally spend an hour or two (which can span days) chatting and thinking about a problem with the LLM, and I do expect the LLM to one-shot them.

jplusequalt 6 days ago | parent [-]

Do you feel as if your ability to code is atrophying?

sdesol 6 days ago | parent [-]

Not even remotely, since the 5% that I need to write is usually quite complex. I do think my writing proficiency will decrease, though. However, my debugging and problem solving skills should increase.

Having said all of that, I do believe AI will have a very negative effect on developers where the challenge is skill and not time. AI is implementing things that I could do if given enough time. I am literally implementing things in months that would have taken me a year or more.

My AI search is nontrivial, but it only took two months to write. I should also note that the 5% I needed to implement was the difference between throwaway code and a usable search engine.

jplusequalt 5 days ago | parent [-]

>Not even remotely since the 5% that I need to write is usually quite complex.

Not sure I believe this. If you suddenly automate away 95% of any task, how could it be the case you retain 100% of your prior abilities?

>However my debugging and problem solving skills should increase

By "my", I assume you mean "my LLM"?

>I do think my writing proficiency will decrease though.

This alone is cause for concern. The ability for a human being to communicate without assistance is extremely important in an age where AI is outputting a significant fraction of all new content.

sdesol 5 days ago | parent [-]

> Not sure I believe this. If you suddenly automate away 95% of any task, how could it be the case you retain 100% of your prior abilities?

I need to review like crazy now, so it is not like I am handing off my understanding of the problem. If anything, I learn new things from time to time, as the LLM will generate code in ways that I haven't thought of before.

The AI genie is out of the bottle now, and I do believe that in a year or two, companies are going to start asking for conversations along with the LLM generated code, which I guess is how you can determine if people are losing their skills. When my code is fully published, I will include conversations for every feature/bug fix that is introduced.

> The ability for a human being to communicate without assistance is extremely important

I agree with this, but once again, I still have to review everything. When LLMs get much better, I think my writing skills may decline, but as it currently stands, I find myself having to revise what the LLM writes to make it sound more natural.

Everything is speculation at this point, but I am sure I will lose some skills. I also think I will gain new ones by being exposed to things that I haven't thought of before.

I wrote my chat app because I needed a more comfortable way to read and write *long* messages. For the foreseeable future, I don't see my writing proficiency decreasing in any significant manner. I can see myself becoming a slower writer in the future, though, as I find myself being very comfortable speaking to the LLM in a manner that I would not use with a human. LLMs are extremely good at inferring context, so I do a lot of lazy typing now to speed things up, which may turn into a bad habit.