atonse 3 hours ago

I don't have an answer for this, and won't pretend to.

But my take on this is that accountability will still be a purely human factor. It still is. I recently let go of a contractor who was hired to run our projects as a Scrum/PM, because his tickets were so bad (some tickets had three words in them; one ticket in the current sprint was blocked by a ticket deep in the backlog; basic stuff). When I confronted him about them, he said the AI had generated them.

So I told him that:

1. That's not an excuse. His job is to verify what the AI generated and ensure it's still good.

2. That actually makes it look WORSE: not only did he do nearly zero work, he didn't even check the most basic outputs. And I'm not anti-AI. I expressly said we should absolutely use AI tools to accelerate our work. But that's not what happened here.

So you won't get to say "my AI was at fault" (at least not for another few years, I think). You are ultimately responsible, not your tools. People will still want to delegate those things down the chain, but ultimately they'll have to delegate to fewer people.

jgilias an hour ago

In general I agree. But it seems very unlikely for an AI to generate a three-word ticket. That's what humans do. An AI would more likely generate an overly verbose and over-specified ticket instead.

eisa01 an hour ago

What drives that behavior is what I like to call human slop :)