manmal | 4 hours ago
> I can’t imagine any other example where people voluntarily move for a black box approach.

Anyone overseeing work from multiple people has to? At some point you have to let go and trust people’s judgement, or, well, let them go. Reading and understanding the whole output of 9 concurrently running agents is impossible. People who do that (I’m not one of them, btw) must rely on higher-level reports, maybe drilling into this or that piece of code occasionally.
kace91 | 4 hours ago
> At some point you have to let go and trust people’s judgement.

Indeed. People. With salaries, general intelligence, a stake in the matter, and a negative outcome if they don’t take responsibility.

> Reading and understanding the whole output of 9 concurrently running agents is impossible.

I agree. It is also impossible for a person to drive two cars at once… so we don’t. Why is the starting point of the conversation that one should be able to run 9 concurrent agents?

I get it: writing code no longer has a physical bottleneck. So the bottleneck moves to the next thing, which is our ability to review outputs. That is already a giant advancement; why are we ignoring that second bottleneck and dropping quality assurance as well? Eventually someone has to put their signature on the thing being shippable.
ink_13 | 4 hours ago
An AI agent cannot be held accountable.
re-thc | 4 hours ago
> Anyone overseeing work from multiple people has to?

That’s not a black box, though. Someone is still reading the code.

> At some point you have to let go and trust people’s judgement

Where are the people in this case?

> People who do that (I’m not one of them btw) must rely on higher level reports.

Does such a thing exist here? It’s just “done”.