whatever1 3 hours ago

I love that LLMs are already copying humans when it comes to estimates. When asked for an estimate, they provide a heavily padded one measured in weeks.

Then they proceed to implement the solution in 30 seconds.

youknownothing an hour ago

That's because LLMs don't actually think; they pattern-match. Since all existing estimates in the training data were made assuming a human would perform the task, the estimate an LLM provides carries the same inherent assumption. The LLM has no corpus of LLM-led estimates to draw on, so it cannot take its own speed into account.