browningstreet 12 hours ago

Maybe author knows that too, but wants to talk about it nonetheless. First line of article: “Commits are a terrible metric for output, but they're the most visible signal I have.”

the_arun 10 hours ago | parent | next [-]

Using AI, we can make thousands of commits per day, so this metric becomes even more pointless in the age of AI. Increased sales, new subscription counts, reduced bug counts, fewer incidents, etc.: those could be real metrics. I'm sure I am preaching to the choir.

tecleandor 10 hours ago | parent [-]

I have coworkers committing tens or hundreds of thousands of "lines of code" a week, because they'll push whatever the AI gives them, including dependencies and virtualenvs, without any review.

Of course, at the same time we're getting dozens of alerts a week about services deployed open to the Internet without authentication and full of outdated, vulnerable libraries (LLMs will happily add two- or three-year-old dependencies to your lockfiles).
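A minimal sketch of the kind of check that catches this: scan a requirements.txt-style lockfile and flag pins whose release is older than a cutoff. The release-date table here is a hypothetical stand-in; in practice you would fetch dates from the package index (e.g. PyPI's JSON API).

```python
from datetime import date

# Hypothetical release dates for illustration only; a real check would
# look these up from the package index rather than hard-coding them.
RELEASE_DATES = {
    ("requests", "2.25.1"): date(2020, 12, 16),
    ("urllib3", "1.26.5"): date(2021, 5, 26),
}

def stale_pins(lockfile_lines, today, max_age_days=730):
    """Return (name, version) pins whose release is older than max_age_days."""
    stale = []
    for line in lockfile_lines:
        line = line.strip()
        # Skip blanks, comments, and anything that isn't an exact pin.
        if not line or line.startswith("#") or "==" not in line:
            continue
        name, version = line.split("==", 1)
        released = RELEASE_DATES.get((name.lower(), version))
        if released and (today - released).days > max_age_days:
            stale.append((name, version))
    return stale

lock = ["requests==2.25.1", "urllib3==1.26.5"]
print(stale_pins(lock, date(2024, 1, 1)))
# → [('requests', '2.25.1'), ('urllib3', '1.26.5')]
```

With a two-year cutoff, both pins above are flagged; the same scan in CI would at least surface what the LLM quietly added.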

duskdozer 5 hours ago | parent [-]

Set the AIs off on those alerts and look at how many more alerts per week are now getting solved due to AI!

berkes 4 hours ago | parent [-]

The good old Cobra effect?

https://en.wikipedia.org/wiki/Perverse_incentive?wprov=sfla1

skydhash 12 hours ago | parent | prev [-]

What about number of working features or system completeness? Current state vs desired state is fairly visible.

101011 12 hours ago | parent | next [-]

How do you define system completeness? What if you ship one really big feature vs. three really small ones?

I would posit that you need extra context to obtain meaning from those metrics, which inherently makes them less visible.

skydhash 11 hours ago | parent [-]

System completeness can be defined from the product definition; that is where requirements and definitions of done come from. Working features are the most important thing, and most principles and techniques are about reducing the cost of getting there.
