maccard | 3 days ago
Can you link to a changelog that shows the 5-10x feature increases? I keep hearing this, but I don't see anything I use ever actually shipping like this, or people backing this up with any sort of proof.
adamtaylor_13 | 2 days ago
Our projects are closed source because our clients own the code, but I can offer an anecdote. We have a client whose business operates on 2-3 very niche SaaS applications in the veterinary/animal medicine space. In a span of about 6 months, we completely ripped out 2 of those 3 and are working on replacing the 3rd one right now. We've done this with a single senior engineer working with the client between 20 and 40 hours per week, with no major regressions. The business has been able to keep working as usual with no disruptions throughout this process. Obviously it's hard to measure this objectively, but I can't imagine having replaced those SaaS applications in that timeframe, with zero downtime, pre-AI.
toraway | 2 days ago
That reminds me of a chart I saw posted in HN comments recently, which someone created by tracking bullet points per day in the Claude Code release notes, and which was cited as "proof of a step change" in AI development over the last year. It showed an average of a dozen or so that jumped to over 50 one month and stayed around that number. (Not the exact same chart, but a similar idea; I guess it's sort of a meme: https://imgur.com/a/YrNGYOR) So I looked at the most recent CC release notes on GitHub, and the majority look like this:
I'd be extremely interested to know what percentage of these were just fixing last week's Claude Code-written PR that no human ever set eyes on. But hey, all that churn looks great on charts circulated on social media as free advertising for their flagship product (and consequently the company's valuation), so never mind, LGTM!