superfrank 6 hours ago

My company did something similar (dashboard to track tokens). It was made available to managers about two weeks before it was available to everyone, so I got to see all my reports' usage before they knew they were being tracked.

The dashboard got announced publicly and just about everyone's usage went up by 100%-200% almost immediately and hasn't come back down, but nothing I'm tracking shows any increase in output since then. We absolutely saw productivity gains a few months ago, but it feels like now people are just burning tokens for the sake of it.

On top of that, as a reaction to the rising costs, we've gone from unlimited token use to every engineer having a monthly token budget of $600. I get why that was done, but we're a publicly traded US tech company worth tens of billions of dollars. We're not hurting for money, and the knock-on effects are just crazy. For example, I had an engineer in sprint planning say about a large migration-type ticket, "Can we hold that ticket until the end of the month? I don't want to burn through all my tokens this early in the month." I just cannot imagine that that's the culture our executive team was trying to cultivate when they first purchased these tools.

I'm not anti-AI and actually really enjoy using AI for development, but over and over I've watched business leaders shoot themselves in the foot trying to force more AI use on their employees in pursuit of ever increasing productivity. I just keep thinking that there's no way that any productivity gains we've seen from the forced, tracked AI usage are enough to offset the productivity lost from anxiety and churn caused by the unrealistic productivity expectations, vanity metrics, and mass layoffs that have come along with increased AI adoption.

swingboy 3 hours ago | parent | next

How were you measuring productivity gains (prior to the dashboard)?

superfrank 2 hours ago | parent

Without going into too much detail, my company is really, really big on estimates and predictable delivery timelines. An entire year's worth of work is specced out, estimated, and scheduled by the end of October the previous year. It's a really terrible process, IMO, but it's the process, so it is what it is.

Normally, most teams (mine included) are about 10% behind their plan by the end of Q1. This year my team is closer to 10% ahead, despite being down one engineer due to a small re-org at the end of last year. These projects were planned and estimated before AI was in heavy use, back when, at best, most AI-focused devs were still using it like smart auto-complete. We're now consistently beating those pre-AI estimates by a good amount, which is not how previous years have gone.

The AI metrics dashboards didn't roll out until mid-March and, while I'm still seeing us beat our estimates from last year, we're not seeing any additional gains. Basically, all of Q1 we had AI and no dashboards and were beating 2025 estimates by X%. For Q2 we had dashboards and extra pressure to use AI, and we're still seeing those X% gains, but no additional gains despite higher token usage.

We also have KPIs around completing a certain number of a certain type of Jira ticket that customers can file, and we've seen a similar pattern: a sharp increase in tickets completed in Dec-Feb, then the new rate holds, with no additional increase after the company started pushing AI usage.

akomtu 5 hours ago | parent | prev

Those executives are simply implementing the directive to inject as much AI as possible into every gear of the economy. Their bonuses depend on this. The idea is that if the world economy becomes dependent on this AI monstrosity, we won't be able to get rid of it. It will be like a situation with a nasty parasite that does a lot of harm, but cannot be removed without the host dying.