bronco21016 3 days ago

I use AI coding almost daily. I'm able to move my repositories into context easily through the multitude of AI coding tools, and I see a massive boost in productivity. I say this as a junior dev. Often the outputs are "almost there", and I make the necessary fixes to get them the rest of the way.

To contrast with this, my org tried using a simple QA bot for internal docs and has struggled to move anything beyond proof of concept. The proofs of concept have been awful: the bot answers maybe 60-70% of questions correctly. The major issue seems to be ingesting PDFs laced with images and poorly written explanations. To get decent performance from these RAG bots, a large FAQ has to be written for every question they get wrong. Of course this is just my org, so it can't necessarily be extrapolated across the industry. But how often have people joined a new team and found there is little to no documentation, poorly written documentation, or outdated documentation?
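Concretely, the failure mode is something like this naive ingestion path (a rough sketch assuming a pypdf-style text extraction step; the file name and FAQ entry are made up, not our actual setup):

    # Minimal sketch of a naive RAG ingestion step for image-heavy internal PDFs.
    # "internal_doc.pdf" and the FAQ patch below are hypothetical illustrations.
    from pypdf import PdfReader

    def pdf_to_chunks(path: str, chunk_size: int = 800) -> list[str]:
        """Extract plain text page by page; diagrams and screenshots are silently dropped."""
        reader = PdfReader(path)
        text = "\n".join(page.extract_text() or "" for page in reader.pages)
        # Naive fixed-size chunking: anything explained only in an image never reaches the index.
        return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]

    # Hand-written FAQ entries bolted on afterwards to cover questions the bot got wrong.
    faq_patch = [
        "Q: How do I request VPN access? A: File a ticket with the networking team.",
    ]

    chunks = pdf_to_chunks("internal_doc.pdf") + faq_patch
    # `chunks` would then be embedded and stored in whatever vector index the bot uses.

Everything carried by a diagram or screenshot vanishes before retrieval even starts, which is why the FAQ keeps growing.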

Where am I going with these two thoughts? Maybe the blocker to pushing more adoption within orgs is twofold: getting the correct context into the model, and having decent context to start with.

Extracting value from these things is going to require a heavy lift in data curation and in developing the harnesses. So far most of that effort has gone into coding. In my opinion it will take time for the technical and nontechnical sides of an org to work together and bring the rest of the business into these tools.
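Even the harness side is nontrivial: at minimum you need a curated golden set that the bot is scored against on every change, and someone has to write and maintain it. A rough sketch of that idea (the ask_bot callable and the questions are placeholders, not anything we actually run):

    # Tiny evaluation harness: score a QA bot against a curated golden set.
    # `ask_bot` and the golden questions below are hypothetical placeholders.
    from typing import Callable

    golden_set = [
        {"question": "What is the expense approval limit?", "must_mention": ["$500", "manager"]},
        {"question": "Where are the deployment runbooks stored?", "must_mention": ["runbooks space"]},
    ]

    def evaluate(ask_bot: Callable[[str], str]) -> float:
        """Return the fraction of golden questions whose answer mentions every required phrase."""
        hits = 0
        for case in golden_set:
            answer = ask_bot(case["question"])
            if all(phrase.lower() in answer.lower() for phrase in case["must_mention"]):
                hits += 1
        return hits / len(golden_set)

    # A stubbed bot scores 0.0; the kind of bot I described would land around 0.6-0.7.
    print(evaluate(lambda q: "I'm not sure, please check the wiki."))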

The big bet of course then is ROI and time to adoption vs current burn rates of the model providers.

latentsea 3 days ago

Yep. There are a lot of things that apply equally to human engineering teams in terms of productivity. Poor documentation and information architecture are things I have seen time and time again, and something I always put time into course correcting, because doing so makes performing cognitive work much easier. Same goes for poorly factored codebases: they make doing any work feel like wading through mud. Throughout my career I have done a lot of what I would call platform engineering and product re-engineering, and it's always to course correct for how difficult an environment has become to work in.

Agents are going to struggle with those same difficulties, just as humans do. You need to put work into making an environment productive to work in. After purposely switching my development workflow for the stuff I do outside of work to being "AI first on mobile", which is such a bandwidth-constrained setup, I'm finding it really helps me spot all the things to optimise for to increase the batting average and minimise the back and forth.

throw1235435 2 days ago

I'm wondering if the ROI will be worth it anytime soon for anything other than coding, and anything that can be publically scraped of the internet. Or more to the point things that are at a enterprise level and require paid staff to train the model for a particular domain at an expert level of quality - and the ROI of such a task to be positive.

The thing is none of this is really happening under typical economic assumptions like ROI, rate of return, net PV, etc.

You see - on a pure ROI basis, none of this should have existed. Even for coding, I think a lot of this is fuelled by investor money, and even if developers took up the tooling I'm not sure it would pay off the capital investment. DeepMind wouldn't have been funded by Google, and transformers would never have been invented, if it were all based on expected ROI. Most companies can't afford engineers/AI researchers on the side "just in case" it pays off, especially back when AI was a pie-in-the-sky kind of thing. The only reason any of this works is that Big Tech has more money than it can invest, and the US system punishes dividends, meaning companies can justify "expected bad" investments as long as they can be dressed up and some pay off. They almost operate like internal VCs/funds because they have the money to do so.

This allows "arms race" and "loss leading" dynamics to take hold and be funded - which isn't really about economics anymore. Most other industries/domains don't have the war chest, or investors with very, very deep pockets, to make that a reality.

Sadly, I think we as SWEs assume other professions will be disrupted too; what if instead we have just disrupted our own profession and a few other smaller targets?