michaelfm1211 | 3 days ago
> The data also reveals a misalignment in resource allocation. More than half of generative AI budgets are devoted to sales and marketing tools, yet MIT found the biggest ROI in back-office automation—eliminating business process outsourcing, cutting external agency costs, and streamlining operations.

Makes sense. The people in charge of setting AI initiatives and policies are office people and managers who could easily be replaced by AI, but the people in charge are not going to let themselves be replaced. Salesmen and engineers are the hardest to replace, yet they aren't in charge, so they get replaced the fastest.
zoeysmithe | 3 days ago
I think this is being overly complimentary to AI. The more obvious reason is that for almost all business use cases it's not very helpful. All these initiatives have the same problem: staff asking "how can this actually help me?", because they can't get it to help them with anything other than polishing emails, polishing code, and writing summaries, which is not what most people's jobs are. Then you have to proofread all of this, because AI makes a lot of mistakes and poor assumptions, on top of hallucinations.

I don't think Joe and Jane Worker are purposely avoiding it to protect their jobs; everyone wants ease at work. It's just that these LLM-based AIs don't offer much outside of some use cases. AI is vastly over-hyped, and now we're in the part of the hype cycle where people are more comfortable saying to power, "This thing you love and think will raise your stock price is actually pretty terrible for almost all the things you said it would help with." AI has its place, but it's not some kind of universal mind that will change everything and be applicable in significant and fundamental ways outside of some narrow use cases.

I'm on week 3 of making a video game (something I've never done before) with Claude/Chat, and once I got past the "tutorial level" design, these tools really struggle. Even where an LLM should naturally succeed (structured, logical languages), it's still very underwhelming. I think we're just seeing people push back on hype and feel empowered to say, "This weird text autogenerator isn't helping me."
| ||||||||||||||
thisisit | 3 days ago
There is a reason why sales and marketing come first, and it has to do with hallucination. People have figured out that even if you mess up sales/support/marketing, worst case you apologize and give out a gift coupon. And then there is the verbose nature of LLMs, which makes them better suited to writing marketing copy and the like.

The business process outsourcing claim is unclear to me, though, since a lot of companies are already using LLMs for customer support. Other BPO processes are accounting and finance, IT, human resources, etc. While companies can take the hallucination risk with customers, internally they see it as a serious risk. If, for example, accounting and finance operations get messed up due to AI hallucination, companies will be in real hot water. The same goes for other back-office functions like HR, compliance, etc. So most likely this statement is just hogwash.
YetAnotherNick | 3 days ago
> MIT found the biggest ROI in back-office automation

Can't find any source for this, even after searching on Google. As someone who knows a bit about this area, I don't find it very believable. Compared to humans, AI struggles in places where a fixed structure and process are required.