la_fayette 3 days ago
Yes, those sound like important and useful use cases. However, these have been solved by boring old-school ML models for years...
williamdclt 3 days ago
I think what they're saying is that you need the summaries to do these things
esafak 3 days ago
It's easier and simpler to use an LLM service than to maintain those ad hoc models. Many have replaced their old NLP pipelines with LLMs.
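A rough sketch of the trade-off being described, not anyone's actual setup: a classic supervised sentiment pipeline needs labeled data, retraining, and ongoing maintenance, while an LLM service reduces the same task to a prompt. The model name, prompt, and client setup below are illustrative assumptions.

    # Sketch only: contrasts a classic ML sentiment pipeline with an LLM call.

    # --- Old-school approach: needs labeled data, retraining, and monitoring ---
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    train_texts = ["great support, solved my issue", "agent was rude and unhelpful"]
    train_labels = ["positive", "negative"]

    clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
    clf.fit(train_texts, train_labels)  # re-run whenever the domain or labels drift
    print(clf.predict(["they hung up on me twice"]))

    # --- LLM approach: no training set, just a prompt to a hosted service ---
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name, purely illustrative
        messages=[{
            "role": "user",
            "content": "Label the sentiment of this call transcript snippet as "
                       "positive, negative, or neutral: 'they hung up on me twice'",
        }],
    )
    print(resp.choices[0].message.content)

The maintenance burden the commenter points at lives mostly in the first half: collecting and relabeling data, retraining, and monitoring drift, none of which the prompt-based version requires (though it trades that for per-call cost and a hosted dependency).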
| |||||||||||||||||
aaomidi 3 days ago
Sentiment analysis was not solved, and companies were paying analyst firms shit tons of money to do it for them manually.
doorhammer 3 days ago
So, I wouldn't be surprised if someone in charge of a QA/ops department chose LLMs over similarly effective existing ML models in part because the AI hype is hitting so hard right now. Two things _would_ surprise me, though:

- That they'd integrate it into any meaningful process without having done actual analysis of the LLM-based perf vs their existing tech

- That they'd integrate the LLM into a core process their department is judged on knowing it was substantially worse, when they could find a less impactful place to sneak it in

I'm not saying those are impossible realities. I've certainly known call center senior management to make more harebrained decisions than that, but barring more insight I personally default to assuming OP isn't among the harebrained.