| ▲ | hereme888 3 days ago |
| AI is already so much better than 99% of customer support employees. It also improves brand reputation by actually paying attention to what customers are saying and responding in a timely manner, with expert-level knowledge, unlike typical customer service reps. I've used LLMs to help me fix Windows issues using pretty advanced methods, where MS employees would have just told me to either re-install Windows or send them the laptop and pay $hundreds. |
|
| ▲ | kldg 2 days ago | parent | next [-] |
| As someone who was recently screwed over by an LLM CSR, I'd respectfully disagree. Amazon replaced their offshore humans with LLMs recently. They put the "subscribe to Prime" button on the right-hand side of the screen when you go to checkout. It's a one-click subscription. I accidentally clicked it a few days ago. I immediately hop on customer service chat to ask for a refund. I'm surprised to be talking to an LLM rather than a human, but go ahead and explain what happened and state that I want the transaction for the subscription canceled. It offers to cancel the subscription at the end of the 30-day period. I decline, noting I want a refund for a subscription I didn't intend to take. It repeats that it can cancel the subscription at the end of the 30-day period. I ask for a human. It repeats. I ask for a human again. It repeats. I disconnect. Amazon knows what it's doing. |
| |
| ▲ | Nextgrid 2 days ago | parent [-] | | This occurrence has nothing to do with AI? The AI doesn't grant you the refund because it hasn't been given the ability to do so. It would be no different with a human. If Amazon wanted to give you the ability to get a refund for unused Prime benefits, it would allow the AI to do it, or even give you a button to do it yourself. | | |
| ▲ | kldg 2 days ago | parent [-] | | I can't really argue this except to say trust me, bro: I've been an Amazon customer for over 20 years, and there has never been an issue, no matter how unusual, that a CSR wasn't able to resolve within about five minutes. The LLM was completely on rails, with only specific whitelisted actions available to it. Even if a human couldn't take the specific action, they could explain why instead of repeating an irrelevant part of their script word for word. The generous interpretation would be that they don't trust the LLM, so they cripple what it can do. I actually think they're intentionally crippling the LLMs' access to accounts, though, to reduce their spend not on CSRs but on CSR actions such as refunds, where the LLM becomes an excuse for the change; they can hide behind what they'll call technical issues or teething pains. | | |
| ▲ | AbstractH24 a day ago | parent [-] | | I know exactly what you mean, but it's hard to tell if it's something they're OK with because they're slowly becoming less user-centric and less willing to make refunds/exceptions, or if it's the rigidity of AI. If it were just the latter, they would have helped you when you escalated it. So I think they're just becoming more greedy. |
|
|
|
|
| ▲ | nurumaik 3 days ago | parent | prev | next [-] |
| I don't want AI customer support. I want open documentation, so I can ask an AI if I want to, or ask human support if the issue isn't resolvable with the available documentation. All my interactions with AI support so far have consisted of repeatedly saying "call human" until it calls a human. |
| |
| ▲ | aydyn 2 days ago | parent | next [-] | | This is such a HN comment lol. Customer support is for when all the documentation has already failed and you need a human. | | |
| ▲ | aldonius 2 days ago | parent [-] | | I'll put in a good word for a chatbot hooked up to the documentation (e.g. at $dayjob we use Intercom Fin) acting as level 0.5 support. Our customers are nontechnical and don't always know what to search for, so the LLM/RAG approach can be quite handy. It answers about 2/3 of incoming questions, can escalate to the humans as needed, and scales great. |
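For the curious, a toy sketch of that "level 0.5" pattern: answer from the docs when retrieval confidence is high, hand off to a human otherwise. The string-matching retriever and canned answers below are stand-ins of my own, not how Intercom Fin actually works.

    # Toy "level 0.5" support bot: answer from the docs when retrieval
    # confidence is high, escalate to a human otherwise. difflib string
    # matching stands in for a real search/RAG backend.
    from difflib import SequenceMatcher

    DOCS = {
        "reset password": "To reset your password, use the link on the sign-in page.",
        "export data": "You can export your data from Settings > Data > Export.",
    }

    def retrieve(query):
        """Return (best answer, similarity score in [0, 1]) for a query."""
        scored = [(SequenceMatcher(None, query.lower(), topic).ratio(), answer)
                  for topic, answer in DOCS.items()]
        score, answer = max(scored)
        return answer, score

    def handle(query, threshold=0.5):
        answer, score = retrieve(query)
        if score >= threshold:
            return "[bot] " + answer
        return "[escalated] Connecting you with a human agent."

    print(handle("how do i reset my password"))           # handled by the bot
    print(handle("my March invoice was double charged"))  # goes to a human

The useful part is the threshold: the bot only answers when it's reasonably sure the docs cover the question, and everything else goes straight to a person.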
| |
| ▲ | inquirerGeneral 3 days ago | parent | prev [-] | | [dead] |
|
|
| ▲ | onlyrealcuzzo 2 days ago | parent | prev | next [-] |
| > AI is already so much better than 99% of customer support employees. 99% seems like a pulled-out-of-your-butt number and hyperbolic, but yes, there's clearly a non-trivial percentage of customer support that's absolutely terrible. Please keep in mind, though, that a lot of customer support by monopolies is intended to be terrible, and AI seems like a dream for some of these companies to offer even worse customer service. Where customer support is actually important or the market is competitive, you tend to get relatively decent support - for example, my bank's support is far from perfect, but it's leaps and bounds better than AT&T or Comcast. |
| |
| ▲ | ponector 2 days ago | parent [-] | | >> 99% seems like a pulled-out-of-your-butt number I don't agree. AI support is as useless as real customer support, but it's more polite and calm, with a clearer voice, etc. Much better, isn't it? | | |
| ▲ | Jensson 2 days ago | parent [-] | | AI support never solves your issue; it's just there to try to make you go away. Human support sometimes does: even if they mostly just try to make you go away too, they can help now and then. |
|
|
|
| ▲ | bilsbie 3 days ago | parent | prev | next [-] |
| This is great, but most customer support is actually designed as a “speed bump” for customers. Cancel account - have them call someone. Withdraw too much - make it a phone call. Change their last name? - that would overwhelm our software, let’s have our operator do that after they call in. Etc. |
| |
| ▲ | gruez 2 days ago | parent [-] | | >Change their last name? - that would overwhelm our software, let’s have our operator do that after they call in. That doesn't make much sense. Either your system can handle it or it can't. Putting a support agent in front isn't going to change that. | | |
| ▲ | Gazoche 2 days ago | parent | next [-] | | Having been a customer of a half-dozen different banks in almost as many countries, I can assure you that this is very common. You'd be surprised how often the user interface stonewalls you with a "please call support" for even the most basic contact-details update. | |
| ▲ | ipython 2 days ago | parent | prev | next [-] | | The backend can. But what’s exposed to customers will be a very, very small subset of that capability. Hence only the CSRs can perform that function. The business undoubtedly did a crude cost/benefit analysis where the cost to expose and maintain that public interface vastly outstrips the cost for the few people that have to call in and change their name. | |
| ▲ | com2kid 2 days ago | parent | next [-] | | > The business undoubtedly did a crude cost/benefit analysis where the cost to expose and maintain that public interface vastly outstrips the cost for the few people that have to call in and change their name. Haha, not likely. In reality the org is so drowned in technical debt that changing the last name involves manually running three different scripts that hit three different DBs directly, and the estimate from the 3rd-party dev consultancy that maintains the mess for how long it'd take to build a safe, publicly usable endpoint is somewhere between two years and forever. | | | |
| ▲ | lurking_swe 2 days ago | parent | prev [-] | | OK, so why not have a customer support bot add the “operation description” to a message queue (SQS, Kafka, whatever) if a formal API doesn’t exist for that operation? The CSRs can then handle that task asynchronously, and the customer can get an SMS/email when their request is fulfilled. Why force things to be synchronous and irritate the customer? It’s not exactly a difficult design problem, unless I’m missing something. |
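A minimal sketch of what that enqueue step could look like, assuming SQS via boto3; the queue URL, message shape, and operation names are hypothetical, not anything Amazon actually exposes.

    import json
    import uuid
    import boto3

    # Hypothetical queue where the bot drops structured operation requests
    # for human CSRs to work through asynchronously.
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/csr-operations"

    sqs = boto3.client("sqs")

    def enqueue_csr_operation(customer_id, operation, details):
        """Queue an operation the bot can't perform itself; returns a ticket id."""
        ticket_id = str(uuid.uuid4())
        sqs.send_message(
            QueueUrl=QUEUE_URL,
            MessageBody=json.dumps({
                "ticket_id": ticket_id,
                "customer_id": customer_id,
                "operation": operation,   # e.g. "change_last_name"
                "details": details,       # e.g. {"new_last_name": "Smith"}
            }),
        )
        # A separate worker drains the queue and notifies the customer
        # (SMS/email) once a CSR has completed the request.
        return ticket_id

One nice property of this shape: the bot only needs write access to the queue, so it never touches the backend directly, and a human still performs the sensitive action.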
| |
| ▲ | sedawkgrep 2 days ago | parent | prev [-] | | I think you missed the point of the parent. All these things are speed bumps and the "reasons" for having them are mostly incidental, as the main reason is to avoid the expense of having any more customer support personnel / infrastructure than is absolutely necessary to function. |
|
|
|
| ▲ | pluc 3 days ago | parent | prev | next [-] |
| Except AI support agents are only using content that is already available in support knowledge bases, making the entire exercise futile and redundant. But sure, they're eloquent while wasting your time. |
| |
| ▲ | somenameforme 2 days ago | parent | next [-] | | "Only" kind of misses the benefit though. I'm very bearish on "AI", but this is an absolutely perfect use case for LLMs. The issue is that if you describe a problem in natural language on any search engine, your results are going to be garbage unless you randomly luckbox into somebody having asked, with near-identical verbiage, the question on some Q&A site. That is because search is still mostly stuck in ~2003. But now ask the exact same thing of an LLM and it will generally be able to provide useful links. There's just so much information out there, but search engines just suck because they lack any sort of meaningful natural language parsing. LLMs provide that. | |
| ▲ | another-dave 2 days ago | parent | next [-] | | Speaking of which, could we apply vector embeddings to search engines (where crawled pages get indexed by their vector embeddings rather than raw text) and use that for better fuzzy search results even without an LLM in the mix? (Might be a naïve question, I'm at the edge of my understanding) | | |
| ▲ | com2kid 2 days ago | parent | next [-] | | > Speaking of which, could we apply vector embeddings to search engines (where crawled pages get indexed by their vector embeddings rather than raw text) and use that for better fuzzy search results even without an LLM in the mix? Yes, this is how all the new dev documentation sites work nowadays, with their much improved searches. :-D | |
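Roughly, the idea is: embed every document once at index time, embed the query at search time, and rank by vector similarity - no LLM generation involved. A minimal sketch, assuming the sentence-transformers library; the model choice and corpus are just placeholders.

    import numpy as np
    from sentence_transformers import SentenceTransformer

    # Placeholder corpus; in a real engine these would be crawled pages.
    docs = [
        "Book your summer vacation packages and flights here.",
        "How to file an expense report for business travel.",
        "Troubleshooting guide for resetting your router.",
    ]

    model = SentenceTransformer("all-MiniLM-L6-v2")

    # Index step: embed every document once and store the vectors.
    doc_vecs = model.encode(docs, normalize_embeddings=True)

    def search(query, k=2):
        """Return the k documents closest to the query in embedding space."""
        q = model.encode([query], normalize_embeddings=True)[0]
        scores = doc_vecs @ q          # cosine similarity (vectors are normalized)
        top = np.argsort(-scores)[:k]
        return [(docs[i], float(scores[i])) for i in top]

    # "holiday" never appears in the corpus, but the vacation page still ranks first.
    print(search("cheap holiday deals"))

At scale you'd swap the brute-force dot product for an approximate nearest-neighbour index, but the ranking idea is the same.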
| ▲ | another-dave 2 days ago | parent [-] | | ah cool right! I didn't know that. One for me to check out and understand more. Thanks! |
| |
| ▲ | esafak 2 days ago | parent | prev [-] | | Why stop there? The LLM can synthesize the results and spare you the work. | | |
| ▲ | another-dave 2 days ago | parent [-] | | I'm talking about the scenario the GP referenced — where you search for, say, "holiday" but get no results because the pages only use the word "vacation", which AFAIK is still a problem in regular search. LLMs inherently would introduce the possibility of hallucinations, but just using the vectors to match documents wouldn't, right? | |
| ▲ | esafak 2 days ago | parent [-] | | No, LLMs still use similarity search for candidate generation, unless you don't give them any tools. |
|
|
| |
| ▲ | pluc 2 days ago | parent | prev [-] | | "Instead of making search smarter, we just decided to make everyone stupider." Why invest in making users more savvy when you can dumb everything down to a 5-year-old's level, eh? | |
| |
| ▲ | mava_app 3 days ago | parent | prev | next [-] | | There are AI agents that train from knowledge bases but also keep improving from actual conversations. For example, our Mava bot actually learns from mods directly within Discord servers. So it's not about replacing human mods but about assisting them so they can take better care of users in the end. | |
| ▲ | pluc 2 days ago | parent [-] | | I don't see how this is any different from enriching knowledge bases with feedback and experience. You just end up duplicating all the information, locking yourself into your AI vendor, and investing in a technology that doesn't add anything to what you had before. It's utterly nonsensical. | |
| ▲ | hereme888 2 days ago | parent [-] | | You're going to browse through a manual every time you need to fix something in some app in some particular OS version? I want: > Respond with terminal command to do X >> `complex terminal command code block` > oh we need to run that on all such and such files >> script.py | | |
| ▲ | pluc 2 days ago | parent [-] | | > You're going to browse through a manual every time you need to fix something in some app in some particular OS version? Yes, that's literally how you learn things. I can't understand how anyone on this forum thinks otherwise. Hackers are supposed to be people who thrive in unknown contexts, who thirst for knowledge of how things work. What you are suggesting is brain atrophy. It's the death of knowledge for profit and productivity. Fuck all of that. | | |
| ▲ | hereme888 2 days ago | parent [-] | | My cognitive energies are reserved for other things. This is the point of AI: do boring tasks so humans can spend their energy on loftier/more important things. | | |
| ▲ | bluefirebrand 2 days ago | parent [-] | | If you can't be bothered to do the boring stuff ever, you won't ever develop the skills that you need to make anything loftier. They might be boring, but they are nonetheless foundational. | |
| ▲ | hereme888 a day ago | parent [-] | | But the point is that the boring things I refer to are not foundational or even related to my field.
I have zero formal education in computers. LLMs are my personal IT experts, for pennies on the dollar.
I'm not interested in formally learning programming, yet I vibe-code the apps I want. | |
| ▲ | bluefirebrand 14 hours ago | parent [-] | | > I'm not interested in formally learning programming, yet I vibe-code the apps I want. In that case I'm really not sure why professional software developers should be interested in your opinion on the technology at all. | |
| ▲ | hereme888 5 hours ago | parent [-] | | I don't understand how your comment is related to anything I've said. "AI is great for customer service" is a statement that can be made by anyone who has experienced both LLMs and human reps. |
|
|
|
|
|
|
|
| |
| ▲ | scarface_74 3 days ago | parent | prev | next [-] | | It’s even worse than you think. I work with Amazon Connect. Now the human agent doesn’t have to search the Knowledge Base manually; likely answers are automatically shown to the agent based on the conversation you are having. They are just regurgitating what you can find for yourself. But I can’t imagine ever calling tech support unless it's more than troubleshooting and I need them to actually do something in their system, or it’s a hardware problem where I need a replacement. | |
| ▲ | d1sxeyes 2 days ago | parent | prev | next [-] | | I would agree, but I’ve spent the last ten years or so working with outsourced tech support and I guarantee you, a lot of people call us just because they can’t be bothered to look for themselves. | | |
| ▲ | GuinansEyebrows 2 days ago | parent | next [-] | | > a lot of people call us just because they can’t be bothered to look for themselves If the service offered is "support", then why is a phone call less acceptable than reading documentation? | |
| ▲ | d1sxeyes 2 days ago | parent [-] | | Exactly. It’s not futile or redundant, it’s something people want. |
| |
| ▲ | pluc 2 days ago | parent | prev [-] | | Getting instant answers without having to expend any effort isn't going to make the problem go away; it's going to make us dependent on the solution. | |
| |
| ▲ | klodolph 3 days ago | parent | prev | next [-] | | Most of my questions are answerable from the support knowledge base. | | |
| ▲ | SpaceManNabs 3 days ago | parent [-] | | If I am calling support, it is probably because I already scoured the resources. Over the past 3 years of calling support for any service or infrastructure (bank, health insurance, doctor, whatever), over like 90% of my requests were things only solvable via customer support or escalation. I only know this because I document the cases where I didn't need support in a list of "phone hacks" (like "press this sequence of buttons when calling this provider"). Most recently, I went to an urgent care facility a few weekends ago, and they keep submitting claims to the arm of my insurance that is based in a different state instead of my proper state. |
| |
| ▲ | hereme888 2 days ago | parent | prev [-] | | AI has all sorts of technical knowledge, plus massive working memory and something like a high IQ. It's vastly, vastly superior to most IT support agents. | |
| ▲ | kbelder 2 days ago | parent [-] | | True, partially, but it's also vastly inferior to most IT support agents at the same time. I love a good AI to help search through large documentation bases for the particular issue I'm experiencing. But it is clear when the problem I am having is outside of the AI's sometimes infantile ability to understand, and I need the ability to bypass it. |
|
|
|
| ▲ | cafebeen 2 days ago | parent | prev | next [-] |
| When asking customers how well they were helped by the customer support system (via CSAT score), I've found industry-standard AI support agents will generally perform worse than a well-trained human support team. AI agents are fine at handling some simple queries, e.g. basic product and order information, but support issues are often biased towards high complexity, because otherwise customers could have solved them in a more automated way. I'm sure it depends on the industry, and whether the customer's issue is truly novel. |
| |
| ▲ | aydyn 2 days ago | parent [-] | | I think the main problem is access, not quality. I.e., the AI isn't allowed to offer me a refund when my order never arrived. For that, I have to spend 20 minutes on the phone with Mike from India. | |
|
|
| ▲ | pesus 2 days ago | parent | prev | next [-] |
| Improves brand reputation? I don't think I've seen a single case where someone is glad to talk to an LLM/chat bot instead of an actual person. Personally, I think less of any company that uses them. I've never seen one be actually useful, and they seem to only really regurgitate links to FAQ pages or give the most generic answers possible while I fight to get a customer service number so I can actually solve the problem at hand. |
| |
| ▲ | hereme888 2 days ago | parent [-] | | I use SOTA LLM chatbots to solve issues that would take a long time via human customer service reps. In fact, LLMs solve things more quickly than it takes to even get a human on the phone, in a chat, or to respond on a forum. |
|
|
| ▲ | nkingsy 3 days ago | parent | prev | next [-] |
| It isn’t empowered to do anything you can’t already do in the UI, so it is useless to me. Perhaps there is a group that isn’t served by legacy UI discovery methods and it’s great for them, but 100% of the chatbots I’ve interacted with have damaged brand reputation for me. |
| |
| ▲ | another-dave 2 days ago | parent [-] | | A chatbot for those sorts of easily answerable queries is great in most scenarios, though, to "keep the phone lines clear". The trouble is when they gatekeep you from saying "I know what I'm doing, let me talk to someone". | |
|
|
| ▲ | GuinansEyebrows 2 days ago | parent | prev | next [-] |
| > AI is already so much better than 99% of customer support employees. i have yet to experience this. unfortunately i fear it's the best i can hope for, and i worry for those in support positions. |
|
| ▲ | IAmGraydon 2 days ago | parent | prev [-] |
| MS customer service is perhaps the lowest bar available. One look at their tech support forums tells you that most of what they post is canned garbage that is no help to anyone. AI is not better than a good customer service team, or even an above-average one. It is better than a broken customer service team, however. As others have noted, 99% is hyperbolic BS. |