TeMPOraL 4 hours ago

Reliable information on this does not exist on vendor sites, though. It exists on Reddit, in books, in med/physio papers, and in a bunch of other places a SOTA model has read in training or can (for now) access via web search.

LLMs are already very good for shopping, but only as long as they sit on the outside.

meroes an hour ago | parent | next [-]

Idk. I earnestly tried using LLMs three months ago to find the smallest regular ATX PC case by volume, and it was a nightmare. That info is out there, but the model could not avoid mentioning ITX and Micro ATX cases (sometimes because Reddit posters mislabeled them), and it just missed a bunch of cases outright. And since it let mistakes through, I had to double check every volume calculation it did.

I found the Jonsbo D41 without the help of an LLM, despite trying. (There might be a few smaller ones, but they are 3x the price.)

LLMs don’t weigh and survey the options well. They find some texts (from Reddit, in this case) that mention a subset of cases, and that text heavily shapes the answer. Which is not what you want a commerce agent to do; you don’t want text prediction. I doubt it surfaces the obscure but optimal option in most cases.
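The contrast the commenter is drawing can be made concrete: given structured data, "smallest ATX case by volume" is a trivial deterministic query, not a text-prediction problem. A minimal sketch in Python, where the case names and dimensions are made-up placeholders (not real product specs):

```python
from dataclasses import dataclass

@dataclass
class Case:
    name: str
    form_factor: str   # motherboard support: "ATX", "Micro ATX", "Mini-ITX"
    dims_mm: tuple     # (width, height, depth) in millimetres

    @property
    def volume_l(self) -> float:
        # mm^3 -> litres
        w, h, d = self.dims_mm
        return w * h * d / 1_000_000

# Hypothetical catalogue entries for illustration only.
CASES = [
    Case("Example Tower A", "ATX", (210, 450, 430)),
    Case("Example Compact B", "ATX", (170, 360, 330)),
    Case("Example ITX C", "Mini-ITX", (140, 240, 330)),  # wrong form factor, must be excluded
]

def smallest_atx(cases):
    # Filter strictly by form factor, then rank by computed volume.
    atx_only = [c for c in cases if c.form_factor == "ATX"]
    return min(atx_only, key=lambda c: c.volume_l)

print(smallest_atx(CASES).name)  # prints "Example Compact B", every time
```

The point: with a real, complete catalogue this filter cannot "miss a bunch of cases" or drift into ITX results, which is exactly the failure mode described above.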

TZubiri 3 hours ago | parent | prev [-]

We are talking about a hypothetical sales chatbot which would be built alongside the business, so they absolutely have the capacity and information necessary to train the chatbot to advise their own clients.

TeMPOraL 2 hours ago | parent [-]

> they absolutely have the capacity and information necessary to train the chatbot to advise their own clients.

That doesn't follow. In fact, having this capacity and information creates a moral dilemma, as giving customers objectively correct advice is, especially in highly competitive markets, bad for business. Ignorance is bliss for businesses, because it lets them bullshit people through marketing with less guilt, and if there's one thing any business knows, it's that marketing has better ROI than product/service quality anyway.