CSSer 13 hours ago

LLMs make it trivial to mulch data into a digestible format, something that used to take real effort, and friction is a natural deterrent for bad behavior. Beyond that, your users' interactions with most applications used to be quite coarse. A "customer story" was just that: a story we crafted from the data we had available about our customers. We had to build it from heuristics like bounce rate, scroll depth, and other painstaking, idiosyncratic abandonment metrics.
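
For a sense of how indirect that is, here is a rough sketch of deriving those heuristics from a hypothetical page-event log. The schema, field names, and thresholds are made up for illustration, not any real analytics product:

    from dataclasses import dataclass

    @dataclass
    class PageEvent:
        session_id: str
        page: str
        seconds_on_page: float
        max_scroll_pct: float  # how far down the page the user scrolled
        converted: bool

    def summarize(events: list[PageEvent]) -> dict:
        """Reconstruct a coarse 'customer story' from indirect signals."""
        sessions: dict[str, list[PageEvent]] = {}
        for e in events:
            sessions.setdefault(e.session_id, []).append(e)

        # Bounce rate: sessions that never went past a single page.
        single_page = sum(1 for evts in sessions.values() if len(evts) == 1)
        bounce_rate = single_page / len(sessions) if sessions else 0.0

        # Average scroll depth across all page views.
        avg_scroll = sum(e.max_scroll_pct for e in events) / len(events) if events else 0.0

        # Abandonment: sessions with no conversion event at all.
        abandoned = sum(1 for evts in sessions.values()
                        if not any(e.converted for e in evts))

        return {
            "bounce_rate": bounce_rate,
            "avg_scroll_pct": avg_scroll,
            "abandoned_sessions": abandoned,
        }

Everything above is inference: you never see what the customer actually wanted, only the trail they left behind.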

Now why bother with any of that? Your customers will ask their crystal ball (the LLM) anything and everything, and you can do bulk analysis directly on (in theory) the entire interaction, including every emotion your customer expresses in text.
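
For contrast, here is a toy sketch of what bulk analysis over chat transcripts could look like. The transcript shape and the keyword-based sentiment scoring are placeholders, not any particular vendor's API:

    import re
    from collections import Counter

    NEGATIVE = {"frustrated", "annoying", "cancel", "refund", "broken", "angry"}
    POSITIVE = {"great", "love", "thanks", "perfect", "helpful"}

    def score_turn(text: str) -> int:
        """Crude sentiment: positive minus negative keyword hits."""
        words = set(re.findall(r"[a-z]+", text.lower()))
        return len(words & POSITIVE) - len(words & NEGATIVE)

    def analyze(transcripts: list[list[str]]) -> Counter:
        """Tally every user turn across every conversation by sentiment."""
        tally = Counter()
        for convo in transcripts:
            for turn in convo:
                s = score_turn(turn)
                tally["negative" if s < 0 else "positive" if s > 0 else "neutral"] += 1
        return tally

    print(analyze([["This is broken and I am frustrated", "thanks, that helped"]]))

No heuristics, no inference: the raw intent and frustration are sitting right there in the text, ready to be mined at scale.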

Lastly, your customers are now eager about this tool, so they're excited to integrate/connect everything to it. In a rush to satisfy customers, many companies have lazily built LLM integrations that could even undermine their business model. This pushes yet more data into the LLM. This isn't just telemetry like file names, this is full read access to all of your files. How is that not connected to privacy?