zdragnar 2 days ago

There's a library I use with extensive documentation- every method, parameter, event, configuration option conceivable is documented.

Every so often I get lost in the docs trying to do something that actually isn't supported (the library has some glaring oversights) and I'll search on Google to see if anyone else came up with a similar problem and solution on a forum or something.

Instead of telling me "that isn't supported," the AI overview says "here's roughly how you would do it with libraries of this sort" and then provides a fictional code sample built from real method names in the documentation. The comments claim a method does one thing, but when you check the documentation to be sure, it actually does something different.

It's a total crapshoot on any given search whether I'll be saving time or losing it using the AI overview, and I'm cynically assuming that we are entering a new round of the Dark Ages.

XorNot 2 days ago | parent | next [-]

I have the Google AI overview adblocked and I keep it up to date because it's an unbelievably hostile thing to have in your information space: it sounds truthy, so even if you try to ignore it it's liable to bias the way you evaluate other answers going forward.

It's also obnoxious on mobile where it takes up the whole first result space.

gorbypark 2 days ago | parent | prev | next [-]

There's an attempt to kinda have these things documented for AIs, called llms.txt, which is generally hosted on the web.

In theory, an AI should be able to fetch the llms.txt for every library and have an actual authoritative source of documentation for the given library.

This doesn't work that great right now, because not everyone is on board, but if we had llms.txt actually embedded in software libraries...it could be a game changer.
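
A minimal sketch of that fetch step, assuming a library publishes its llms.txt at a known URL (the Expo file linked further down the thread is used as a real example):

    // Sketch only: fetch a library's published llms.txt and feed it to the model
    // as context. Uses Node 18+'s built-in fetch; the URL is Expo's real llms.txt,
    // linked later in this thread.
    const res = await fetch("https://docs.expo.dev/llms-full.txt");
    if (!res.ok) {
      throw new Error(`no llms.txt found (HTTP ${res.status})`);
    }
    const docs = await res.text();
    // ...append `docs` to the context sent to the model
    console.log(`fetched ${docs.length} characters of docs`);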

I noticed Claude Code semi-regularly will start parsing actual library code in node_modules when it gets stuck. It will start by inventing methods it thinks should exist, then the TypeScript check step fails, and it searches the web for docs; if that fails, it will actually go into the type definitions for the library in node_modules and start looking in there. If we had node_modules/<package_name>/llms.txt (or the equivalent for other package managers in other languages) as a standard, it could be pretty powerful, I think. It could also be handled at the registry level, but I kind of like the idea of it being shipped (and thus easily versioned) in the library itself.
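
A rough sketch of what that lookup could look like if the convention existed; node_modules/<package_name>/llms.txt is the hypothetical layout described above, not something packages actually ship today:

    // Hypothetical: look for an llms.txt bundled inside an installed package,
    // returning null so the caller can fall back to docs or a web search.
    import { readFile } from "node:fs/promises";
    import { join } from "node:path";

    async function readBundledLlmsTxt(
      pkgName: string,
      projectRoot = process.cwd(),
    ): Promise<string | null> {
      const candidate = join(projectRoot, "node_modules", pkgName, "llms.txt");
      try {
        return await readFile(candidate, "utf8");
      } catch {
        return null; // the package doesn't ship one (the case for everything today)
      }
    }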

AlecSchueler 2 days ago | parent [-]

> In theory, an AI should be able to fetch the llms.txt for every library and have an actual authoritative source of documentation for the given library.

But isn't the entire selling point of the LLM that you can communicate with it in natural language and it can learn your API by reading the human docs?

languid-photic 2 days ago | parent [-]

Yes, but I think part of the reason for llms.txt is to optimize context. E.g., beyond the content itself, the human docs often have styling markup which wastes tokens.
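
A toy illustration of that overhead, comparing the same made-up doc entry as rendered HTML versus the plain markdown an llms.txt would carry (character counts as a crude stand-in for tokens):

    // The same API entry, once with HTML styling markup and once as plain markdown.
    const htmlDoc =
      '<div class="api-entry"><h3 id="connect"><a class="anchor" href="#connect">connect()</a></h3>' +
      '<p class="description">Opens a <code>Connection</code> to the given <em>host</em>.</p></div>';

    const llmsTxtDoc = "### connect()\nOpens a Connection to the given host.";

    console.log(htmlDoc.length, llmsTxtDoc.length); // roughly 185 vs. 51 characters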

AlecSchueler 2 days ago | parent [-]

Hmm, sounds like llms.txt might be nicer for humans to read as well.

gorbypark a day ago | parent [-]

Sometimes they are! I use the Expo docs as a human all the time. Some projects, however, seem to really "minify" their docs, and those are less readable. I'm not quite sure how minifying really saves space, as it seems like they're just removing newlines and the docs are still in markdown...

Good for humans example: https://docs.expo.dev/llms-full.txt

Bad for humans example: https://www.unistyl.es/llms-small.txt

IshKebab 2 days ago | parent | prev | next [-]

I mean... Yeah I've had ChatGPT tell me you can't do things with Make that you totally can. They aren't perfect. What do you expect Google to do about it?

zdragnar 2 days ago | parent [-]

Not shipping fundamentally broken products would be step one for me. Sadly, there are a lot of people who are really excited about things that only occasionally work.

IshKebab 2 days ago | parent [-]

Lots of things only occasionally work but are still very useful. Google search for example.

Would you say, "Pah, why are you shipping a search engine that only sometimes finds what I'm looking for?"

zdragnar 2 days ago | parent [-]

Search engines don't claim to provide answers. They search for documents that match a query and return a list of the documents they found, roughly in order of relevance.

If there's nothing answering what I was looking for, I might try again with synonyms, or conclude the documents aren't indexed, or that they don't exist.

That's a very different failure mode than blatantly lying to me. When it lies to me, I don't blame myself, I blame the AI.

scarface_74 2 days ago | parent | prev [-]

Yes, I know hallucinations are a thing. But when I had problems like that, better prompting ("don't make assumptions") and telling it to verify all of its answers with web resources helped.

For troubleshooting an issue, my prompt is usually "I am trying to debug an issue. I'm going to give you the error message. Ask me questions one by one to help me troubleshoot. Prefer asking clarifying questions to making assumptions".

Once I started doing that, it’s gotten a lot better.
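
A sketch of the same pattern scripted against the chat API, in case that's useful (the model name and exact wording are placeholders; in the plain API the "verify with web resources" part is just an instruction, since there's no built-in browsing unless you wire up a search tool):

    // Sketch, assuming the official openai Node SDK (OPENAI_API_KEY in the env);
    // the model name is a placeholder.
    import OpenAI from "openai";

    const client = new OpenAI();

    async function troubleshoot(errorMessage: string) {
      const completion = await client.chat.completions.create({
        model: "gpt-4o",
        messages: [
          {
            role: "system",
            content:
              "Prefer asking clarifying questions to making assumptions. " +
              "Verify your answers against web resources before stating them as fact.",
          },
          {
            role: "user",
            content:
              "I am trying to debug an issue. Here is the error message:\n" +
              errorMessage +
              "\nAsk me questions one by one to help me troubleshoot.",
          },
        ],
      });
      return completion.choices[0].message.content;
    }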

simonklitj 2 days ago | parent [-]

How are you going to prompt the AI overview?

scarface_74 2 days ago | parent [-]

Why would I use Google for this use case?

“There's a library I use with extensive documentation- every method, parameter, event, configuration option conceivable is documented.”

This is the perfect use case for ChatGPT with web search. And aside from Google News, Google has been worthless for finding any useful information for years because of SEO.

n4r9 2 days ago | parent [-]

The fact that you personally would use a different tool is surely neither here nor there. It's like wading into a conversation about car problems and telling everyone that you ride a motorbike.

zdragnar 2 days ago | parent | next [-]

Alas, there does seem to be a strong tradition of that on HN. The car example is apropos, though instead it's more like "why do you own a car? I live in a hyper dense urban utopia and never drive anywhere!"

JustExAWS 2 days ago | parent | prev [-]

I also don’t use a hammer when a screwdriver is at hand and is the most appropriate tool.

It’s the classic XY problem.

n4r9 2 days ago | parent [-]

It's not an XY problem or anything to do with customer service. It's more of a UX problem: users are being presented with highly convenient AI summaries that have a relatively high rate of inaccuracy.

JustExAWS 2 days ago | parent [-]

It’s more like you are choosing one tool when, for the use case cited, there are much better tools available. Maybe the new interactive “AI mode” in Google would be a better fit. But searching the web for developer documentation, instead of going to the canonical source, has been horrible for years because of all the mirror sites that scrape content and add ads.