simonw a day ago

I outsourced the "skim most of it" bit to the model. I used an LLM to jump to the bits that mattered, then I confirmed those bits by reading them myself in the original document (and thinking about them).

LLMs are a tool.

troupo a day ago

Yup, the LLM took you to the recitals, not to the articles themselves. That's definitely better than invalid info, I'll grant you that.

simonw a day ago

I fed in the entire act, both the articles and the recitals. Gemini's full response drew on the articles as well; I just didn't quote that part directly in my blog post. Here's the full response: https://gist.github.com/simonw/f2e341a2e8ea9ca75c6426fa85bc2...

Relevant section:

> Article 53(2) provides an exception from the obligations for providers of general-purpose AI models regarding technical documentation (Art 53(1)(a)) and providing information to downstream providers (Art 53(1)(b)) if the models are released under a free and open-source license and their parameters (including weights), information on model architecture, and information on model usage are made publicly available. This exemption does not apply to general-purpose AI models with systemic risks.
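If you want to reproduce this kind of workflow, here's a minimal sketch using the google-generativeai Python package. The file name, model name, and prompt are illustrative assumptions rather than exactly what I ran; the point is simply that a long-context model can take the full text plus a question, and you then verify its answer (ideally with cited article numbers) against the original document.

```python
# Minimal sketch: ask a long-context Gemini model a question about the full act,
# then verify the cited articles yourself in the original text.
# Assumptions: the google-generativeai package, a local plain-text copy of the act,
# and the model name are all illustrative, not what was actually used.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # assumption: API key supplied directly

# Load the full text of the act (articles and recitals) from a local file.
with open("eu_ai_act_full_text.txt", "r", encoding="utf-8") as f:
    act_text = f.read()

model = genai.GenerativeModel("gemini-1.5-pro")  # assumption: any long-context model

# Ask for the relevant obligations, requesting article numbers so the answer
# can be checked against the original document afterwards.
response = model.generate_content(
    act_text
    + "\n\nWhat does this act say about open source model releases? "
      "Cite the specific article numbers so I can check them myself."
)
print(response.text)
```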