ramkumarkb 7 days ago

Does this work with other open-source LLMs like Qwen3, or with other OpenAI-compatible LLM APIs?

simonw 7 days ago | parent

The README says:

> For developers using local LLMs, LangExtract offers built-in support for Ollama and can be extended to other third-party APIs by updating the inference endpoints.

If you look in the code, they currently have classes for Gemini and Ollama: https://github.com/google/langextract/blob/main/langextract/...
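
For what it's worth, here is a minimal sketch of pointing LangExtract at a local Ollama model, based on the README's quick-start pattern. The parameter names (model_url, fenced_output, use_schema_constraints) and the qwen3:8b model id are assumptions that may differ by version:

    import langextract as lx

    # LangExtract defines the output structure via a prompt plus
    # few-shot examples rather than a separate schema object.
    prompt = "Extract medication names and their dosages from the text."
    examples = [
        lx.data.ExampleData(
            text="Take ibuprofen 200mg twice daily.",
            extractions=[
                lx.data.Extraction(
                    extraction_class="medication",
                    extraction_text="ibuprofen",
                    attributes={"dosage": "200mg"},
                ),
            ],
        ),
    ]

    # Point model_id/model_url at a local Ollama server instead of Gemini.
    result = lx.extract(
        text_or_documents="Patient was given aspirin 81mg once a day.",
        prompt_description=prompt,
        examples=examples,
        model_id="qwen3:8b",                 # any model pulled into Ollama
        model_url="http://localhost:11434",  # default Ollama endpoint
        fenced_output=True,
        use_schema_constraints=False,
    )

    for e in result.extractions:
        print(e.extraction_class, e.extraction_text, e.attributes)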

If you want to do structured data extraction against a wider variety of models, I'm going to promote my LLM library and CLI tool, which supports dozens of models for this via the plugins mechanism: https://llm.datasette.io/en/stable/schemas.html
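
For example, with the llm-ollama plugin installed you can run a schema-constrained extraction against a local Qwen3 model. This is a sketch using the concise schema syntax from the docs linked above; the exact Ollama model name is an assumption:

    llm install llm-ollama

    llm -m qwen3:8b \
      --schema 'name, dosage, frequency' \
      'Extract the medication from: Take ibuprofen 200mg twice daily'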