simonw | 7 days ago:
The README says:

> For developers using local LLMs, LangExtract offers built-in support for Ollama and can be extended to other third-party APIs by updating the inference endpoints.

If you look in the code, they currently have provider classes for just Gemini and Ollama: https://github.com/google/langextract/blob/main/langextract/...

If you want to do structured data extraction against a wider variety of models, I'm going to promote my LLM library and tool, which supports dozens of models for this via its plugins mechanism: https://llm.datasette.io/en/stable/schemas.html
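To illustrate, here's a minimal sketch of schema-based extraction using LLM's Python API, following the schemas docs linked above. The model ID and the Person schema are just example values, and it assumes you've configured credentials (or installed a plugin) for whichever model you pick:

    import json
    import llm
    from pydantic import BaseModel

    # Define the structure the model's output must conform to
    class Person(BaseModel):
        name: str
        role: str

    # "gpt-4o-mini" is only an example; any schema-capable model
    # available through a plugin (including local Ollama models)
    # should work here
    model = llm.get_model("gpt-4o-mini")
    response = model.prompt(
        "Extract the person mentioned: 'Ada Lovelace wrote the first program.'",
        schema=Person,
    )

    # The response text is a JSON string matching the schema
    print(json.loads(response.text()))

The same thing works from the CLI with the --schema option, so you can try it without writing any Python.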