| ▲ | WillAdams 10 days ago |
| How does one configure an LLM interface using this to process multiple files with a single prompt? |
|
| ▲ | jumploops 10 days ago | parent | next [-] |
| Do you mean you want to process multiple files with a single LLM call or process multiple files using the same prompt across multiple LLM calls? (I would recommend the latter) |
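The recommended approach (the same prompt, but one LLM call per file) can be sketched as below. The prompt text, model name, and endpoint are placeholders, assuming a local server that speaks the OpenAI-compatible chat-completions schema (jan.ai exposes one; check your server's docs for the actual port and model id):

```python
from pathlib import Path

# Placeholder prompt -- substitute the single-file prompt that already works.
PROMPT = "Extract the one specific piece of information described below."

def build_request(text: str, model: str = "local-model") -> dict:
    """Build one chat-completion payload (OpenAI-compatible schema)."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": PROMPT},
            {"role": "user", "content": text},
        ],
    }

def batch_payloads(folder: str, pattern: str = "*.txt") -> list[dict]:
    """One payload per file: same prompt, but a separate LLM call for each."""
    return [build_request(p.read_text()) for p in sorted(Path(folder).glob(pattern))]

# To actually run the batch, POST each payload to your local server's
# chat-completions endpoint (e.g. http://localhost:PORT/v1/chat/completions)
# and collect the responses one file at a time.
```

Keeping one file per call keeps each request inside the model's context window and means one bad file fails alone instead of poisoning a combined prompt.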
| ▲ | WillAdams 10 days ago | parent [-] |
| Multiple files with a single LLM call. I have a prompt that works for a single file in Copilot, but it's slower than the manual workflow: opening the file to find the one specific piece of information, re-saving it, running a .bat file to rename it with more of that information, and then filling in the last two bits when entering things.
|
|
| ▲ | 3abiton 10 days ago | parent | prev [-] |
| It depends: what is your setup? You can always find more support on r/Localllama
| ▲ | WillAdams 9 days ago | parent [-] |
| Using Copilot, and currently running jan.ai --- /r/Localllama seems to tend towards the typical Reddit cesspool. Let me rephrase: what locally-hosted LLM would be suited to batch-processing image files?
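For locally-hosted batch processing of image files, one common setup is a local vision model served by Ollama, whose `/api/generate` endpoint accepts base64-encoded images. The sketch below only builds the request payloads; the prompt text and model name ("llava") are example assumptions, and any locally pulled vision model should work:

```python
import base64
from pathlib import Path

# Placeholder prompt -- substitute the question to ask about each image.
PROMPT = "State the single most important piece of information in this image."

def image_payload(path: str, model: str = "llava") -> dict:
    """Build one request in the Ollama /api/generate schema, which takes
    base64-encoded image data in an 'images' list."""
    data = base64.b64encode(Path(path).read_bytes()).decode("ascii")
    return {"model": model, "prompt": PROMPT, "images": [data], "stream": False}

def batch_image_payloads(folder: str, pattern: str = "*.png") -> list[dict]:
    """One payload per image file, same prompt for each."""
    return [image_payload(str(p)) for p in sorted(Path(folder).glob(pattern))]

# POST each payload to http://localhost:11434/api/generate (Ollama's default
# port) and read the "response" field of each reply.
```

The same loop works against any local runner that accepts base64 images; only the endpoint and payload field names change.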
|