potato-peeler 5 days ago |
> then you can do some processing or just hand over all the chunks as context saying "here are some documents use them to answer this question" + your query to the llm

This part is what I want to understand. How does the llm "frame" an answer?
fareesh 15 hours ago | parent |
I guess you could just try an equivalent in ChatGPT or Gemini or something. Paste 5 text files one after the other in some structured schema that includes metadata, then ask a question. You can steer it with additional instructions, e.g. "mention the filename you got the answer from", etc. Something like the sketch below.
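
There's no magic in the "framing": it's just string concatenation before the model call. A minimal sketch, assuming the OpenAI Python client; the filenames, model name, and question are made-up placeholders:

    import pathlib
    from openai import OpenAI  # assumes the official OpenAI Python client is installed

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Hypothetical local files standing in for the retrieved chunks/documents.
    doc_paths = ["notes1.txt", "notes2.txt", "notes3.txt"]

    # Wrap each document in a simple schema that carries metadata (here, the filename).
    context_blocks = []
    for path in doc_paths:
        text = pathlib.Path(path).read_text()
        context_blocks.append(f'<document filename="{path}">\n{text}\n</document>')

    question = "Which document mentions the Q3 budget, and what does it say?"

    # "Here are some documents, use them to answer this question" + the query.
    prompt = (
        "Here are some documents. Use only them to answer the question, "
        "and mention the filename you drew the answer from.\n\n"
        + "\n\n".join(context_blocks)
        + f"\n\nQuestion: {question}"
    )

    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any chat model works
        messages=[{"role": "user", "content": prompt}],
    )
    print(resp.choices[0].message.content)

The model just sees one long prompt with the documents inline and answers from that text; the "steering" instructions at the top are what shape how it frames the answer.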