bob1029 4 days ago
This is also why SQL is cursed for LLMs. For queries that are actually valuable to the business, we tend to have more constraints than these models can tolerate. By the time you get done explaining the meaning of your schema, you might have run out of context. Not that it would matter either way: I've never seen the attention mechanism lock onto more than ~10 hard constraints at a time.
sjfjfjsjsj 3 days ago
Maybe you need to approach SQL the way code generation must be approached: don't develop the whole statement or script at once; instead, put together a plan and execute it step by step. Not everything must be done by the LLM itself; you could use one or more tools to generate parts of the query. You might be interested in this: https://www.pedronasc.com/articles/lessons-building-ai-data-...
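Something like this, roughly (a sketch in Python, not from the linked article; ask_llm is a stand-in for whatever model call you use, and the schema and table names are invented for illustration):

    from typing import Callable

    # Invented toy schema for the example.
    SCHEMA = {
        "orders": ["id", "customer_id", "total", "created_at"],
        "customers": ["id", "name", "region"],
    }

    def build_query(question: str, ask_llm: Callable[[str], str]) -> str:
        # Step 1: pick only the relevant tables, so later prompts stay small.
        raw = ask_llm(
            f"Question: {question}\nTables: {list(SCHEMA)}\n"
            "Reply with a comma-separated list of relevant tables only."
        )
        tables = [t.strip() for t in raw.split(",") if t.strip() in SCHEMA]

        # Step 2: with the narrowed schema, draft only the SELECT/JOIN skeleton.
        narrowed = {t: SCHEMA[t] for t in tables}
        skeleton = ask_llm(
            f"Question: {question}\nSchema: {narrowed}\n"
            "Write only the SELECT and JOIN clauses, no WHERE clause yet."
        )

        # Step 3: add filters as a separate, smaller decision.
        filters = ask_llm(
            f"Question: {question}\nSchema: {narrowed}\n"
            f"Query so far: {skeleton}\n"
            "Write only the WHERE clause, or an empty string if none is needed."
        )

        return f"{skeleton}\n{filters}".strip()

Each step only sees what it needs, so you never have to dump the entire schema into a single prompt, and any individual step can be validated, replaced with a deterministic tool, or hand-written.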