machinecontrol 7 hours ago
The trend is obviously towards larger and larger context windows. We went from 200K to 1M tokens being standard just this year. This might be a complete non-issue in 6 months.
hrmtst93837 5 hours ago
Those bigger windows come with lovely surcharges on compute, latency, and prompt complexity, so "just wait for more tokens" is a nice fantasy that melts the moment someone has to pay the bill. If your use case is tiny or your budget is infinite, fine, but for everyone else the "make the window bigger" crowd sounds like they're budgeting by credit card. Quality still falls off near the edge.
amzil 6 hours ago
Context windows getting bigger doesn't make the economics go away. Tokens still cost money. 50K tokens of schemas costs the same in dollars at a 1M context as at a 200K context; you just have more room left over. The pattern with every resource expansion is the same: usage scales to fill it. Bigger windows mean more integrations connected, not leaner ones. Progressive disclosure is cheaper at any window size.
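The arithmetic here is worth spelling out: input pricing is per token, so window size never appears in the cost formula. A minimal sketch, where the $/token price and the 5K-token "only what's needed" figure are made-up placeholders, not real vendor numbers:

```python
# Back-of-envelope prompt cost: price is charged per token, so the size of
# the context window never enters the formula.
PRICE_PER_TOKEN = 3.00 / 1_000_000  # hypothetical $3 per 1M input tokens


def prompt_cost(tokens: int) -> float:
    """Dollar cost of sending `tokens` input tokens, regardless of window."""
    return tokens * PRICE_PER_TOKEN


# 50K tokens of schemas cost the same whether the window is 200K or 1M.
full_dump = prompt_cost(50_000)

# Progressive disclosure: send only the schemas this request actually
# needs -- an assumed 5K tokens for illustration.
progressive = prompt_cost(5_000)

print(f"full dump:   ${full_dump:.4f}")    # $0.1500 per request
print(f"progressive: ${progressive:.4f}")  # $0.0150 per request
```

At any window size the ratio between the two approaches is the same; a bigger window only changes how much unused headroom is left, not the bill.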