WhitneyLand 5 days ago

Model context limits are not “artificial” as claimed.

The largest context window a model can offer at a given quality level depends on the context length the model was pretrained with, as well as any fine-tuning techniques applied afterward.

It’s not simply a matter of considering increased costs.

Der_Einzige 5 days ago

Context extension methods exist and work. Please educate yourself about these rather than confidently saying wrong things.

WhitneyLand 3 days ago

Not sure what you’re disagreeing with? Context window size limits are not artificial. It takes real time/money/resources to increase them.

There are a few ways to approach the problem, and I already mentioned two: pre-training on longer context lengths, and fine-tuning techniques like LongRoPE (see the sketch below).
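
To make the fine-tuning angle concrete, here's a minimal numpy sketch of position interpolation, the core idea behind methods in the LongRoPE family. This is not LongRoPE itself, and the dimensions and lengths are illustrative assumptions: the point is that positions get rescaled so the rotary-embedding angles at the extended length stay inside the range the model saw in pretraining, which is why a comparatively short fine-tune can suffice.

    import numpy as np

    def rope_angles(positions, dim, base=10000.0):
        # Standard RoPE: one rotation frequency per pair of channels.
        inv_freq = 1.0 / (base ** (np.arange(0, dim, 2) / dim))
        return np.outer(positions, inv_freq)  # shape (len(positions), dim // 2)

    trained_len, target_len, dim = 4096, 32768, 128  # illustrative sizes
    positions = np.arange(target_len)

    # Naive extrapolation: angles for positions the model never trained on.
    naive = rope_angles(positions, dim)

    # Position interpolation: squeeze positions back into the trained range,
    # then fine-tune briefly so the model adapts to the denser spacing.
    interpolated = rope_angles(positions * (trained_len / target_len), dim)

    print(naive.max(), interpolated.max())  # ~32767 vs ~4096

The fine-tune isn't free, which is the point: extending the window costs real training compute, it just costs far less than pretraining at the longer length from scratch.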

Inference-time context extension tricks I didn't mention because the papers I've seen suggest there are often problems with quality or unfavorable tradeoffs (one such trick is sketched below).
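
For what it's worth, one training-free trick in this family is "NTK-aware" base scaling. This is a hedged sketch with illustrative numbers, not any particular library's implementation: instead of rescaling positions, the RoPE base is enlarged so high-frequency channels are barely perturbed while low-frequency ones stretch to cover the longer window.

    import numpy as np

    dim, trained_len, target_len = 128, 4096, 32768  # illustrative sizes
    scale = target_len / trained_len

    # Enlarge the RoPE base by scale^(dim / (dim - 2)), the commonly cited
    # "NTK-aware" factor; no fine-tuning happens, only the angle schedule.
    base = 10000.0 * scale ** (dim / (dim - 2))
    inv_freq = 1.0 / (base ** (np.arange(0, dim, 2) / dim))
    angles = np.outer(np.arange(target_len), inv_freq)
    # Quality still tends to degrade toward the far end of the extended
    # window, consistent with the mixed results mentioned above.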

There’s no magic way around these limits, it’s a real engineering problem.