williamimoh 2 days ago

Looks like long context isn’t a problem anymore

tamarru 2 days ago

Neither are cost and latency, in the long term. LLMs will ultimately become more economically viable than they are now, broadening the scope of every existing LLM-driven application (particularly STS, conversational AI, and so on).