gojomo 4 days ago
LLM builders need the embedding function anyway, benefit from growing it, and pay for the training – and then other uses get that embedding "for free". So an old downward pressure on embedding sizes – the builders' internal training costs & resource limits – is now weaker. As long as LLMs keep seeing benefits from larger embeddings, larger ones will keep becoming more common & available. (Of course, via truncation & the like, no one is forced to use more dimensions than work for them – see the sketch below.)
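For concreteness, a minimal sketch of that truncation trick: keep the first k dimensions of a larger embedding and re-normalize, so cosine similarity still behaves. (This only works well when the model was trained to front-load information, e.g. Matryoshka-style training, as in OpenAI's text-embedding-3 models; the 3072/256 sizes below are just illustrative.)

    import numpy as np

    def truncate_embedding(vec: np.ndarray, dims: int) -> np.ndarray:
        """Keep the first `dims` dimensions and re-normalize to unit length."""
        truncated = vec[:dims]
        norm = np.linalg.norm(truncated)
        return truncated / norm if norm > 0 else truncated

    # Stand-in for a "large" 3072-dim embedding from some provider...
    full = np.random.default_rng(0).normal(size=3072)
    full /= np.linalg.norm(full)

    # ...used downstream at whatever size works for you.
    small = truncate_embedding(full, 256)
    print(small.shape)  # (256,)

So the larger default costs downstream users little: they can always slice off the prefix size they actually need.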