Havoc 2 days ago

Interesting to see a pivot away from MoE by both IBM and Mistral while the larger classes of SOTA models all seem to be sticking with it.

Quick vibe check of it - 8B @ Q6 - seems promising. Bit of a clinical tone, but I can see that being useful for data processing and similar. Sometimes you don't really want an LLM that spams you with emojis...

embedding-shape 2 days ago | parent | next [-]

Makes sense: dense for small models, dense or MoE for larger ones. That ends up fitting various hardware setups pretty neatly; there's no need for MoE at smaller scale, and dense is too heavy at large scale.

npodbielski 2 days ago | parent | prev [-]

I never want an LLM to spam me with emojis. What is the use case for that? I find it highly annoying.

2ndorderthought 2 days ago | parent | next [-]

Shh, people are paying for each token. Don't get them asking too many questions.

Havoc 2 days ago | parent | prev [-]

I think it can be a plus in moderation, e.g. in openclaw it can add some character.

But yeah, I dislike that style where every heading and bullet point gets an emoji.