rldjbpin 6 days ago

i recall them being one of the first to release a mixture-of-experts (MoE) model [1], which was quite novel at the time. since then, it has looked like a game of catch-up for them on mainstream utility. just a week ago, for example, they announced support for custom MCP connectors in their chat offering [2].

more competition is always nice, but i wonder what these two companies, separated by several steps in the supply chain, can really achieve together.

[1] https://mistral.ai/news/mixtral-of-experts

[2] https://mistral.ai/news/le-chat-mcp-connectors-memories