lrei 5 days ago
Warning: this is AI-generated, probably by a low-end model, as some of the content is outright nonsense, e.g.: """ concept of MoE is quite prevalent (refer Outrageously Large Neural Networks: the Sparsely-Gated Mixture-of-Experts Layer), with Langchain’s high-level implementation of an LLMRouterChain, and notable low-level integrated examples """
kafkaesque 5 days ago | parent
Is it possible to label/tag these submissions as containing AI-generated content? I think the HN community would appreciate that.
fragmede 5 days ago | parent
The paper itself is fairly popular, with several thousand citations.

Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer
Noam Shazeer, Azalia Mirhoseini, Krzysztof Maziarz, Andy Davis, Quoc Le, Geoffrey Hinton, Jeff Dean