12 Dec 2023, 20:34
Everyone is talking about Mixtral 8x7B, the new model from @MistralAI that uses Mixture of Experts (MoE). It is said to beat GPT-3.5.
Want to try it? Head to the @Libertai_DAI chat and select Mixtral in the top-right model selector before starting a conversation.
You're welcome ;)
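For anyone curious what "Mixture of Experts" means in practice, here is a minimal sketch of a sparse MoE feed-forward layer: a small router picks the top-k experts per token and mixes their outputs. The top-2-of-8 configuration matches Mixtral's published design; the dimensions and expert structure below are illustrative assumptions, not Mixtral's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Sketch of a sparse Mixture-of-Experts layer (dims are illustrative)."""
    def __init__(self, dim=512, hidden=2048, num_experts=8, top_k=2):
        super().__init__()
        self.router = nn.Linear(dim, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(), nn.Linear(hidden, dim))
            for _ in range(num_experts)
        )
        self.top_k = top_k

    def forward(self, x):  # x: (tokens, dim)
        logits = self.router(x)                         # (tokens, num_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # top-k experts per token
        weights = F.softmax(weights, dim=-1)            # normalize gate weights
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e                # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

tokens = torch.randn(4, 512)
print(MoELayer()(tokens).shape)  # torch.Size([4, 512])
```

Only the selected experts run per token, which is why an MoE model can have many more parameters than it spends compute on at inference time.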
12 Dec 2023, 20:36
Nearly forgot the direct URL: https://t.co/33gD1NaigE
Oh, and it's running on a decentralized cloud, @aleph_im, so there's no big company there to snoop on your data.