Sunday Dec 10, 2023
Episode 11.25: MoE (Mixture of Experts) approaches to AI are already appearing. What they involve.
The AI wars are heating up, with much current disparagement of OpenAI, competition from Google DeepMind’s Gemini, and now a new version of the Mistral model based on MoE technology, called “Mixtral” because it mixes experts. The pace of development is extraordinary, and that is before the AIs themselves get in on the act.
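The core idea behind an MoE model like Mixtral is that a small "router" sends each token to only a few of many expert sub-networks, so most parameters sit idle on any given token. A minimal sketch of top-2 routing is below; the sizes, names, and single-token setup are illustrative assumptions, not Mistral's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, n_experts, top_k = 8, 4, 2  # toy sizes for illustration

# Each "expert" is reduced here to a single weight matrix.
experts = [rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(n_experts)]
router_w = rng.standard_normal((d_model, n_experts)) * 0.1

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_layer(x):
    """Route token vector x to its top-k experts and mix their outputs."""
    logits = x @ router_w                 # one routing score per expert
    top = np.argsort(logits)[-top_k:]     # indices of the k highest-scoring experts
    weights = softmax(logits[top])        # normalize scores over the chosen experts
    # Weighted sum of the chosen experts' outputs; the other experts do no work.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(d_model)
out = moe_layer(token)
print(out.shape)  # (8,)
```

Only `top_k` of the `n_experts` matrices are multiplied per token, which is how MoE models keep inference cost well below what their total parameter count suggests.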