
Wednesday Nov 22, 2023
Episode 11.10: Some remarks about the hardware resources needed to process Large Language Models.
Large Language Models are pretty big. The smallest usable ones have around 7 billion parameters; ChatGPT in its first (late 2022) version was based on GPT-3.5, a descendant of GPT-3, which had around 175 billion parameters; many later models are much larger. That limits what we can run on a local machine, but there are some workarounds.
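To see why parameter counts limit local use, a quick back-of-envelope sketch helps: the weights alone take (parameters × bits per parameter) of memory, so lowering the precision (quantization, one of the common workarounds) shrinks the footprint proportionally. The function below is a rough illustrative estimate, not an exact measurement; real memory use also includes activations and the KV cache.

```python
def model_memory_gb(num_params: float, bits_per_param: int) -> float:
    """Rough estimate of weight memory: parameters x bits, converted to GB."""
    return num_params * bits_per_param / 8 / 1e9

# A 7-billion-parameter model at common precisions:
for bits in (32, 16, 8, 4):
    print(f"{bits:>2}-bit: {model_memory_gb(7e9, bits):.1f} GB")
```

At 16-bit precision a 7B model needs about 14 GB just for weights, beyond most consumer GPUs, while 4-bit quantization brings it to roughly 3.5 GB, which fits comfortably on a laptop.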