
Wednesday Nov 22, 2023
Episode 11.11: Low-cost ways to adapt and specialise trained models. LoRA - Low Rank Adaptation.
LoRA stands for “Low-Rank Adaptation of Large Language Models”; it is a technique for cheaply specialising a trained model by freezing the original weights and learning small, low-rank update matrices that are added to them.
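The idea can be sketched in a few lines. This is an illustrative toy, not any real library's API: a frozen weight matrix W is adapted as W + (alpha/r) · B·A, where A and B are small trainable matrices of rank r, and the layer sizes, rank, and alpha below are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, r = 512, 512, 8    # hypothetical layer shape and adapter rank
alpha = 16                      # hypothetical LoRA scaling hyperparameter

W = rng.standard_normal((d_out, d_in))    # frozen pretrained weights
A = rng.standard_normal((r, d_in)) * 0.01 # small trainable matrix
B = np.zeros((d_out, r))                  # starts at zero, so the adapter
                                          # initially changes nothing

def adapted_forward(x):
    """Forward pass with the low-rank update applied on the fly."""
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
# With B = 0, the adapted layer matches the original exactly.
assert np.allclose(adapted_forward(x), W @ x)

# Only A and B are trained, which is far cheaper than full fine-tuning:
full_params = W.size            # 512 * 512 = 262,144
lora_params = A.size + B.size   # 8 * 512 + 512 * 8 = 8,192
```

In practice only A and B receive gradients, so the adapter can be stored and swapped per task while the base model stays untouched.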