Wednesday Nov 22, 2023
Episode 11.11: Low-cost ways to adapt and specialise trained models. LoRA - Low Rank Adaptation.
LoRA stands for "Low-Rank Adaptation of Large Language Models"; it is a technique for cheaply specialising a trained model by freezing its original weights and learning small low-rank update matrices that are added on top of them.
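The idea described above can be illustrated with a minimal sketch (not from the episode; layer sizes, rank, and the `alpha` scaling value are hypothetical). A frozen weight matrix `W` is augmented with two small trainable matrices `A` and `B` whose product forms a low-rank update, so far fewer parameters need training than full fine-tuning:

```python
import numpy as np

# Minimal LoRA sketch: keep the pretrained weight matrix W frozen
# and train two small matrices A (r x d_in) and B (d_out x r),
# with rank r much smaller than d_in and d_out.
# Adapted forward pass: y = W x + (alpha / r) * B (A x)

rng = np.random.default_rng(0)

d_in, d_out, r = 64, 64, 4   # hypothetical layer sizes and rank
alpha = 8                    # hypothetical LoRA scaling factor

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weights
A = rng.standard_normal((r, d_in)) * 0.01   # trainable, small random init
B = np.zeros((d_out, r))                    # trainable, zero init: no change at start

def lora_forward(x):
    """Frozen path plus the scaled low-rank update."""
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)

# Because B starts at zero, the adapted model initially matches the frozen one.
assert np.allclose(lora_forward(x), W @ x)

# Trainable parameters: r*(d_in + d_out) for LoRA vs d_in*d_out for full fine-tuning.
lora_params = r * (d_in + d_out)
full_params = d_in * d_out
print(lora_params, full_params)
```

With these (illustrative) sizes the low-rank path trains 512 parameters instead of the 4096 in the full weight matrix, which is what makes the adaptation low-cost.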