Episodes
Friday Nov 24, 2023
Wednesday Nov 22, 2023
LoRA stands for “Low-Rank Adaptation” (of large language models); it is a technique for making low-cost tweaks to a trained model by adding small, low-rank adjustments to its weights while the original weights stay frozen.
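The idea can be sketched in a few lines of numpy. This is a minimal illustration, not the episode's own code: the shapes, rank, and `alpha` scaling are assumptions chosen for clarity, following the usual LoRA formulation where only the two small matrices are trained.

```python
import numpy as np

# Hypothetical shapes for illustration: one frozen weight matrix of the model.
d, k, r = 64, 64, 4          # r is the low rank, r << d and r << k

rng = np.random.default_rng(0)
W = rng.standard_normal((d, k))   # frozen pretrained weight (not trained)

# LoRA adds a low-rank update: W + (B @ A) * (alpha / r).
# Only A and B are trained; B starts at zero so the initial update is zero.
A = rng.standard_normal((r, k)) * 0.01
B = np.zeros((d, r))
alpha = 8

def lora_forward(x):
    # Original path plus the low-rank correction.
    return x @ W.T + (x @ A.T @ B.T) * (alpha / r)

x = rng.standard_normal((1, k))
y = lora_forward(x)

# Trainable parameters: r*(d+k) instead of d*k for a full fine-tune.
print(f"full: {d*k} params, LoRA: {r*(d+k)} params")
```

With `B` initialised to zero the adapted model starts out exactly equal to the frozen one, which is why this counts as a low-cost tweak: here only 512 parameters are trained instead of 4096.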
Wednesday Nov 22, 2023
Large Language Models are pretty big. The smallest usable ones have around 7 billion parameters; GPT-3, on which the first (late 2022) version of ChatGPT was built, had around 175 billion; many later models are much larger. That limits what we can run on a local machine, but there are some workarounds.
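A back-of-envelope calculation shows why size matters and why quantisation is the usual workaround. This is a rough sketch (weights only, ignoring activations and overhead); the precisions shown are common conventions, not figures from the episode.

```python
# Approximate memory needed just to hold the model weights,
# at common numeric precisions (bytes per parameter).
def weight_memory_gb(n_params, bytes_per_param):
    return n_params * bytes_per_param / 1e9

for name, n in [("7B", 7e9), ("175B", 175e9)]:
    for precision, nbytes in [("fp32", 4), ("fp16", 2), ("int4", 0.5)]:
        print(f"{name} @ {precision}: ~{weight_memory_gb(n, nbytes):.1f} GB")
```

A 7B model at 4-bit precision needs only about 3.5 GB for its weights, which is why quantised models of that size fit on an ordinary laptop while a 175B model does not.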
Wednesday Nov 22, 2023
Monday Nov 20, 2023
How we - and so AIs - decide what comes next; why it isn’t inevitable, necessary or inescapable, but chosen. And what is chosen can be changed.
Monday Nov 20, 2023
A digression into general human matters as they affect the deployment of teams in complex projects.
Monday Nov 20, 2023
Monday Nov 13, 2023
What comes next after a prompt, and what makes one completion more appropriate than another?
Tuesday Nov 07, 2023
Teachers at all levels face a particular conundrum: how to prepare students well enough for whatever challenges they may face, including examinations, without leading them to think that what they have been taught is everything they could possibly need to know. The example in the episode is memorising the solutions to every crossword puzzle that has ever been set without learning how to do crossword puzzles: that will not enable you to solve the next puzzle you are presented with. The same is true of preparing for exams. There is merit in doing some past papers, so that you understand what is expected of you and how the paper is structured, but doing every past paper, as though that could substitute for a real understanding of the material, is not going to help you. That failure is overfitting. The opposite failure, not knowing enough and having insufficient skills, is called underfitting: under-educating, under-learning, not having reached the required standard.
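The crossword analogy maps directly onto a standard machine-learning demonstration. The following is an assumed toy example, not from the episode: we fit noisy samples of a quadratic with polynomials of different degrees. Too low a degree underfits (poor on both seen and unseen points); too high a degree memorises the training points, like memorising past papers, and does worse on new ones.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_fn(x):
    # The underlying pattern the student/model should actually learn.
    return 0.5 * x**2 - x + 2

x_train = np.linspace(-3, 3, 10)
y_train = true_fn(x_train) + rng.normal(0, 0.5, x_train.size)  # noisy "past papers"
x_test = np.linspace(-3, 3, 100)
y_test = true_fn(x_test)                                       # the "next exam"

errors = {}
for degree in (1, 2, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    errors[degree] = (train_err, test_err)
    print(f"degree {degree}: train MSE {train_err:.3f}, test MSE {test_err:.3f}")
```

Degree 1 underfits: it does badly everywhere. Degree 9 has ten coefficients for ten training points, so it reproduces them almost exactly, the equivalent of memorising every past paper, yet generalises worse to unseen points than the honest degree-2 fit.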
