Wednesday May 03, 2023

Episode 8.92: Could embeddings ever replace neural nets and training?

We consider the possibility that, as embedding vectors get longer and longer - we have already reached 1536 floating-point numbers, and there are reports that parts of GPT-4 use over 12,000 - embeddings might come to encapsulate so much of the semantics of whatever they embed that each one becomes effectively unique, and as such capable of replacing the weights and even the layers of their neural networks. Could the computation be done between the vectors themselves? It's an interesting idea.
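To make the speculation concrete, here is a minimal Python sketch of what "embeddings instead of training" could look like: classifying text by nearest neighbour in embedding space, with no learned classifier in the loop. The `embed` function below is a hypothetical stand-in that derives a deterministic pseudo-random vector from the text so the snippet runs on its own; a real embedding model such as text-embedding-ada-002 (which returns 1536-dimensional vectors) would have to be swapped in for the geometry to carry any actual semantics.

```python
import zlib
import numpy as np

def embed(text: str, dim: int = 1536) -> np.ndarray:
    """Hypothetical stand-in for a real embedding model. It produces a
    deterministic pseudo-random vector from the text so the sketch is
    self-contained; its geometry is arbitrary, not semantic."""
    rng = np.random.default_rng(zlib.crc32(text.encode()))
    return rng.standard_normal(dim)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: the standard comparison between embeddings."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def classify(query: str, examples: dict[str, str]) -> str:
    """Label a query with the label of its nearest labelled example in
    embedding space - pure vector comparison, no training step."""
    q = embed(query)
    nearest = max(examples, key=lambda text: cosine(q, embed(text)))
    return examples[nearest]

# Example usage (labels are meaningful only with a real embedding model):
labelled = {
    "the market rallied today": "finance",
    "the striker scored twice": "sport",
}
print(classify("shares fell sharply", labelled))
```

The point of the sketch is the shape of the pipeline: everything task-specific lives in the stored vectors and the distance function, which is exactly the trade the episode asks about.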
