From fe444ce8f676139b32711df24cb22e57401b3b77 Mon Sep 17 00:00:00 2001
From: Samuel Path
Date: Sun, 26 May 2024 15:08:06 +0200
Subject: [PATCH] typo

Missing letter
---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index e6c81db..1898e68 100644
--- a/README.md
+++ b/README.md
@@ -448,7 +448,7 @@ Without these designs, modern image generation models like [Stable Diffusion](ht
 The [Word2Vec](/03-sequence-modeling/04-word2vec/03-word2vec.ipynb) model popularized the concept of text embeddings that preserve semantic and syntactic meaning by forcing models to create vector representations for concepts with interesting properties.
 
-A commonly used example of the power of such embeddings is that the following equation holds true in the embedding space: Emedding("King") - Embedding("Man") + Embedding("Woman") = Embedding("Queen").
+A commonly used example of the power of such embeddings is that the following equation holds true in the embedding space: Embedding("King") - Embedding("Man") + Embedding("Woman") = Embedding("Queen").
 
 Embeddings show us how the relationships between concepts can be represented in a highly condensed format.
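
For anyone who wants to reproduce the analogy mentioned in the patched README line, here is a minimal sketch of the vector arithmetic using pre-trained GloVe vectors loaded through gensim; the model name "glove-wiki-gigaword-50" and the use of gensim are assumptions for illustration, not part of the patch or the original repository.

```python
# Minimal sketch of the King - Man + Woman ~= Queen analogy.
# Assumes gensim is installed and that the pre-trained "glove-wiki-gigaword-50"
# vectors can be fetched via gensim's downloader (an assumption, not from the patch).
import gensim.downloader as api

# Load 50-dimensional GloVe word vectors (downloads on first use).
vectors = api.load("glove-wiki-gigaword-50")

# Vector arithmetic in the embedding space:
# Embedding("king") - Embedding("man") + Embedding("woman") ~= Embedding("queen")
result = vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=1)
print(result)  # typically returns "queen" as the nearest neighbour
```

In practice the equality holds only approximately: the combined vector is not exactly the "queen" vector, but "queen" is usually its nearest neighbour by cosine similarity, which is what `most_similar` reports.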