Vector Embedding
What is Vector Embedding?
Vector embeddings are numerical representations of words or objects in a continuous vector space.
[Illustration: everyday phrases such as "super fun", "I'm happy", "angry", "love it", "enjoy", "can't tolerate", and objects such as Messenger, Networking, and Instagram, each placed in the embedding space.]
Embeddings capture semantic meaning, allowing machines to understand and process language more effectively.
For example, the word "Honey" may have a vector representation such as [1.5, -0.4, 7.2, 19.6, 20.2].
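As a rough sketch of what this looks like in code, the snippet below uses the sentence-transformers library with the all-MiniLM-L6-v2 model purely as an illustrative choice (any embedding model would do) to map a word to a vector of floats:

```python
# Minimal sketch: turn a word into an embedding vector.
# The library and model name here are illustrative assumptions,
# not the only way to produce embeddings.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")
vector = model.encode("Honey")   # a NumPy array of floats
print(vector.shape)              # e.g. (384,) for this model
print(vector[:5])                # first few coordinates of the embedding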
Example: "Happy" and "Enjoyed" are similar in meaning, so their vectors are close to each other, whereas a word with the opposite meaning, such as "Angry", lies far away in the space.
Distance (d) between example vectors:
Happy:   [1.1, 2.3, 3.4, 0.1]
Angry:   [7.7, 8.8, 3.4, 9.9]
Enjoyed: [1.1, 2.3, 3.4, 0.2]
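One way to make "distance" concrete is to compute it directly. The short sketch below uses plain Python and the Euclidean distance over the example vectors above (other metrics, such as cosine similarity, are also common):

```python
import math

# Embedding vectors taken from the example above.
happy   = [1.1, 2.3, 3.4, 0.1]
angry   = [7.7, 8.8, 3.4, 9.9]
enjoyed = [1.1, 2.3, 3.4, 0.2]

def distance(a, b):
    """Euclidean distance between two vectors of equal length."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

print(distance(happy, enjoyed))  # ~0.1  -> similar meaning, close together
print(distance(happy, angry))    # ~13.5 -> different meaning, far apart
```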
What are its applications and why is it important?
Vector embeddings are a fundamental building block of RAG (Retrieval-Augmented Generation), which is the backbone of many AI applications.
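To show where embeddings fit into RAG, the sketch below outlines only the retrieval step, assuming a hypothetical embed() function (for instance, the model.encode call from the earlier snippet). The generation step would then pass the retrieved text to a language model as context:

```python
# Minimal sketch of the retrieval step in RAG, assuming an embed()
# function that returns a vector for any text (an assumption for
# illustration; any embedding model could provide it).
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

documents = [
    "Honey is produced by bees.",
    "Instagram is a social networking app.",
    "Embeddings map text to vectors.",
]

def retrieve(query, docs, embed):
    """Return the document whose embedding is closest to the query's."""
    query_vec = embed(query)
    scores = [cosine_similarity(query_vec, embed(d)) for d in docs]
    return docs[int(np.argmax(scores))]

# The retrieved document is then handed to a language model as context,
# which is what "retrieval-augmented generation" refers to.
```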
Vector embeddings are used in various fields, such as semantic search, recommendation systems, and natural language processing.
@Tanusri