Word Embeddings - A simple introduction to word2vec


Jan 13 2021 · 4 mins

Hey guys, welcome to another episode on word embeddings! In this episode we talk about another popular word embedding technique known as word2vec. We use word2vec to capture contextual meaning in our vector representations. I've found this useful reading for word2vec; do read it for an in-depth explanation.
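To give a feel for how word2vec picks up context, here is a minimal sketch (in plain Python, with a toy sentence and window size chosen purely for illustration) of the skip-gram pair generation step, where each word is paired with the words that appear around it:

```python
# Minimal sketch of skip-gram pair generation, the data-preparation step of
# word2vec: each center word is paired with every word inside a small
# context window around it. These (center, context) pairs are what the
# model trains on to learn contextual vectors.
# (Toy sentence and window size are illustrative assumptions.)

def skipgram_pairs(tokens, window=2):
    """Yield (center, context) pairs for a tokenized sentence."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # skip pairing a word with itself
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the quick brown fox jumps".split()
pairs = skipgram_pairs(sentence, window=1)
# With window=1, "quick" is paired with its immediate neighbors
# "the" and "brown" -- words sharing neighbors end up with similar vectors.
```

A neural network is then trained to predict the context word from the center word (or vice versa, in the CBOW variant), and the learned weights become the word vectors.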

p.s. Sorry for always posting episodes after a significant delay. I'm learning various things myself, I have different blogs to handle, and multiple projects in progress, so my daily schedule is pretty packed. I hope you all get some value from my podcasts and that they help you build an intuitive understanding of various topics.

See you in the next podcast episode!
