Word Embeddings - A simple introduction to word2vec
JAN 13, 2021 · 4 MIN
Description
<p>Hey guys, welcome to another episode on <strong>word embeddings</strong>! In this episode we talk about another popular word embedding technique known as <strong>word2vec</strong>. We use word2vec to<em><strong> capture the contextual meaning</strong></em> of words in our vector representations. I've found <a href="https://wiki.pathmind.com/word2vec">this</a> a useful read on word2vec. Do read it for an in-depth explanation.</p>
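<p>For listeners who like to see the idea in code: below is a minimal skip-gram sketch of word2vec in plain numpy (the toy corpus, vector size, and training settings are my own illustrative choices, not from the episode). Each center word is trained to predict its neighbors, so words that appear in similar contexts end up with similar vectors.</p>

```python
# Minimal skip-gram word2vec sketch (illustrative toy example, not a production library).
import numpy as np

corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

dim, window, lr, epochs = 8, 2, 0.05, 200          # hypothetical hyperparameters
rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(len(vocab), dim))   # center-word vectors (the embeddings)
W_out = rng.normal(scale=0.1, size=(len(vocab), dim))  # context-word vectors

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

for _ in range(epochs):
    for pos, word in enumerate(corpus):
        c = idx[word]
        for off in range(-window, window + 1):
            if off == 0 or not (0 <= pos + off < len(corpus)):
                continue
            o = idx[corpus[pos + off]]
            # forward pass: predict the context word from the center word
            probs = softmax(W_out @ W_in[c])
            # backward pass: gradient of cross-entropy loss w.r.t. both matrices
            grad = probs.copy()
            grad[o] -= 1.0
            grad_in = W_out.T @ grad
            W_out -= lr * np.outer(grad, W_in[c])
            W_in[c] -= lr * grad_in

# After training, W_in[idx[w]] is the learned embedding for word w;
# words used in similar contexts (e.g. "cat" and "dog") drift toward similar vectors.
embedding = W_in[idx["cat"]]
```

<p>Real implementations (like gensim's <code>Word2Vec</code>) add negative sampling or hierarchical softmax so this scales to large vocabularies, but the core idea is exactly this predict-the-neighbors loop.</p>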
<p>P.S. Sorry for always posting episodes after a significant delay. I'm learning various things myself, I have different blogs to handle, and multiple projects in progress, so my schedule is packed almost every day. I hope you all get some value from my podcasts and that they help you build an intuitive understanding of various topics.</p>
<p>See you in the next podcast episode!</p>
---
Send in a voice message: https://podcasters.spotify.com/pod/show/sarvesh-bhatnagar/message