Hey guys, welcome to another episode on word embeddings! In this episode we talk about another popular word embedding technique known as word2vec. We use word2vec to capture contextual meaning in our vector representations. I've found this useful reading on word2vec; do read it for an in-depth explanation.
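To give a rough feel for how word2vec learns context, here is a minimal sketch of how its skip-gram variant frames the problem: each word is paired with the words around it, and those (center, context) pairs become the training data. The toy sentence and the window size of 2 are my own assumptions, and this only shows the data preparation step, not the actual embedding training.

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs from a token list
    using a sliding window, as in word2vec's skip-gram setup."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the cat sat on the mat".split()
pairs = skipgram_pairs(sentence, window=2)
print(pairs[:4])  # [('the', 'cat'), ('the', 'sat'), ('cat', 'the'), ('cat', 'sat')]
```

Words that appear in similar contexts end up producing similar pairs, which is why the learned vectors for them come out close together.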
P.S. Sorry for always posting episodes after a significant delay. I'm learning various things myself, I have different blogs to handle, and multiple projects in place, so my schedule is packed almost every day. I hope you all get some value from my podcasts and that they help you build an intuitive understanding of various topics.
See you in the next podcast episode!
--- Send in a voice message: https://podcasters.spotify.com/pod/show/sarvesh-bhatnagar/message

In this podcast episode we discuss why word embeddings are required, what they are, and one-hot encodings. In the next episode we will talk about specific word embedding techniques individually. Stay tuned.
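For listeners who want to see one-hot encoding concretely, here is a minimal sketch: each word in the vocabulary maps to a vector that is all zeros except for a single 1 at that word's index. The toy vocabulary is my own assumption.

```python
vocab = ["cat", "dog", "mat"]
index = {word: i for i, word in enumerate(vocab)}

def one_hot(word):
    """Return the one-hot vector for a word in the vocabulary."""
    vec = [0] * len(vocab)
    vec[index[word]] = 1
    return vec

print(one_hot("dog"))  # [0, 1, 0]
```

Note that every pair of distinct one-hot vectors is equally far apart, so this representation carries no notion of similarity or context, which is exactly the gap that learned embeddings fill.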
Sponsored by www.stacklearn.org
---

In this podcast episode we talk about the TF-IDF model in Natural Language Processing. TF-IDF stands for term frequency-inverse document frequency. We use the TF-IDF model to give more weight to important words compared with common words like the, a, in, there, where, etc. To learn Python programming visit www.stacklearn.org. See you in the next podcast episode!
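To show how TF-IDF down-weights common words, here is a minimal sketch using only the standard library. The toy corpus is my own assumption, and I use the simplest formulas (raw-count tf, log of total-documents-over-document-frequency idf); real libraries such as scikit-learn apply smoothed variants.

```python
import math

docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs",
]
tokenized = [d.split() for d in docs]

def tf_idf(term, doc_tokens):
    """Score a term within one document of the corpus."""
    tf = doc_tokens.count(term)              # term frequency in this document
    df = sum(term in d for d in tokenized)   # number of documents containing the term
    idf = math.log(len(tokenized) / df)      # rarer across documents -> larger idf
    return tf * idf

# "the" occurs twice in the first document but appears in most documents,
# so the rarer word "cat" still ends up with the higher score.
print(tf_idf("the", tokenized[0]), tf_idf("cat", tokenized[0]))
```

This is the whole idea in one line: frequency within a document pushes a word's weight up, while frequency across the corpus pulls it back down.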