
# Code Logic, by Sarvesh Bhatnagar

• Education

Code Logic is all about logic development, and it aims to improve problem-solving skills. By listening to this podcast you will learn how to think through a difficult problem and decompose it into multiple simpler problems.


## Collocations, Part Two (S3E2)

Hey guys, this is Sarvesh again. I hope you enjoyed the recent episodes, and as promised, this is another episode on collocations. The techniques covered here are the ones most commonly used to decide whether a pair of words is a collocation. The general principle behind them is hypothesis testing: our null hypothesis is the assumption that the words are not a collocation. To find out more, go ahead and listen to the episode.
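As a minimal sketch of the idea (not taken from the episode, and with purely hypothetical counts), a t-test compares how often two words actually occur together against how often they would co-occur if they were independent; if the statistic exceeds the critical value, we reject the null hypothesis that they are not a collocation:

```python
import math

def t_score(bigram_count, w1_count, w2_count, total_bigrams):
    """t statistic: observed bigram rate vs. the rate expected under the
    null hypothesis that the two words occur independently."""
    observed = bigram_count / total_bigrams
    expected = (w1_count / total_bigrams) * (w2_count / total_bigrams)
    # Bernoulli variance is approximated by the observed mean.
    return (observed - expected) / math.sqrt(observed / total_bigrams)

# Hypothetical counts: "new york" appears 8 times among 10,000 bigrams.
t = t_score(bigram_count=8, w1_count=15, w2_count=10, total_bigrams=10_000)
print(t > 2.576)  # True: exceeds the critical value at alpha = 0.005
```

Since the statistic clears the 0.005 critical value here, these (made-up) counts would let us call the pair a collocation.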

#NLP #NaturalLanguageProcessing #Learn #Something #New

---

Send in a voice message: https://anchor.fm/sarvesh-bhatnagar/message

• 7 min

## Collocations, Part One (S3E1)

Hey guys, sorry for the long break. I was working from home, and it's difficult to record audio at home. Anyway, the podcast is back. I think we will discuss word embeddings and other embeddings later, because we have a lot to learn before that.

About this episode: it focuses on collocations, what they consist of, and two methods for finding them, the first based on frequency and the second based on mean and variance. I hope you learn something from this episode and enjoy it. See you in the next episode.
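The two methods mentioned above can be sketched in a few lines (this is my own illustrative toy example, not code from the episode). The frequency method counts adjacent word pairs; the mean-and-variance method records the signed distance between two words inside a window, where a low variance suggests a rigid, collocation-like pattern:

```python
from collections import Counter

def bigram_frequencies(tokens):
    """Frequency method: count adjacent word pairs; very frequent pairs
    are collocation candidates."""
    return Counter(zip(tokens, tokens[1:]))

def offset_stats(tokens, w1, w2, window=2):
    """Mean-and-variance method: collect signed distances from w1 to w2
    within a window and summarize them."""
    offsets = []
    for i, tok in enumerate(tokens):
        if tok != w1:
            continue
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i and tokens[j] == w2:
                offsets.append(j - i)
    mean = sum(offsets) / len(offsets)
    variance = sum((o - mean) ** 2 for o in offsets) / len(offsets)
    return mean, variance

tokens = "strong tea is great and strong tea is cheap".split()
print(bigram_frequencies(tokens)[("strong", "tea")])  # 2
print(offset_stats(tokens, "strong", "tea"))  # (1.0, 0.0): always adjacent
```

A variance of zero means "tea" always sits exactly one position after "strong", which is what a fixed collocation looks like.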

We are starting afresh with season 3 because I think we were learning in an unstructured manner. I always strive to make good content for my listeners and will try to make it as polished as possible. I might reupload previous episodes. Good luck, have fun, and happy new year!

#collocations #learn #learnsomethingnew #NLP

---

Send in a voice message: https://anchor.fm/sarvesh-bhatnagar/message

• 13 min

## Word Embeddings - A simple introduction to word2vec

Hey guys, welcome to another episode on word embeddings! In this episode we talk about another popular word-embedding technique known as word2vec. We use word2vec to capture contextual meaning in our vector representation. I've found this useful reading on word2vec; do read it for an in-depth explanation.
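To give a flavor of how word2vec grabs context (a minimal sketch of my own, not from the episode): the skip-gram variant turns a sentence into (center, context) pairs, and the model is then trained to predict each context word from its center word. Generating those training pairs looks like this:

```python
def skipgram_pairs(tokens, window=1):
    """Enumerate (center, context) training pairs, as used by the
    skip-gram variant of word2vec."""
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

pairs = skipgram_pairs("we like natural language".split())
print(pairs)
# [('we', 'like'), ('like', 'we'), ('like', 'natural'),
#  ('natural', 'like'), ('natural', 'language'), ('language', 'natural')]
```

Words that appear in similar contexts end up producing similar prediction targets, which is why their learned vectors drift close together.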

P.S. Sorry for always posting episodes after a significant delay. This is because I am learning various things myself, I have different blogs to handle, and there are multiple projects in progress, so my schedule is packed almost every day. I hope you all get some value from my podcasts and that they help you build an intuitive understanding of various topics.

See you in the next podcast episode!

---

Send in a voice message: https://anchor.fm/sarvesh-bhatnagar/message

• 4 min

## Introduction to word embeddings and One hot encoding in NLP

In this podcast episode we discuss why word embeddings are required and what they are, and we also cover one-hot encodings. In the next episodes we will talk about specific word-embedding techniques individually. Stay tuned.
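One-hot encoding is simple enough to sketch directly (an illustrative example of my own, not from the episode): each word in the vocabulary gets a vector that is all zeros except for a single 1 at that word's index:

```python
def one_hot(vocab, word):
    """One-hot encoding: a vector of zeros with a single 1 at the
    word's position in the (sorted) vocabulary."""
    index = {w: i for i, w in enumerate(sorted(vocab))}
    vec = [0] * len(vocab)
    vec[index[word]] = 1
    return vec

vocab = {"cat", "dog", "fish"}
print(one_hot(vocab, "dog"))  # [0, 1, 0]  (sorted vocab: cat, dog, fish)
```

Note the drawback that motivates embeddings: every pair of one-hot vectors is equally distant, so the encoding carries no notion of word similarity.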

Sponsored by www.stacklearn.org

---

Send in a voice message: https://anchor.fm/sarvesh-bhatnagar/message

• 2 min

## Learn about the TF-IDF Model in Natural Language Processing

In this podcast episode we talk about the TF-IDF model in Natural Language Processing. TF-IDF stands for term frequency-inverse document frequency. We use the TF-IDF model to give more weight to important words compared with common words like the, a, in, there, where, etc. To learn Python programming, visit www.stacklearn.org. See you in the next podcast episode!
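As a minimal sketch of the weighting idea (my own toy example, not from the episode): each word's score is its term frequency within a document multiplied by log(N / document frequency), so a word that appears in every document scores zero:

```python
import math
from collections import Counter

def tf_idf(docs):
    """Score each word in each document as tf * log(N / df).
    Words present in every document get a score of zero."""
    n = len(docs)
    df = Counter(word for doc in docs for word in set(doc))
    return [
        {w: (c / len(doc)) * math.log(n / df[w]) for w, c in Counter(doc).items()}
        for doc in docs
    ]

docs = [["the", "cat", "sat"], ["the", "dog", "ran"], ["the", "cat", "ran"]]
scores = tf_idf(docs)
print(scores[0]["the"])  # 0.0: "the" occurs in every document
print(scores[1]["dog"] > 0)  # True: "dog" is distinctive to one document
```

This is exactly the down-weighting of common words like "the" described above: their inverse-document-frequency factor vanishes.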

---

Send in a voice message: https://anchor.fm/sarvesh-bhatnagar/message

• 1 min

## Bag of Words in Natural Language Processing

In this podcast episode we talk about the bag-of-words model in natural language processing. Bag of words is simply a feature-extraction method used in NLP. We mainly discuss why the bag-of-words model is needed and what it is. In summary, a BOW is simply a collection of (word, frequency) pairs.
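That (word, frequency) summary maps directly onto a few lines of code (a toy sketch of my own, not from the episode):

```python
from collections import Counter

def bag_of_words(tokens):
    """Bag of words: keep (word, frequency) pairs, discard word order."""
    return Counter(tokens)

bow = bag_of_words("to be or not to be".split())
print(sorted(bow.items()))  # [('be', 2), ('not', 1), ('or', 1), ('to', 2)]
```

Notice that word order is gone entirely, which is both the model's simplicity and its main limitation.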

To learn more about BOW : visit this

Gensim Introduction : visit this

Also, to support me, do visit www.stacklearn.org

---

Send in a voice message: https://anchor.fm/sarvesh-bhatnagar/message

• 4 min