Machine learning audio course, teaching the fundamentals of machine learning and artificial intelligence. It covers intuition, models (shallow and deep), math, languages, frameworks, etc. Where your other ML resources provide the trees, I provide the forest. Consider MLG your syllabus, with highly-curated resources for each episode's details at ocdevel.com. Audio is a great supplement during exercise, commute, chores, etc.
032 Cartesian Similarity Metrics
Share Gnothi on social media and email me a screenshot/link for 3-month access to Machine Learning Applied; commit code to the GitHub repository for lifetime access.
Normed distances link
A norm is a function that assigns a strictly positive length to each vector in a vector space. link
Minkowski is the generalized distance: p_root(sum(|xi - yi|^p)), where the choice of p (1, 2, ...) gives the metrics below.
L1: Manhattan/city-block/taxicab. abs(x2-x1) + abs(y2-y1). Grid-like distance (the triangle's two legs). Preferred for high-dimensional spaces.
L2: Euclidean. sqrt((x2-x1)^2 + (y2-y1)^2), i.e. the sqrt of a dot product. Straight-line distance; the minimum distance (the Pythagorean triangle's hypotenuse).
Others: Mahalanobis, Chebyshev (p=inf), etc.
Dot product
A type of inner product.
Outer product: lies outside the planes involved.
Inner product: the dot product lies inside the planes/axes involved. link
Dot product: the inner product on a finite-dimensional Euclidean space. link
Cosine (normalized dot)
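The metrics above can be sketched in a few lines of plain Python. This is a minimal illustration, not a production implementation (libraries like SciPy provide optimized versions); the function names are my own:

```python
import math

def minkowski(x, y, p):
    # Generalized distance: p-th root of the sum of |xi - yi|^p.
    return sum(abs(a - b) ** p for a, b in zip(x, y)) ** (1 / p)

def manhattan(x, y):
    # L1 (p=1): grid-like "taxicab" distance, summing the legs of the triangle.
    return sum(abs(a - b) for a, b in zip(x, y))

def euclidean(x, y):
    # L2 (p=2): straight-line distance; the sqrt of the dot product
    # of the difference vector with itself.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def cosine(x, y):
    # Dot product normalized by the vectors' lengths:
    # 1 = same direction, 0 = orthogonal (magnitude is ignored).
    return dot(x, y) / (math.sqrt(dot(x, x)) * math.sqrt(dot(y, y)))

print(manhattan([0, 0], [3, 4]))   # 7 (legs: 3 + 4)
print(euclidean([0, 0], [3, 4]))   # 5.0 (hypotenuse)
print(cosine([1, 0], [2, 0]))      # 1.0 (same direction)
print(cosine([1, 0], [0, 1]))      # 0.0 (orthogonal)
```

Note how `minkowski(x, y, 1)` reproduces Manhattan and `minkowski(x, y, 2)` reproduces Euclidean, which is what makes Minkowski the generalized form.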
031 The Podcasts Return
The podcasts return with new content, especially about NLP: BERT, transformers, spaCy, Gensim, NLTK. They're accompanied by a community project, Gnothi, a journal that uses AI to provide insights and resources. Website: https://gnothiai.com; project: https://github.com/lefnire/gnothi. Share the website on social media and email me a link/screenshot for 3 months of free access to Machine Learning Applied; contribute to the GitHub repository for free access for life.
030 Podcast Update
Show notes: https://ocdevel.com/mlg/30. Re-doing MLG, new podcast Machine Learning Applied, new project Gnothi, new resources page.
029 Reinforcement Learning Intro
Introduction to reinforcement learning concepts.
ocdevel.com/mlg/29 for notes and resources.
028 Hyperparameters 2
Hyperparameters part 2: hyper-search, regularization, SGD optimizers, scaling.
ocdevel.com/mlg/28 for notes and resources.
027 Hyperparameters 1
Hyperparameters part 1: network architecture.
ocdevel.com/mlg/27 for notes and resources.
Love it so far!
It’s all in the title. The collection of resources for follow-up afterward is especially great.
Wonderful examples, enthusiasm, and resources
Misleading the new folks
It’s not a good idea to tell new folks not to try to understand the underlying algorithms, like random forests and gradient boosting. These are core methodologies that should have been explained better. Discussing unsupervised market basket analysis and PCA together doesn’t really make sense. There’s no explicit mention of the data engineering side.