Machine learning with Python!
We work through a series of machine learning exercises on Kaggle and other datasets in Python. We cover the basics of interfacing with Kaggle and downloading datasets from different websites, then progress from logistic regression, to decision trees and CARTs, to ensemble methods, and on to machine learning with multilayer perceptrons (MLPs), convolutional neural networks (CNNs), and recurrent neural networks (RNNs).
Sometimes you might see blank or unfinished posts. I'll come back to them in time.
If you're keen to learn more about a topic, contact me or leave a comment on the relevant post.
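The series begins with logistic regression and later covers cost functions and gradient descent. As a minimal, dependency-free sketch of how those two fit together (a toy synthetic dataset and plain-Python batch gradient descent — an illustration, not code from any of the posts):

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=1000):
    """Fit logistic regression by minimizing log-loss with batch gradient descent."""
    n_features = len(X[0])
    w = [0.0] * n_features
    b = 0.0
    for _ in range(epochs):
        grad_w = [0.0] * n_features
        grad_b = 0.0
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # derivative of log-loss w.r.t. the logit
            for j in range(n_features):
                grad_w[j] += err * xi[j]
            grad_b += err
        # Step against the averaged gradient
        w = [wj - lr * gj / len(X) for wj, gj in zip(w, grad_w)]
        b -= lr * grad_b / len(X)
    return w, b

def predict(w, b, xi):
    return 1 if sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) >= 0.5 else 0

# Toy linearly separable data: label is 1 when x0 + x1 > 1
random.seed(0)
X = [[random.random(), random.random()] for _ in range(200)]
y = [1 if x0 + x1 > 1 else 0 for x0, x1 in X]

w, b = fit_logistic(X, y)
accuracy = sum(predict(w, b, xi) == yi for xi, yi in zip(X, y)) / len(X)
print(f"training accuracy: {accuracy:.2f}")
```

In practice the posts use library implementations (e.g. scikit-learn) rather than hand-rolled gradient descent; the point here is only the mechanics of the cost-function/gradient loop.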
- Hybrid machine learning solution with Google Colab
- How many neurons and layers for a multilayer perceptron (MLP)?
- Bayes' theorem
- Cluster analysis
- kNN vs k-Means
- Kaggle: Credit risk (Model: Gradient Boosting Machine - LightGBM)
- Kaggle: Credit risk (Model: Random Forest)
- Kaggle: Credit risk (Model: Decision Tree)
- Kaggle: Credit risk (Model: Support Vector Machines)
- Kaggle: Credit risk (Model: Logit)
- Kaggle: Credit risk (Feature Engineering: Automated)
- Kaggle: Credit risk (Feature Engineering: Part 3)
- Kaggle: Credit risk (Feature Engineering: Part 2)
- Kaggle: Credit risk (Feature Engineering: Part 1)
- Kaggle: Credit risk (Exploratory Data Analysis)
- Machine learning model performance metrics
- Adaptive Boosting vs Gradient Boosting
- Bagging vs Boosting
- Dealing with imbalanced datasets
- What are decision trees and CARTs?
- Cost functions, gradient descent, and gradient boost
- Mitigating model overfit
- Working with Kaggle datasets
- Steps for building a machine learning model