Personal workspace for the 100Day-ML-Marathon, a challenge to build up your machine learning skills on Kaggle over 100 consecutive days.
📜 My Certificate of Completion: PDF
- Day 001 : Data Introduction and Assessment
- Day 002 : Exploratory Data Analysis (EDA)
- Day 003 : Build Pandas DataFrame
- Day 004 : Pandas Data Types
- Day 005 : EDA Distribution
- Day 006 : Handle Outlier Data
- Day 007 : Normalize Continuous Data
- Day 008 : DataFrame Operations / DataFrame Merge
- Day 009 : EDA Correlation 1
- Day 010 : EDA Correlation 2
- Day 011 : Kernel Density Estimation (KDE)
- Day 012 : Discretization Method
- Day 013 : Implement Discretization Method
- Day 014 : Subplots using Matplotlib
- Day 015 : Heatmap and Grid-plot
- Day 016 : Logistic Regression
- Day 017 : Introduction to Feature Engineering
- Day 018 : Feature Types
- Day 019 : [Value Type] Fill in Missing Values
- Day 020 : [Value Type] Remove Outlier
- Day 021 : [Value Type] Remove Bias
- Day 022 : [Class Type] One-Hot and Label Encoding
- Day 023 : [Class Type] Mean Encoding
- Day 024 : [Class Type] Other Advanced Processing
- Day 025 : [Time Type] Time Cycle
- Day 026 : Feature Combination (Value and Value)
- Day 027 : Feature Combination (Value and Class)
- Day 028 : Feature Selection
- Day 029 : Feature Estimation
- Day 030 : Leaf Encoding on Class Type Feature
- Day 031 : Introduction to Machine Learning
- Day 032 : Framework and Process in Machine Learning
- Day 033 : How to Teach a Machine?
- Day 034 : Split Training and Evaluation Set
- Day 035 : Regression vs. Classification
- Day 036 : Evaluation Metrics
- Day 037 : Regression Model Introduction (Linear / Logistic)
- Day 038 : Regression Model Implementation (Linear / Logistic)
- Day 039 : Regression Model Introduction (LASSO / Ridge)
- Day 040 : Regression Model Implementation (LASSO / Ridge)
- Day 041 : Tree-Based Model Introduction (Decision Tree)
- Day 042 : Tree-Based Model Implementation (Decision Tree)
- Day 043 : Tree-Based Model Introduction (Random Forest)
- Day 044 : Tree-Based Model Implementation (Random Forest)
- Day 045 : Tree-Based Model Introduction (Gradient Boosting Machine)
- Day 046 : Tree-Based Model Implementation (Gradient Boosting Machine)
- Day 047 : Hyper-Parameter Tuning and Optimization
- Day 048 : Introduction to Kaggle
- Day 049 : Blending Method
- Day 050 : Stacking Method
- Day 054 : Introduction to Unsupervised Learning
- Day 055 : Clustering Method
- Day 056 : K-Means
- Day 057 : Hierarchical Clustering
- Day 058 : Hierarchical Clustering on 2D Toy Dataset
- Day 059 : Dimension Reduction - PCA
- Day 060 : PCA on MNIST
- Day 061 : Dimension Reduction - t-SNE
- Day 062 : t-SNE Implementation
- Day 063 : Introduction to Neural Networks
- Day 064 : Hands-on with TensorFlow Playground (Learning Rate)
- Day 065 : Hands-on with TensorFlow Playground (Activation Function / Regularization)
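
For quick reference, here is a minimal sketch of the supervised-learning workflow covered in Days 034-047 above: split the data, fit a tree-based model, and score it on the hold-out set. The synthetic dataset and hyper-parameter values are illustrative placeholders, not the marathon's actual homework data.

```python
# Minimal sketch of the Day 034-047 workflow: train/test split,
# a tree-based model, and an evaluation metric.
# make_classification is a stand-in for the Kaggle data used in the course.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

model = RandomForestClassifier(n_estimators=100, max_depth=5, random_state=42)
model.fit(X_train, y_train)

proba = model.predict_proba(X_test)[:, 1]
print(f"Hold-out AUC: {roc_auc_score(y_test, proba):.4f}")
```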
Deep Learning with Keras
- Day 066 : Introduction to Keras
- Day 067 : Keras Dataset
- Day 068 : Keras Sequential API
- Day 069 : Keras Module API
- Day 070 : Multi-Layer Perceptron (MLP)
- Day 071 : Loss Functions
- Day 072 : Activation Function
- Day 073 : Gradient Descent (1/2)
- Day 074 : Gradient Descent (2/2)
- Day 075 : Back Propagation
- Day 076 : Optimizers
- Day 077 : Validation and Overfitting
- Day 078 : Key Points before Training a Model
- Day 079 : Learning Rate Effect
- Day 080 : Combination of Optimizer and Learning Rate
- Day 081 : Avoid Overfitting - Regularization
- Day 082 : Avoid Overfitting - Dropout
- Day 083 : Avoid Overfitting - Batch Normalization
- Day 084 : Avoid Overfitting - Hyper-Parameter Tuning and Comparison
- Day 085 : Avoid Overfitting - Early Stopping
- Day 086 : Saving and Restoring Model
- Day 087 : Learning Rate Decay
- Day 088 : Design Your Keras Callback Function
- Day 089 : Design Your Loss Function
- Day 090 : Image Recognition using Traditional Computer Vision Methods
- Day 091 : Image Recognition using Machine Learning Model
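
The Keras topics in Days 066-089 above boil down to the pattern below: build a Sequential MLP, pick a loss and optimizer, and guard against overfitting with Dropout and EarlyStopping. This is a minimal sketch with illustrative, untuned hyper-parameters, assuming a TensorFlow 2.x / tf.keras environment.

```python
# Minimal MLP sketch for Days 066-089: Sequential API, Dropout,
# and an EarlyStopping callback monitored on the validation loss.
from tensorflow import keras
from tensorflow.keras import layers

(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

model = keras.Sequential([
    keras.Input(shape=(784,)),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.3),                   # Day 082: Dropout against overfitting
    layers.Dense(128, activation="relu"),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

early_stop = keras.callbacks.EarlyStopping(   # Day 085: early stopping
    monitor="val_loss", patience=3, restore_best_weights=True)

model.fit(x_train, y_train, validation_split=0.2,
          epochs=20, batch_size=128, callbacks=[early_stop], verbose=2)
print(model.evaluate(x_test, y_test, verbose=0))
```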
Convolutional Neural Network (CNN) in Deep Learning
- Day 092 : Introduction to CNN (1/2)
- Day 093 : Introduction to CNN (2/2)
- Day 094 : Parameter Tuning in CNN Layers
- Day 095 : Pooling Layer in Keras
- Day 096 : CNN Layer in Keras
- Day 097 : CNN vs. DNN on CIFAR-10
- Day 098 : Data Generator in Keras
- Day 099 : Data Augmentation in Keras
- Day 100 : Transfer Learning
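
To close out the CNN block (Days 092-100 above), here is a small Conv2D/MaxPooling stack on CIFAR-10 fed by the ImageDataGenerator augmentation API used in the course era (newer Keras releases favor preprocessing layers instead). The architecture and augmentation settings are illustrative, not the assignments' exact configuration.

```python
# Minimal CNN sketch for Days 092-100: convolution + pooling layers on
# CIFAR-10, trained on an augmented data generator.
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.preprocessing.image import ImageDataGenerator

(x_train, y_train), (x_test, y_test) = keras.datasets.cifar10.load_data()
x_train = x_train.astype("float32") / 255.0
x_test = x_test.astype("float32") / 255.0

model = keras.Sequential([
    keras.Input(shape=(32, 32, 3)),
    layers.Conv2D(32, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Days 098-099: data generator with simple augmentation
augmenter = ImageDataGenerator(rotation_range=15, width_shift_range=0.1,
                               height_shift_range=0.1, horizontal_flip=True)

model.fit(augmenter.flow(x_train, y_train, batch_size=64),
          epochs=10, validation_data=(x_test, y_test), verbose=2)
```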