Supervised Learning — Decision Tree
Single Decision Tree
The tree is constructed by recursively splitting nodes on features, choosing each split with a criterion such as the Gini index, entropy, or misclassification error.
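As a minimal sketch of one of these criteria, the Gini index measures how mixed the class labels at a node are (the function name `gini` below is our own, not from the notes):

```python
# Gini impurity: 1 - sum(p_k^2) over the class probabilities p_k at a node.
# 0.0 means the node is pure (one class); higher means more mixed.
from collections import Counter

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((count / n) ** 2 for count in Counter(labels).values())

print(gini([0, 0, 1, 1]))  # → 0.5 (a 50/50 node is maximally impure for 2 classes)
print(gini([1, 1, 1, 1]))  # → 0.0 (a pure node)
```

A split is chosen so that the impurity of the child nodes (weighted by size) is as low as possible.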
Decision trees make it easy to trace how a result was obtained. e.g. If a tree-based AI model predicts that an item coming off an assembly line is defective based on some measurements, the split path can be followed to find the machine that may have caused the defect.
Limitation of a Single Decision Tree — it is sensitive to the training data: the tree can change drastically if data points are added to or removed from the training set.
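The assembly-line example above can be sketched with scikit-learn; the measurements, labels, and feature names here are made up for illustration:

```python
# Fit a small decision tree and print its learned splits, showing how a
# prediction can be traced back to the measurements that caused it.
from sklearn.tree import DecisionTreeClassifier, export_text

X = [[2.0, 1.0], [3.0, 1.5], [1.0, 3.0], [1.5, 2.5]]  # hypothetical measurements
y = [0, 0, 1, 1]                                      # 0 = ok, 1 = defective

clf = DecisionTreeClassifier(criterion="gini", random_state=0)
clf.fit(X, y)

# The tree's rules are human-readable, which is what makes the
# "which machine caused the defect?" question answerable.
print(export_text(clf, feature_names=["width", "height"]))
```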
Bootstrap Aggregation or Bagging
The input data is resampled (with replacement) into n bootstrap sets; n decision trees are created, and each tree is trained on a different set.
The final result is determined by majority vote (classification) or by averaging (regression).
A widely used model based on this idea is the Random Forest.
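A short sketch of bagging via scikit-learn's RandomForestClassifier, on synthetic data used only for illustration:

```python
# Each of the 25 trees is trained on a different bootstrap sample of the data;
# the forest combines their predictions by majority vote.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=200, n_features=5, random_state=0)

forest = RandomForestClassifier(n_estimators=25, bootstrap=True, random_state=0)
forest.fit(X, y)
print(forest.score(X, y))  # training accuracy
```

Because each tree sees a different sample, no single added or removed data point can drastically change the ensemble, which addresses the single-tree limitation above.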
Boosting is another method that uses decision trees in sequence, with each tree correcting the errors of the previous ones.
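One common boosting variant is AdaBoost, sketched here with scikit-learn on synthetic data (for illustration only):

```python
# AdaBoost fits shallow trees one after another, increasing the weight of
# the examples the previous trees misclassified so later trees focus on them.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

X, y = make_classification(n_samples=200, n_features=5, random_state=0)

booster = AdaBoostClassifier(n_estimators=50, random_state=0)
booster.fit(X, y)
print(booster.score(X, y))  # training accuracy
```

Unlike bagging, where the trees are independent and trained in parallel, boosting's trees depend on each other and must be trained sequentially.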