Feature-importance-guided algorithm boosts decision tree accuracy for better learning performance
Decision tree algorithms struggle with small training sets: repeated splitting leaves too few examples at the lower levels of the tree. To address this, a new algorithm called Importance Aided Decision Tree (IADT) uses feature-importance scores to guide the tree-building process. By favoring the most important attributes at each split, IADT produces more accurate and robust decision trees than traditional methods. Theoretical and empirical analyses show that IADT outperforms standard decision tree algorithms.
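The core idea of weighting split selection by feature importance can be illustrated with a minimal sketch. This is not the authors' actual IADT implementation; it assumes importance scores are supplied externally (e.g., from a pre-trained ensemble), and the names `info_gain`, `best_split`, and the toy data are hypothetical.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, feature):
    """Information gain from splitting `rows` on `feature`."""
    n = len(rows)
    child = 0.0
    for v in set(r[feature] for r in rows):
        sub = [l for r, l in zip(rows, labels) if r[feature] == v]
        child += len(sub) / n * entropy(sub)
    return entropy(labels) - child

def best_split(rows, labels, importance):
    # Importance-aided choice (sketch): weight each feature's gain by its
    # externally supplied importance score, so attributes deemed important
    # are preferred when building the tree. Unlisted features default to 1.0.
    scores = {f: info_gain(rows, labels, f) * importance.get(f, 1.0)
              for f in rows[0]}
    return max(scores, key=scores.get)

# Toy data: feature "a" has higher raw gain than "b".
rows = [
    {"a": 0, "b": 0},
    {"a": 0, "b": 0},
    {"a": 1, "b": 1},
    {"a": 1, "b": 0},
]
labels = [0, 0, 1, 1]

print(best_split(rows, labels, {}))          # plain gain picks "a"
print(best_split(rows, labels, {"b": 4.0}))  # a strong importance score tips the choice to "b"
```

With uniform importance the selection reduces to ordinary information gain; a high importance score on a feature can override a modest gain difference, which is the mechanism IADT uses to keep important attributes near the top of the tree.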