# Disadvantages of Using Decision Trees


**Disadvantages of Using Decision Trees**
Anneke Zwart, Student (University), Netherlands, Moderator

Although decision trees can support effective decision making in product development, research and development, and innovation programs, there are several pitfalls in using this method. Storey mentions the following problems:

- TIME-INTENSIVE: A decision tree can categorize quickly once the tree has been built. However, it often takes a lot of time to develop such a tree.
- ERROR PROPAGATION: This refers to the influence of the uncertainties of variables on the uncertainty of the function that is based on those variables. A decision tree works through a series of decisions. When one of those decisions is incorrect, every decision from that point on is likely to be wrong as well, and the chance of returning to the right path is very small.

The existence and importance of these problems differ across decision tree applications.

⇒ Can you think of other problems or issues associated with using decision trees?

Source: Storey, J., Decision Trees (PowerPoint)
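The error-propagation point can be illustrated with a toy tree. This is a minimal sketch, not from the source: the `classify` function, its thresholds, and the income/age features are all made up for illustration. The idea is that once noise flips the outcome of the root decision, every subsequent decision happens in the wrong subtree.

```python
# Toy illustration of error propagation in a decision tree:
# one wrong decision at the root routes the sample into the wrong
# subtree, so all later decisions cannot correct the mistake.
# (Hypothetical thresholds and features, for illustration only.)

def classify(income, age):
    # Root decision: income threshold
    if income > 50_000:
        # Every decision below here assumes the root was correct
        return "approve" if age > 25 else "review"
    else:
        return "reject" if age < 60 else "review"

true_income = 52_000    # actual value is above the root threshold
noisy_income = 48_000   # measurement noise pushes it below the threshold

print(classify(true_income, 30))   # -> approve
print(classify(noisy_income, 30))  # -> reject: one wrong root decision
                                   #    flips the final outcome entirely
```

A small measurement error near a split threshold does not degrade the answer gradually, as it might in a smooth model; it switches the sample onto an entirely different decision path.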

**Disadvantages of Decision Trees**
Anonymous

1. Tree structure is prone to sampling errors – While decision trees are generally robust to outliers, their tendency to overfit makes them prone to sampling errors. If the sampled training data differs somewhat from the evaluation or scoring data, decision trees tend not to produce good results.
2. Tree splitting is locally greedy – At each level, the tree looks for the binary split that reduces impurity by the maximum amount. This is a greedy algorithm that achieves only a local optimum. It may be possible, for example, to accept a less-than-maximum drop in impurity at the current level in order to achieve the lowest possible impurity in the final tree, but the splitting algorithm cannot see beyond the current level. This means the resulting decision tree is typically locally optimal, not globally optimal.
3. Finding the optimal decision tree is an NP-complete problem – Because of the number of feature variables, the number of potential split points, and the depth of the tree, the total number of possible trees for the same input dataset is enormous. Thus, not only is tree splitting non-global, computing the globally optimal tree is also practically impossible.
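The locally greedy split rule in point 2 can be sketched in a few lines. This is an illustrative sketch, not the source's algorithm: the `gini` and `best_split` helpers and the toy data are my own, but they follow the standard CART-style recipe of scoring every candidate threshold by its immediate impurity drop and keeping the best one, with no look-ahead to deeper levels.

```python
# Sketch of a locally greedy binary split on one feature:
# score every candidate threshold by the immediate drop in Gini
# impurity and keep the best. No look-ahead to deeper levels.

def gini(labels):
    """Gini impurity of a list of 0/1 class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p1 = sum(labels) / n
    return 1.0 - p1 ** 2 - (1.0 - p1) ** 2

def best_split(xs, ys):
    """Return (threshold, impurity_gain) of the greedily best split."""
    parent = gini(ys)
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        # Weighted impurity of the two children after splitting at t
        child = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        gain = parent - child
        if best is None or gain > best[1]:
            best = (t, gain)
    return best

# Toy data: feature values 1..6, classes split cleanly at 3
xs = [1, 2, 3, 4, 5, 6]
ys = [0, 0, 0, 1, 1, 1]
print(best_split(xs, ys))  # -> (3, 0.5): splitting at 3 makes both children pure
```

Note that `best_split` only ever evaluates one level of the tree. That is precisely the limitation point 2 describes: a split that looks best here may not lead to the best tree overall, and point 3 explains why the algorithm settles for this shortcut anyway.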