
Softening Splits in Decision Trees using Simulated Annealing

Publication at Faculty of Mathematics and Physics | 2007

Abstract

Predictions computed by a classification tree are usually constant on the axis-parallel hyperrectangles corresponding to the leaves and exhibit abrupt jumps at their boundaries. A better approximation can often be expected if the prediction function of the original tree is replaced by a continuous one.

The approximation is constructed from the same training data on which the original tree was grown, and the structure of the tree is preserved. The present paper uses the model of trees with soft splits suggested by Quinlan and implemented in C4.5; however, the training algorithm is substantially different.
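To illustrate the idea of a soft split (this is only a minimal sketch, not the paper's parameterization; the names `SoftSplitNode`, `softness`, and the logistic weighting are illustrative assumptions), each hard test x_j < t can be replaced by a smooth weight, so the two child predictions are blended continuously near the boundary instead of jumping:

```python
# Minimal sketch of a tree with soft splits (illustrative, not the paper's code).
# Each internal node blends its children with a logistic weight instead of a hard test.
import math
from dataclasses import dataclass
from typing import Union


@dataclass
class Leaf:
    value: float  # constant prediction of the leaf


@dataclass
class SoftSplitNode:
    feature: int        # index of the split attribute
    threshold: float    # split point inherited from the original (hard) tree
    softness: float     # width of the transition region; -> 0 recovers the hard split
    left: "Node"
    right: "Node"


Node = Union[Leaf, SoftSplitNode]


def predict(node: Node, x: list[float]) -> float:
    """Continuous prediction: a convex combination of the child predictions."""
    if isinstance(node, Leaf):
        return node.value
    # Logistic weight of the 'right' branch; near the threshold both branches contribute.
    z = (x[node.feature] - node.threshold) / max(node.softness, 1e-12)
    z = max(-500.0, min(500.0, z))  # guard against overflow far from the threshold
    w_right = 1.0 / (1.0 + math.exp(-z))
    return (1.0 - w_right) * predict(node.left, x) + w_right * predict(node.right, x)


# Example: a single soft split at x[0] = 0.5 blending leaf values 0 and 1.
tree = SoftSplitNode(feature=0, threshold=0.5, softness=0.1,
                     left=Leaf(0.0), right=Leaf(1.0))
print(predict(tree, [0.5]))   # 0.5, exactly on the boundary
print(predict(tree, [0.9]))   # close to 1.0
```

Letting the softness shrink toward zero recovers the original piecewise-constant tree, so the soft model strictly generalizes the hard one.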

The method uses simulated annealing, so it is quite computationally expensive. However, numerical tests with data derived from an experiment in particle physics show that, besides the expected better approximation of the training data, a smaller generalization error is also achieved.
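To convey the flavour of such a training step (again only a sketch under illustrative assumptions: the objective, proposal distribution, and cooling schedule below are not taken from the paper), one can perturb the threshold and softness parameters of the soft tree and accept worse states with a temperature-dependent probability, reusing the `predict`, `Leaf`, and `SoftSplitNode` definitions from the sketch above:

```python
# Illustrative simulated annealing over the soft-split parameters of the tree above.
import math
import random


def collect_nodes(node: Node) -> list[SoftSplitNode]:
    """All internal nodes, whose (threshold, softness) are the annealed parameters."""
    if isinstance(node, Leaf):
        return []
    return [node] + collect_nodes(node.left) + collect_nodes(node.right)


def squared_error(tree: Node, data: list[tuple[list[float], float]]) -> float:
    return sum((predict(tree, x) - y) ** 2 for x, y in data)


def anneal(tree: Node, data, steps: int = 5000, t0: float = 1.0, cooling: float = 0.999):
    nodes = collect_nodes(tree)
    current = squared_error(tree, data)
    temperature = t0
    rng = random.Random(0)
    for _ in range(steps):
        node = rng.choice(nodes)
        old = (node.threshold, node.softness)
        # Propose a small random perturbation of one node's parameters.
        node.threshold += rng.gauss(0.0, 0.05)
        node.softness = max(1e-3, node.softness + rng.gauss(0.0, 0.02))
        candidate = squared_error(tree, data)
        # Metropolis acceptance: always accept improvements, sometimes accept worse states.
        if candidate > current and rng.random() >= math.exp((current - candidate) / temperature):
            node.threshold, node.softness = old          # reject: restore parameters
        else:
            current = candidate                          # accept the proposal
        temperature *= cooling
    return tree, current


# Toy usage: fit the soft boundary to noisy step data around x = 0.6.
data = [([x / 100.0], 0.0 if x < 60 else 1.0) for x in range(100)]
annealed_tree, final_err = anneal(tree, data)
print(round(annealed_tree.threshold, 2), round(final_err, 3))
```

Each annealing step requires re-evaluating the tree on the training data, which is where the quoted computational cost comes from.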