Steinbeis-Transferzentrum Data Analytics und Predictive Modelling
VisualApp

Decision Tree

Decision trees make classification decisions through a sequence of simple yes/no questions. This app shows how the tree is built from training data and how depth and structure affect the decision boundary.

About the App

The app visualises how a decision tree (CART – Classification and Regression Tree) is built from training data. The algorithm recursively splits the feature space into rectangular regions — each split is determined by a simple threshold condition on one of the features.
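The splitting step can be sketched in a few lines of Python. This is a minimal illustration, not the app's actual code: it scans candidate thresholds on each feature and keeps the split with the lowest weighted Gini impurity. The function names and toy data are hypothetical.

```python
# Minimal sketch of one CART split: try every threshold on every feature
# and keep the split that minimises the weighted Gini impurity.
# Toy data and names are illustrative, not taken from the app.

def gini(labels):
    """Gini impurity: 1 - sum(p_k^2) over class proportions p_k."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(X, y):
    """Return (feature index, threshold) minimising the weighted impurity."""
    n = len(y)
    best = (None, None, float("inf"))
    for f in range(len(X[0])):
        for t in sorted(set(row[f] for row in X)):
            left = [y[i] for i in range(n) if X[i][f] <= t]
            right = [y[i] for i in range(n) if X[i][f] > t]
            if not left or not right:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / n
            if score < best[2]:
                best = (f, t, score)
    return best[0], best[1]

# Two classes separable on the first feature:
X = [[1.0, 1.0], [2.0, 1.5], [3.0, 3.0], [4.0, 2.5]]
y = [0, 0, 1, 1]
print(best_split(X, y))  # → (0, 2.0): split on feature 0 at threshold 2.0
```

Applied recursively to the left and right subsets, this search produces exactly the rectangular regions described above.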

The tree structure is shown below the visualisation area: each node displays the decision rule, the associated impurity (Gini index), and the number of training examples. The depth of the tree controls how finely the feature space is subdivided.
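The Gini index displayed at each node is 1 minus the sum of squared class proportions: 0 for a pure node, 0.5 for a maximally mixed two-class node. A quick illustration with hypothetical class counts:

```python
# Gini impurity of a node from its class counts: 1 - sum((c/n)^2).
# The counts below are hypothetical, not taken from the app's dataset.

def gini_from_counts(counts):
    n = sum(counts)
    return 1.0 - sum((c / n) ** 2 for c in counts)

print(gini_from_counts([10, 10]))  # → 0.5   (maximally mixed, two classes)
print(gini_from_counts([20, 0]))   # → 0.0   (pure node)
print(gini_from_counts([15, 5]))   # → 0.375 (partially mixed)
```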

What can I do?

Use the slider at the top to set the maximum tree depth (1–4). A shallow tree produces a simple decision boundary that generalises well but may underfit; a deep tree can capture more complex patterns but tends to overfit the training data.
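A rough intuition for the slider: a binary tree of depth d has at most 2^d leaves, so the depth setting bounds how many rectangular regions the feature space can be cut into. A one-liner makes the bound concrete for the app's range:

```python
# Upper bound on leaf regions for each slider setting (depth 1-4):
# a binary tree of depth d has at most 2**d leaves.
for depth in range(1, 5):
    print(f"max depth {depth}: at most {2 ** depth} regions")
# → at most 2, 4, 8, 16 regions respectively
```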

Data points can be moved, added, or removed — the tree is retrained on every change, and both the decision boundary and the tree diagram update in real time.

Interested in AI visualisations for teaching?

The VisualApps are created as a teaching and transfer project at Reutlingen University and are used in corporate training and talks.

Get in touch