Another natural approach to classification is the decision tree or decision table. For example, one might use the following decision tree to distinguish the upper-case letters "A", "B", "C" and "D":
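The tree itself can be sketched in code. The following is a hypothetical reconstruction, not the original figure: it assumes two illustrative features, the number of holes (enclosed regions) in the printed letter and whether the letter has a straight vertical edge on its left side.

```python
def count_holes(letter):
    # Assumed hole counts for these block capitals: "B" encloses two
    # regions, "A" and "D" enclose one each, "C" encloses none.
    return {"A": 1, "B": 2, "C": 0, "D": 1}[letter]

def has_straight_left_edge(letter):
    # Hypothetical second feature: "B" and "D" have a straight left
    # stroke, while "A" has slanted sides.
    return letter in ("B", "D")

def classify(letter):
    # Walk the decision tree: test the number of holes first, then
    # break the A/D tie with the left-edge feature.
    holes = count_holes(letter)
    if holes == 0:
        return "C"
    if holes == 2:
        return "B"
    return "D" if has_straight_left_edge(letter) else "A"
```

Each internal node asks one question about a feature, and each path from root to leaf corresponds to one letter.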

This method works best when there is no uncertainty in the feature values. It can be extended by employing "fuzzy features". For example, if the figure contains one large hole and one very small hole, it might be possible to give a score or fuzzy number or probability value between 0 and 1 as an answer to the question "Is the number of holes equal to zero?". Then one might choose some threshold, so that the answer is "Yes" whenever the score exceeds the threshold. Alternatively, if the score falls in some "gray area" between thresholds, one might follow more than one path through the tree and develop scores for the various terminal nodes.
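The terminal-node scoring idea can be sketched as follows. This is a minimal illustration, assuming the same hypothetical A/B/C/D tree (holes first, then a left-edge feature) and treating each fuzzy answer as a probability that is propagated down both branches.

```python
def terminal_scores(p_zero_holes, p_two_holes, p_left_edge):
    # Each argument is a fuzzy answer in [0, 1] to one node's question.
    # Instead of committing to one branch, follow both, weighting each
    # leaf by the product of the branch scores along its path.
    scores = {"C": p_zero_holes}
    p_some_holes = 1 - p_zero_holes
    scores["B"] = p_some_holes * p_two_holes
    p_one_hole = p_some_holes * (1 - p_two_holes)
    scores["D"] = p_one_hole * p_left_edge
    scores["A"] = p_one_hole * (1 - p_left_edge)
    return scores
```

With this weighting the leaf scores always sum to 1, so the result can be read as a distribution over the four letters rather than a single forced decision.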

Decision trees are appealing for their simplicity and efficiency. However, they tend to become unmanageable for processing sensory input data. When there is significant uncertainty, it is usually preferable to use a procedure designed for uncertainty rather than to "patch up" a decision tree by embellishing it with extra, ad hoc mechanisms.
