Recursive partitioning


Recursive partitioning is a statistical method for multivariable analysis. It creates a decision tree that strives to correctly classify members of a population by splitting the population into sub-populations based on several dichotomous independent variables. The process is termed recursive because each sub-population may in turn be split further, and the splitting terminates when a particular stopping criterion is reached.
Figure: an example decision tree for survival. The figures under the leaves show the probability of survival and the percentage of observations in each leaf; in summary, the chances of survival were good for a female or a young boy without several family members.
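To make the splitting process concrete, the following is a minimal sketch, not any particular published algorithm: it recursively partitions a small invented data set on dichotomous findings, choosing at each node the split with the fewest misclassifications, and stops when a node is pure, too small, or out of features.

```python
from collections import Counter

def split_node(rows, labels, features, min_size=2):
    """Recursively partition `rows` (dicts of 0/1 features) to predict `labels`."""
    counts = Counter(labels)
    majority, _ = counts.most_common(1)[0]
    # Stopping criteria: pure node, too few observations, or no features left.
    if len(counts) == 1 or len(rows) < min_size or not features:
        return {"prediction": majority, "n": len(rows)}

    # Pick the dichotomous feature whose split yields the fewest misclassifications.
    def misclassified(feature):
        errors = 0
        for value in (0, 1):
            side = [lab for row, lab in zip(rows, labels) if row[feature] == value]
            if side:
                errors += len(side) - Counter(side).most_common(1)[0][1]
        return errors

    best = min(features, key=misclassified)
    remaining = [f for f in features if f != best]

    node = {"feature": best, "n": len(rows)}
    for value in (0, 1):
        sub_rows = [r for r in rows if r[best] == value]
        sub_labels = [lab for r, lab in zip(rows, labels) if r[best] == value]
        if sub_rows:
            node[value] = split_node(sub_rows, sub_labels, remaining, min_size)
        else:
            node[value] = {"prediction": majority, "n": 0}
    return node

# Invented toy data: two dichotomous findings predicting a binary outcome.
rows = [{"finding_x": 1, "finding_y": 0}, {"finding_x": 1, "finding_y": 1},
        {"finding_x": 0, "finding_y": 1}, {"finding_x": 0, "finding_y": 0}]
labels = [1, 1, 1, 0]
print(split_node(rows, labels, features=["finding_x", "finding_y"]))
```

The nested dictionary returned by the sketch is a decision tree: each internal node records the splitting variable, and each leaf records the predicted class and the number of observations that reached it.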
Recursive partitioning methods have been developed since the 1980s. Well-known methods of recursive partitioning include Ross Quinlan's ID3 algorithm and its successors, C4.5 and C5.0, as well as Classification and Regression Trees (CART). Ensemble learning methods such as random forests help to overcome a common criticism of these methods, their vulnerability to overfitting, by building many trees on resampled data and combining their output.
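The contrast between a single tree and an ensemble can be seen in a short sketch, assuming scikit-learn is available and using synthetic data purely for illustration: an unpruned tree typically fits the training data almost perfectly but generalizes worse than a random forest built from many trees.

```python
# Sketch only: synthetic data, default settings; not a benchmark.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)        # one unpruned tree
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

print("single tree   train/test accuracy:", tree.score(X_train, y_train), tree.score(X_test, y_test))
print("random forest train/test accuracy:", forest.score(X_train, y_train), forest.score(X_test, y_test))
```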
This article focuses on recursive partitioning for medical diagnostic tests, but the technique has far wider applications. See decision tree.
As compared to regression analysis, which creates a formula that health care providers can use to calculate the probability that a patient has a disease, recursive partitioning creates a rule such as 'If a patient has finding x, y, or z, they probably have disease q'.
A variation is 'Cox linear recursive partitioning'.
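The difference in output can be illustrated with a hedged sketch, assuming scikit-learn and invented findings and outcome: logistic regression yields a formula of coefficients, while a recursively partitioned tree yields readable if/then rules.

```python
# Sketch only: the findings and the "disease q" outcome are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
findings = ["finding_x", "finding_y", "finding_z"]
X = rng.integers(0, 2, size=(200, 3))                 # dichotomous findings
y = ((X[:, 0] | X[:, 1]) & X[:, 2]).astype(int)       # illustrative outcome

formula = LogisticRegression().fit(X, y)
print({f: round(float(c), 2) for f, c in zip(findings, formula.coef_[0])})  # coefficients

rule_tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(export_text(rule_tree, feature_names=findings))  # prints the if/then splits
```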

Advantages and disadvantages

Compared to other multivariable methods, recursive partitioning has advantages and disadvantages.
Examples are available of the use of recursive partitioning in research on diagnostic tests. Goldman used recursive partitioning to prioritize sensitivity in the diagnosis of myocardial infarction among patients with chest pain in the emergency room.
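Goldman's actual rule is not reproduced here; the following is only a sketch, assuming scikit-learn and synthetic data, of one generic way a tree can be tuned to prioritize sensitivity: weighting the positive class so that the tree is penalized far more for missing a true case than for raising a false alarm.

```python
# Sketch only: synthetic imbalanced data, not Goldman's rule or data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import recall_score

X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=1)

plain = DecisionTreeClassifier(random_state=1).fit(X_train, y_train)
sensitive = DecisionTreeClassifier(class_weight={0: 1, 1: 20}, random_state=1).fit(X_train, y_train)

print("sensitivity, unweighted tree:", recall_score(y_test, plain.predict(X_test)))
print("sensitivity, weighted tree:  ", recall_score(y_test, sensitive.predict(X_test)))
```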