There are numerous applications where the computational requirements of classifying a test instance are as important as the performance of the classifier itself. Object detection in images and web page ranking are well-known examples. A more recent application domain with similar requirements is trigger design in high energy physics.
In this talk we describe an algorithm that builds sparse decision DAGs (directed acyclic graphs) out of a list of base classifiers provided by an external learning method such as AdaBoost. The basic idea is to cast the DAG design task as a Markov decision process. For each instance, a learned policy decides whether to evaluate or to skip each base classifier, or to quit and classify the instance, based on the current state of the partially built classifier. The result is a sparse decision DAG in which the base classifiers are selected in a data-dependent way. The algorithm is competitive with state-of-the-art cascade detectors on three object-detection benchmarks, and it clearly outperforms them in the regime where only a small number of base classifiers can be evaluated. Unlike cascades, it is also readily applicable to multi-class classification.
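The per-instance evaluation loop can be sketched as follows. This is a minimal illustration, not the authors' implementation: the names `classify` and `policy`, the three-action encoding, and the use of the running weighted vote as the state are all assumptions chosen for clarity; the base classifiers and their weights are assumed to come from AdaBoost.

```python
from typing import Callable, List, Tuple

# Hypothetical action encoding for the MDP policy.
SKIP, EVAL, QUIT = 0, 1, 2

def classify(x: float,
             classifiers: List[Tuple[float, Callable[[float], int]]],
             policy: Callable[[int, float], int]) -> int:
    """Evaluate one instance against a list of (alpha_j, h_j) pairs,
    letting the policy decide, at each step, whether to evaluate h_j,
    skip it, or quit early and classify with the current score."""
    s = 0.0  # running weighted vote: the "state" the policy observes
    for j, (alpha, h) in enumerate(classifiers):
        action = policy(j, s)
        if action == QUIT:
            break               # stop early: classify from the current score
        if action == EVAL:
            s += alpha * h(x)   # pay the cost of evaluating h_j
        # SKIP: move to the next base classifier without evaluating h_j
    return 1 if s >= 0.0 else -1

# Toy usage: three identical weighted stumps and two extreme policies.
stumps = [(0.5, lambda v: 1 if v > 0 else -1)] * 3
always_eval = lambda j, s: EVAL
quit_at_once = lambda j, s: QUIT
```

Different policies trade accuracy for evaluation cost: `always_eval` recovers the full AdaBoost vote, while a policy that quits or skips early yields the sparse, data-dependent paths described above.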
Besides outperforming classical cascade designs on benchmark data sets, the algorithm also produces interesting deep sparse structures in which similar inputs follow the same path in the DAG, and subpaths of increasing length represent features of increasing complexity.