
SIMINOLE 04 Discriminative learning

building 208, room 100 (LAL, Orsay)

Balazs Kegl (LAL)
Description
ANR SIMINOLE project meeting
Participants
  • Ahmed Lasmar
  • Alexandre Chotard
  • Balazs Kegl
  • Cécile Germain
  • Djalel Benbouzid
  • Matthias Brendel
  • Nikolaus Hansen
  • Olivier Cappé
  • Robert Busa-Fekete
  • Roman Poeschl
  • Rémi Bardenet
  • Verena Heidrich-Meisner
    • 1
      Welcome and news
      Speaker: Mr Balazs Kegl (LAL)
      Slides
    • 2
      Introduction
      We will introduce the research activities of the AppStat team that are not related to the Auger project. Most of the subjects will be treated in detail in the following talks.
      1) Discriminative learning
      2) Motivations from physics projects: A) trigger design, B) signal/background separation, C) pattern recognition in particle detectors
      3) Short introduction to AdaBoost (a minimal code sketch follows this entry)
      4) Extensions and applications: A) bandit boosting, B) ranking, C) structured classification, D) classification-based policy iteration, E) the software package multiboost.org
      5) Hyperparameter optimization
      6) Sequential design of discriminant functions
      Speaker: Mr Balazs Kegl (LAL)
      Slides
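      As a concrete companion to the "Short introduction to AdaBoost" item above, here is a minimal sketch of discrete AdaBoost with decision stumps. The toy data and the exhaustive stump learner are illustrative assumptions; this is a sketch, not the multiboost.org implementation.

      import numpy as np

      def best_stump(X, y, w):
          """Exhaustively pick the stump (feature, threshold, sign) with lowest weighted error."""
          best = (np.inf, None)
          for j in range(X.shape[1]):
              for thr in np.unique(X[:, j]):
                  for sign in (1, -1):
                      pred = sign * np.where(X[:, j] <= thr, 1, -1)
                      err = np.sum(w[pred != y])
                      if err < best[0]:
                          best = (err, (j, thr, sign))
          return best

      def adaboost(X, y, T=20):
          w = np.full(X.shape[0], 1.0 / X.shape[0])    # instance weights
          model = []                                   # list of (alpha, stump)
          for _ in range(T):
              err, (j, thr, sign) = best_stump(X, y, w)
              err = max(err, 1e-12)                    # guard against a perfect stump
              alpha = 0.5 * np.log((1 - err) / err)    # base classifier coefficient
              pred = sign * np.where(X[:, j] <= thr, 1, -1)
              w *= np.exp(-alpha * y * pred)           # re-weight: boost misclassified points
              w /= w.sum()
              model.append((alpha, (j, thr, sign)))
          return model

      def predict(model, X):
          f = np.zeros(X.shape[0])
          for alpha, (j, thr, sign) in model:
              f += alpha * sign * np.where(X[:, j] <= thr, 1, -1)
          return np.sign(f)

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 2))
      y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)       # a made-up toy problem, labels in {-1, +1}
      print("training accuracy:", np.mean(predict(adaboost(X, y), X) == y))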
    • 3
      Pattern recognition in new generation highly granular calorimeters
      Calorimeters at a future linear electron-positron collider are conceived to achieve, in combination with other sub-detectors, a jet energy resolution of 3-4% for jet energies between 45 and 250 GeV. Such a resolution would be two times better than the jet energy resolution achieved at LEP. To meet this goal, the calorimeters feature a very high lateral and longitudinal granularity. This granularity allows for obtaining unprecedentedly detailed images of hadronic showers, which is on the one hand indispensable for achieving the experimental goals at a future lepton collider, and on the other hand of great interest for many other domains in science. A major goal of future studies is to optimise the particle separation in the detector by exploiting this new quality of information.
      Speaker: Mr Roman Poeschl (LAL Orsay)
      Slides
    • 12:00
      Lunch Break
    • 4
      Bandit boosting
      In this talk we show how to use multi-armed bandits (MABs) to improve the computational complexity of AdaBoost. AdaBoost constructs a strong classifier in a stepwise fashion by selecting simple base classifiers and using their weighted "vote" to determine the final classification. We model this stepwise base classifier selection as a sequential decision problem, and optimize it with MABs where each arm represents a subset of the base classifier set. The MAB gradually learns the "usefulness" of the subsets, and selects one of the subsets in each iteration. AdaBoost then searches only this subset instead of optimizing the base classifier over the whole space. The main improvement of this work over a previous approach is that we use an adversarial bandit algorithm instead of a stochastic one. This choice allows us to prove a weak-to-strong-learning theorem, which means that the proposed technique remains a boosting algorithm in a formal sense. We demonstrate on benchmark data sets that our technique can achieve a generalization performance similar to standard AdaBoost at a computational cost that is an order of magnitude smaller. (A schematic code sketch of the subset-selection loop follows this entry.)
      Speaker: Dr Robert Busa-Fekete (LAL)
      Slides
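      A schematic sketch of the subset-selection loop described above, assuming Exp3 as the adversarial bandit, one feature per arm (so each arm indexes the subset of stumps on that feature), and the selected stump's edge as the reward. These choices are illustrative stand-ins, not necessarily those of the paper.

      import numpy as np

      def stump_on_feature(x, y, w):
          """Best decision stump restricted to one feature column; returns (error, threshold, sign)."""
          best = (np.inf, 0.0, 1)
          for thr in np.unique(x):
              for sign in (1, -1):
                  err = np.sum(w[sign * np.where(x <= thr, 1, -1) != y])
                  if err < best[0]:
                      best = (err, thr, sign)
          return best

      def exp3_boost(X, y, T=50, gamma=0.2):
          n, d = X.shape
          w = np.full(n, 1.0 / n)     # AdaBoost instance weights
          g = np.zeros(d)             # Exp3 cumulative importance-weighted rewards (log weights)
          rng = np.random.default_rng(0)
          model = []
          for _ in range(T):
              q = np.exp(g - g.max())
              p = (1 - gamma) * q / q.sum() + gamma / d   # Exp3: mix with uniform exploration
              arm = rng.choice(d, p=p)                    # pick a subset (here: a feature)
              err, thr, sign = stump_on_feature(X[:, arm], y, w)
              err = min(max(err, 1e-12), 0.5)
              alpha = 0.5 * np.log((1 - err) / err)
              pred = sign * np.where(X[:, arm] <= thr, 1, -1)
              w *= np.exp(-alpha * y * pred)              # usual AdaBoost re-weighting
              w /= w.sum()
              reward = 1 - 2 * err                        # the stump's edge, in [0, 1]
              g[arm] += gamma * reward / (d * p[arm])     # Exp3 importance-weighted update
              model.append((alpha, arm, thr, sign))
          return model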
    • 5
      Hyperparameter optimization
      Hyperparameter optimization is an important yet often overlooked step in machine learning. Many practitioners tune hyperparameters manually or with very simple, computationally inefficient techniques, such as exhaustive exploration of a predefined parameter grid. In this talk we summarize our work on hyperparameter optimization using Bayesian global optimization techniques. (A minimal code sketch follows this entry.)
      Speakers: Matthias Brendel (CNRS LAL), Mr Rémi Bardenet (LAL, University Paris-Sud XI, IN2P3)
      Slides
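      A minimal sketch of one standard flavor of Bayesian global optimization: a Gaussian-process surrogate combined with the expected-improvement criterion, here built on scikit-learn. The one-dimensional objective and search grid are made-up placeholders standing in for a real validation error over real hyperparameters; this is not the speakers' implementation.

      import numpy as np
      from scipy.stats import norm
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import Matern

      def objective(log_lr):
          """Stand-in validation error as a function of one hyperparameter."""
          return (log_lr + 2.0) ** 2 + 0.1 * np.sin(5 * log_lr)

      rng = np.random.default_rng(0)
      X = rng.uniform(-5, 0, size=(3, 1))            # a few random initial evaluations
      y = np.array([objective(x[0]) for x in X])
      grid = np.linspace(-5, 0, 500).reshape(-1, 1)  # candidate hyperparameter values

      for _ in range(15):
          gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
          mu, sigma = gp.predict(grid, return_std=True)
          z = (y.min() - mu) / np.maximum(sigma, 1e-9)
          ei = (y.min() - mu) * norm.cdf(z) + sigma * norm.pdf(z)  # expected improvement
          x_next = grid[np.argmax(ei)]               # evaluate the most promising point next
          X = np.vstack([X, x_next])
          y = np.append(y, objective(x_next[0]))

      print("best hyperparameter:", X[np.argmin(y)][0], "best error:", y.min())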
    • 6
      Coffee break
    • 7
      Sequential design of discriminant functions
      There are numerous applications where the computational requirements of classifying a test instance are as important as the performance of the classifier itself. Object detection in images and web page ranking are well-known examples. A more recent application domain with similar requirements is trigger design in high energy physics. In this talk we describe an algorithm that builds sparse decision DAGs (directed acyclic graphs) out of a list of base classifiers provided by an external learning method such as AdaBoost. The basic idea is to cast the DAG design task as a Markov decision process. For each instance, the algorithm decides whether to evaluate or to skip each base classifier, or to quit the process and classify the instance; the decision is based on the current state of the classifier being built. The result is a sparse decision DAG where the base classifiers are selected in a data-dependent way. The algorithm is competitive with state-of-the-art cascade detectors on three object-detection benchmarks, and it clearly outperforms them in the regime where only a small number of base classifiers can be evaluated. Unlike cascades, it is also readily applicable to multi-class classification. Besides outperforming classical cascade designs on benchmark data sets, the algorithm also produces interesting deep sparse structures where similar inputs follow the same path in the DAG, and subpaths of increasing length represent features of increasing complexity. (A schematic sketch of the sequential evaluation loop follows this entry.)
      Speaker: Mr Djalel Benbouzid (LAL, Université Paris-Sud 11, IN2P3)
      Slides
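      A schematic sketch of the sequential evaluation loop: given the base classifiers and coefficients produced by AdaBoost, a policy decides per instance whether to EVALuate the next base classifier, SKIP it, or QUIT and classify from the partial score. The hand-coded margin-threshold policy and the toy base classifiers below are illustrative stand-ins for the learned MDP policy and AdaBoost's actual output.

      import numpy as np

      EVAL, SKIP, QUIT = 0, 1, 2

      def threshold_policy(state):
          """state = (partial score, index of next base classifier); quit once confident."""
          score, t = state
          if abs(score) > 1.5:                  # illustrative confidence threshold
              return QUIT
          return EVAL if t % 2 == 0 else SKIP   # illustrative (data-independent) sparsity pattern

      def classify(x, base_classifiers, alphas, policy):
          """Sequentially evaluate base classifiers under the policy; return (label, cost)."""
          score, cost = 0.0, 0
          for t, (h, alpha) in enumerate(zip(base_classifiers, alphas)):
              action = policy((score, t))
              if action == QUIT:
                  break
              if action == EVAL:
                  score += alpha * h(x)         # h(x) is a vote in {-1, +1}
                  cost += 1                     # count evaluated base classifiers
          return (1 if score >= 0 else -1), cost

      # Toy usage: stumps on random projections, standing in for AdaBoost's output.
      rng = np.random.default_rng(0)
      dirs = rng.normal(size=(20, 2))
      base_classifiers = [lambda x, d=d: 1 if x @ d > 0 else -1 for d in dirs]
      alphas = np.full(20, 0.3)
      label, cost = classify(np.array([1.0, -0.5]), base_classifiers, alphas, threshold_policy)
      print("label:", label, "base classifiers evaluated:", cost)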
    • 8
      Free discussion
      Preparation of the T+18 review.