How much do we understand the statistical tools used by Tevatron and LHC experiments?
The Large Hadron Collider is in its second year of successful operation and is rapidly approaching one of its major physics goals: the discovery or exclusion of the Higgs boson and the related unravelling of the mechanism of electroweak symmetry breaking.
To meet the community's expectations, the experimental collaborations are using increasingly sophisticated statistical tools, developed over the last few years, to extract the most from the LHC data.
location: HIT E41.1
time: 11:00-13:00
Advanced multivariate techniques, such as Boosted Decision Trees, are now ubiquitous in experimental analyses, competing favourably with traditional cut-based methods.
The lectures are meant as an introduction to statistical tools and machine-learning methods, from the definition of Bayesian probability to likelihood maximization, artificial neural networks (ANNs), and BDTs, as they are used by the HEP community in the context of hadron-collider analyses.
Week 1 (30 Jan - 3 Feb): Statistical Methods in Particle Physics
Week 2 (6 Feb - 10 Feb): Multivariate Algorithms