Conveners
Data processing in HEP
- Jiří Franc (Department of Mathematics, FNSPE Czech Technical University in Prague)
- Václav Kůs (Department of Mathematics, FNSPE Czech Technical University in Prague)
- Jaroslav Bielčík (CTU FNSPE)
The reliability of separation methods based on supervised learning (learning with a teacher) is primarily studied by verifying that the achieved results do not depend on the particular parts of the data sets used. For this purpose, both the data exploited when setting the separator's parameters and independent test data are used. For example, the former are frequently used in so...
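The independence check described above is commonly implemented as a cross-validation scheme. The sketch below (the helper name `kfold_indices` is illustrative, not from the talk) shows k-fold splitting: each fold serves once as the independent test set while the remaining folds tune the separator.

```python
import numpy as np

def kfold_indices(n_samples, k, seed=0):
    """Split sample indices into k disjoint folds for cross-validation.

    Each fold serves once as the independent test set while the
    remaining folds are used to set the separator's parameters.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    folds = np.array_split(idx, k)
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, test

# Example: 10 events split into 5 folds -> each test fold holds 2 events,
# and train + test together always cover the full sample.
splits = list(kfold_indices(10, 5))
assert all(len(test) == 2 for _, test in splits)
assert all(set(tr) | set(te) == set(range(10)) for tr, te in splits)
```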
When processing HEP data it is often necessary to deal with the high dimensionality of the dataset. Dimensionality reduction techniques offer a principled way of reducing the number of variables while preserving as much structure in the data as possible. This presentation will discuss the results of implementing a feature extraction method into the structure of a binary SDDT...
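As a sketch of one standard feature-extraction approach (PCA via SVD — the specific method used in the talk is not named here), the following reduces an event sample to its leading principal components:

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project data onto its leading principal components.

    X: (n_events, n_features) array.
    Returns an (n_events, n_components) array that preserves the
    directions of largest variance in the sample.
    """
    Xc = X - X.mean(axis=0)  # centre each feature
    # SVD of the centred data: rows of Vt are the principal directions
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# Example: 100 events with 8 correlated features, reduced to 2 variables.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 8))  # rank-2 structure
Z = pca_reduce(X, 2)
assert Z.shape == (100, 2)
```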
Multivariate Analysis (MVA) techniques using machine learning (ML) algorithms play an important role in many High Energy Physics (HEP) data analyses. For decades, the development of ML in HEP has lived its own life, but this is changing. In the last couple of years, the HEP community has discovered very powerful tools and methods from industry and adapted them to its unique and interesting...
In particle physics, homogeneity tests can be used to verify whether the distribution of measured data corresponds to the distribution of a simulated Monte Carlo sample. However, Monte Carlo generators produce weighted samples, because weights are used to modify the sample in order to take into account various attributes of the detector. Generalized homogeneity tests, such as the Kolmogorov–Smirnov test,...
Homogeneity tests play a significant role in the analysis of experimental data in high energy physics. These tests verify whether measured data samples and Monte Carlo simulated samples consistent with the Standard Model come from the same distribution. Because every simulated event comes with a corresponding weight, we propose modifications of known homogeneity tests, namely...
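One natural way such a modification can look — a sketch only, not necessarily the estimator proposed in the talk — is to replace the empirical CDFs in the two-sample Kolmogorov–Smirnov statistic with weighted empirical CDFs:

```python
import numpy as np

def weighted_ks_statistic(x, wx, y, wy):
    """Two-sample Kolmogorov-Smirnov statistic with per-event weights.

    Builds weighted empirical CDFs for both samples and returns the
    supremum of their absolute difference over all observed points.
    """
    grid = np.concatenate([x, y])

    def wecdf(sample, w, t):
        # weighted fraction of events with value <= t
        order = np.argsort(sample)
        s, wo = np.asarray(sample)[order], np.asarray(w, float)[order]
        cum = np.cumsum(wo) / wo.sum()
        idx = np.searchsorted(s, t, side="right")
        return np.where(idx > 0, cum[np.clip(idx - 1, 0, None)], 0.0)

    return np.max(np.abs(wecdf(x, wx, grid) - wecdf(y, wy, grid)))

# With unit weights this reduces to the ordinary two-sample KS statistic.
x = np.array([0.1, 0.4, 0.7])
y = np.array([0.2, 0.5, 0.9])
d = weighted_ks_statistic(x, np.ones(3), y, np.ones(3))  # d = 1/3 here
```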
With the rise of modern computing capabilities and new approaches in deep learning, we are able to design convolutional neural networks suitable for particle identification at the NOvA experiment at Fermilab. Utilizing deep learning techniques leads to a significant increase in signal classification efficiency. We provide an overview of the experiment setup, raw data measurements and...
Data analysis in high energy physics includes solving difficult classification tasks; hence deep learning approaches such as deep neural networks and convolutional neural networks (CNNs) are often used. The core problems of particle identification share many similarities with those faced in computer vision. We describe the benefits of CNNs in the area of image recognition tasks...
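The computer-vision analogy rests on the convolution operation at the heart of a CNN layer: a small kernel slid over a detector "image" to produce a feature map. A minimal numpy sketch of that core operation (not the networks used in the analyses above):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation, the core operation of a CNN layer.

    image: (H, W) array, kernel: (kh, kw) array.
    Returns an (H - kh + 1, W - kw + 1) feature map.
    """
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # dot product of the kernel with the local image patch
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A vertical-edge kernel applied to a tiny synthetic "detector image":
img = np.zeros((4, 4))
img[:, 2:] = 1.0                  # right half lit
edge = np.array([[1.0, -1.0]])    # responds where intensity jumps
fmap = conv2d(img, edge)          # edge shows up as -1 at column 1
```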
In this presentation, I would like to introduce the TMVA optimization of KF Particle Finder, the framework we now use for the reconstruction of short-lived particles at the STAR experiment. I will briefly describe the STAR detector and the measured variables needed for particle reconstruction. The basics of the KF Particle Finder will also be introduced. At the end of the...
ROOT is a framework for large-scale data analysis that provides basic and advanced statistical methods used by high-energy physics experiments. It includes machine learning algorithms from the ROOT-integrated Toolkit for Multivariate Analysis (TMVA). The TMVA package is becoming widely used for data reconstruction at the STAR experiment at Brookhaven National Laboratory.
Especially,...