
Generative Models for Fast Cluster Simulations in the TPC for the ALICE Experiment

  • Kamil Deja
  • Tomasz Trzciński
  • Łukasz Graczykowski
  • for the ALICE Collaboration
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 945)

Abstract

Simulating the detector response is a key component of every high-energy physics experiment. The methods currently used for this purpose provide high-fidelity results. However, this precision comes at the price of a high computational cost, which makes those methods infeasible for other applications, e.g. data quality assurance. In this work, we present a proof-of-concept solution for generating the possible responses of detector clusters to particle collisions, using the real-life example of the Time Projection Chamber (TPC) in the ALICE experiment at CERN. We introduce this solution as a first step towards a semi-real-time anomaly detection tool. Its essential component is a generative model that can simulate synthetic data points bearing high similarity to the real data. Leveraging recent advances in machine learning, we propose to use state-of-the-art generative models, namely Variational Autoencoders (VAE) and Generative Adversarial Networks (GAN), which have proven their usefulness and efficiency in computer vision and image processing. The main advantage offered by these methods is a significant speedup in execution time, reaching up to a factor of \(10^3\) with respect to GEANT3, the currently used cluster simulation tool. Nevertheless, this computational speedup comes at the price of lower simulation quality. In this work we show the quantitative and qualitative limitations of currently available generative models. We also propose several further steps that will improve the accuracy of the models and lead to the deployment of an anomaly detection mechanism based on generative models in the production environment of the TPC detector.
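To make the idea concrete, the following is a minimal sketch of a Variational Autoencoder of the kind discussed above, written with Keras/TensorFlow (the libraries cited in [24, 25]) and trained with Adam [22]. The cluster dimensionality, layer widths, latent size and training settings are illustrative assumptions for a flattened TPC cluster record, not the configuration used in the paper, and the training data here are random stand-ins.

import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

CLUSTER_DIM = 64   # hypothetical size of a flattened TPC cluster record
LATENT_DIM = 8     # hypothetical latent-space dimensionality

# Encoder: cluster vector -> mean and log-variance of a diagonal Gaussian.
enc_in = keras.Input(shape=(CLUSTER_DIM,))
h = layers.Dense(128, activation="relu")(enc_in)
z_mean = layers.Dense(LATENT_DIM)(h)
z_log_var = layers.Dense(LATENT_DIM)(h)
encoder = keras.Model(enc_in, [z_mean, z_log_var])

# Decoder: latent vector -> synthetic cluster vector in [0, 1].
dec_in = keras.Input(shape=(LATENT_DIM,))
g = layers.Dense(128, activation="relu")(dec_in)
dec_out = layers.Dense(CLUSTER_DIM, activation="sigmoid")(g)
decoder = keras.Model(dec_in, dec_out)

optimizer = keras.optimizers.Adam(1e-3)

@tf.function
def train_step(x):
    with tf.GradientTape() as tape:
        mu, log_var = encoder(x, training=True)
        # Reparameterisation trick: z = mu + sigma * eps, eps ~ N(0, I).
        eps = tf.random.normal(shape=tf.shape(mu))
        z = mu + tf.exp(0.5 * log_var) * eps
        x_rec = decoder(z, training=True)
        # Loss = reconstruction error + KL divergence to the unit Gaussian prior.
        rec_loss = tf.reduce_mean(tf.reduce_sum(tf.square(x - x_rec), axis=-1))
        kl_loss = -0.5 * tf.reduce_mean(
            tf.reduce_sum(1.0 + log_var - tf.square(mu) - tf.exp(log_var), axis=-1))
        loss = rec_loss + kl_loss
    variables = encoder.trainable_variables + decoder.trainable_variables
    grads = tape.gradient(loss, variables)
    optimizer.apply_gradients(zip(grads, variables))
    return loss

# Stand-in training data; in practice these would be normalised TPC clusters.
x_train = np.random.rand(1024, CLUSTER_DIM).astype("float32")
dataset = tf.data.Dataset.from_tensor_slices(x_train).batch(64)
for epoch in range(2):
    for batch in dataset:
        train_step(batch)

# Fast simulation step: sample the prior and decode into synthetic clusters.
synthetic_clusters = decoder(tf.random.normal(shape=(10, LATENT_DIM))).numpy()

The GAN counterpart replaces the encoder and the KL term with a discriminator trained adversarially against the generator [6]. In both cases, producing a synthetic cluster reduces to sampling a latent vector and running a single forward pass through the decoder/generator, which is where the speedup over full GEANT3 simulation comes from.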

Notes

Acknowledgements

The authors acknowledge the support from the Polish National Science Centre grant no. UMO-2016/21/D/ST6/01946. The GPUs used in this work were funded by the grant of the Dean of the Faculty of Electronics and Information Technology at Warsaw University of Technology (project II/2017/GD/1).

A preliminary version of this paper was presented at the 3rd Conference on Information Technology, Systems Research and Computational Physics, 2–5 July 2018, Cracow, Poland [31].

References

  1. Evans, L., Bryant, P.: LHC machine. JINST 3, S08001 (2008)
  2. Brun, R., Bruyant, F., Carminati, F., Giani, S., Maire, M., McPherson, A., Patrick, G., Urban, L.: GEANT detector description and simulation tool. CERN-W5013 (1994)
  3. Agostinelli, S., et al.: GEANT4: a simulation toolkit. Nucl. Instrum. Meth. A506, 250–303 (2003)
  4. Ferrari, A., Sala, P.R., Fasso, A., Ranft, J.: FLUKA: a multi-particle transport code. CERN-2005-010, SLAC-R-773, INFN-TC-05-11 (2005)
  5. Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. CoRR, vol. abs/1312.6114 (2013)
  6. Goodfellow, I., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., Courville, A., Bengio, Y.: Generative adversarial nets. In: NIPS (2014)
  7. Dellacasa, G., et al.: ALICE: Technical Design Report of the Time Projection Chamber. CERN-OPEN-2000-183, CERN-LHCC-2000-001 (2000)
  8. Abelev, B., et al.: Upgrade of the ALICE experiment: letter of intent. J. Phys. G41, 087001 (2014)
  9. Aamodt, K., et al.: The ALICE experiment at the CERN LHC. JINST 3, S08002 (2008)
  10. Radford, A., Metz, L., Chintala, S.: Unsupervised representation learning with deep convolutional generative adversarial networks. CoRR, vol. abs/1511.06434 (2015)
  11. Hochreiter, S., Schmidhuber, J.: Long short-term memory. Neural Comput. 9(8), 1735–1780 (1997)
  12. Karras, T., Aila, T., Laine, S., Lehtinen, J.: Progressive growing of GANs for improved quality, stability, and variation. CoRR, vol. abs/1710.10196 (2017)
  13. Reed, S., Akata, Z., Yan, X., Logeswaran, L., Schiele, B., Lee, H.: Generative adversarial text to image synthesis. CoRR, vol. abs/1605.05396 (2016)
  14. Yan, X., Yang, J., Sohn, K., Lee, H.: Attribute2Image: conditional image generation from visual attributes. In: ECCV (2016)
  15. Kadurin, A., Aliper, A., Kazennov, A., Mamoshina, P., Vanhaelen, Q., Khrabrov, K., Zhavoronkov, A.: The cornucopia of meaningful leads: applying deep adversarial autoencoders for new molecule development in oncology. Oncotarget 8(7), 10883 (2017)
  16. Paganini, M., de Oliveira, L., Nachman, B.: CaloGAN: simulating 3D high energy particle showers in multi-layer electromagnetic calorimeters with generative adversarial networks. CoRR, vol. abs/1705.02355 (2017)
  17. de Oliveira, L., Paganini, M., Nachman, B.: Learning particle physics by example: location-aware generative adversarial networks for physics synthesis. Comput. Softw. Big Sci. 1, 4 (2017)
  18. Hinton, G.E., Salakhutdinov, R.R.: Reducing the dimensionality of data with neural networks. Science 313(5786), 504–507 (2006)
  19. Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., Salakhutdinov, R.: Dropout: a simple way to prevent neural networks from overfitting. JMLR 15(1), 1929–1958 (2014)
  20. Glorot, X., Bordes, A., Bengio, Y.: Deep sparse rectifier neural networks. In: International Conference on Artificial Intelligence and Statistics, pp. 315–323 (2011)
  21. Ioffe, S., Szegedy, C.: Batch normalization: accelerating deep network training by reducing internal covariate shift. In: ICML (2015)
  22. Kingma, D.P., Ba, J.: Adam: a method for stochastic optimization. CoRR, vol. abs/1412.6980 (2014)
  23. Glorot, X., Bengio, Y.: Understanding the difficulty of training deep feedforward neural networks. In: International Conference on Artificial Intelligence and Statistics, pp. 249–256 (2010)
  24. Chollet, F., et al.: Keras (2015). https://github.com/fchollet/keras
  25. Abadi, M., et al.: TensorFlow: a system for large-scale machine learning. In: 12th USENIX Symposium on Operating Systems Design and Implementation (OSDI 2016), pp. 265–283 (2016)
  26. Lucic, M., Kurach, K., Michalski, M., Gelly, S., Bousquet, O.: Are GANs created equal? A large-scale study. CoRR, vol. abs/1711.10337 (2017)
  27. Sjostrand, T., Mrenna, S., Skands, P.Z.: PYTHIA 6.4 physics and manual. JHEP 05, 026 (2006)
  28. Skands, P.Z.: Tuning Monte Carlo generators: the Perugia tunes. Phys. Rev. D 82, 074018 (2010)
  29. Suaide, A.A.D.P., Prado, C.A.G., Alt, T., Aphecetche, L., Agrawal, N., Avasthi, A., Bach, M., Bala, R., Barnafoldi, G.G., Bhasin, A., et al.: O2: a novel combined online and offline computing system for the ALICE experiment after 2018. J. Phys. Conf. Ser. 513, 012037 (2014)
  30. Schlegl, T., Seeböck, P., Waldstein, S.M., Schmidt-Erfurth, U., Langs, G.: Unsupervised anomaly detection with generative adversarial networks to guide marker discovery. In: International Conference on Information Processing in Medical Imaging, pp. 146–157. Springer (2017)
  31. Deja, K., Trzciński, T., Graczykowski, Ł.: Generative models for fast cluster simulations in the TPC for the ALICE experiment. In: Kulczycki, P., Kowalski, P.A., Łukasik, S. (eds.) Contemporary Computational Science, p. 2. AGH-UST Press, Cracow (2018)

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Kamil Deja (1)
  • Tomasz Trzciński (1)
  • Łukasz Graczykowski (2)
  • for the ALICE Collaboration
  1. Institute of Computer Science, Warsaw University of Technology, Warsaw, Poland
  2. Faculty of Physics, Warsaw University of Technology, Warsaw, Poland
