- Theoretical and Experimental Particle Physics Studies
- High-Energy Particle Collisions Research
- Quantum Chromodynamics and Particle Interactions
- Particle Detector Development and Performance
- Distributed and Parallel Computing Systems
- Particle Accelerators and Free-Electron Lasers
- Dark Matter and Cosmic Phenomena
- Quantum Computing Algorithms and Architecture
- Quantum Information and Cryptography
- Scientific Computing and Data Management
- Neutrino Physics Research
- Advanced Data Storage Technologies
- Quantum Optics and Atomic Interactions
- Computability, Logic, AI Algorithms
- Real-Time Systems Scheduling
- Cloud Computing and Resource Management
- Advanced Data Processing Techniques
- Laser-Matter Interactions and Applications
- Nuclear Physics Research
- Quantum and electron transport phenomena
- Spectroscopy and Quantum Chemical Studies
- Atomic and Molecular Physics
- Terahertz technology and applications
- Cold Atom Physics and Bose-Einstein Condensates
- Superconducting Materials and Applications
Skolkovo Institute of Science and Technology
2018-2024
École Polytechnique Fédérale de Lausanne
2012-2019
University of Tennessee at Knoxville
2018
European Organization for Nuclear Research
1986-1992
National Academies of Sciences, Engineering, and Medicine
1992
Kobe University
1992
The University of Tokyo
1991-1992
Université de Montréal
1991
Rutherford Appleton Laboratory
1991
Massachusetts Institute of Technology
1991
Abstract The Petaflops supercomputer “Zhores”, recently launched in the “Center for Computational and Data-Intensive Science and Engineering” (CDISE) of the Skolkovo Institute of Science and Technology (Skoltech), opens up new exciting opportunities for scientific discoveries at the institute, especially in the areas of data-driven modeling, machine learning and artificial intelligence. This supercomputer utilizes the latest generation of Intel and NVidia processors to provide resources for the most compute-intensive tasks of Skoltech scientists working in digital pharma,...
Over the past four years, the Big Data and Exascale Computing (BDEC) project organized a series of five international workshops that aimed to explore ways in which the new forms of data-centric discovery introduced by the ongoing revolution in high-end data analysis (HDA) might be integrated with the established, simulation-centric paradigm of the high-performance computing (HPC) community. Based on those meetings, we argue that the rapid proliferation of digital data generators, the unprecedented growth in the volume and diversity of the data they generate,...
The quantum approximate optimization algorithm (QAOA) has become a cornerstone of contemporary quantum applications development. Here we show that the \emph{density} of problem constraints versus problem variables acts as a performance indicator. Density is found to correlate strongly with approximation inefficiency for fixed-depth QAOA applied to random graph minimization problem instances. Furthermore, the depth required for accurate solution of such instances scales critically with density. Motivated by Google's recent experimental realization of QAOA,...
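To make the setup concrete, the following is a minimal, self-contained depth-1 QAOA simulation in plain NumPy, assuming a MaxCut cost on an Erdős–Rényi-style random graph; the graph size, edge probability, angle grid, and seed are illustrative choices, not taken from the paper:

```python
import itertools
import numpy as np

rng = np.random.default_rng(7)

# Random graph instance: n vertices, each edge kept with probability p_edge.
n, p_edge = 8, 0.5
edges = [(i, j) for i, j in itertools.combinations(range(n), 2)
         if rng.random() < p_edge]

# Diagonal MaxCut cost C(z) for every computational-basis bitstring z.
dim = 2 ** n
bits = (np.arange(dim)[:, None] >> np.arange(n)) & 1        # (dim, n) bit table
cost = sum((bits[:, i] != bits[:, j]).astype(float) for i, j in edges)

def apply_rx(psi, qubit, beta):
    """Apply the mixer rotation exp(-i*beta*X) to one qubit of the statevector."""
    rx = np.array([[np.cos(beta), -1j * np.sin(beta)],
                   [-1j * np.sin(beta), np.cos(beta)]])
    psi = psi.reshape((2,) * n)
    psi = np.moveaxis(np.tensordot(rx, psi, axes=([1], [qubit])), 0, qubit)
    return psi.reshape(dim)

def qaoa_p1_expectation(gamma, beta):
    psi = np.full(dim, 1 / np.sqrt(dim), dtype=complex)     # |+>^n start state
    psi = np.exp(-1j * gamma * cost) * psi                  # phase separator
    for q in range(n):                                      # transverse mixer
        psi = apply_rx(psi, q, beta)
    return float(np.real(np.vdot(psi, cost * psi)))

# Coarse grid search over the two depth-1 angles.
best = max(qaoa_p1_expectation(g, b)
           for g in np.linspace(0, np.pi, 30)
           for b in np.linspace(0, np.pi / 2, 30))
print(f"density m/n = {len(edges) / n:.2f}, "
      f"approx. ratio ~ {best / cost.max():.3f}")
```

Sweeping `p_edge` in this sketch varies the edge-to-vertex density of the instance, which is the knob the abstract links to approximation quality at fixed depth.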
Variational quantum algorithms are the centerpiece of modern quantum programming. These algorithms involve training parameterized quantum circuits using a classical co-processor, an approach adapted partly from classical machine learning. An important subclass of these algorithms, designed for combinatorial optimization on current hardware, is the quantum approximate optimization algorithm (QAOA). It is known that problem density, the constraint-to-variable ratio, induces under-parametrization in fixed-depth QAOA. Density-dependent performance has been reported...
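Read concretely, the under-parametrization claim is a parameter count: a depth-$p$ QAOA circuit exposes only $2p$ trainable angles regardless of instance size, so at fixed density the number of constraints per parameter grows without bound. A schematic statement (not the paper's formal one):

$$ m = \rho\, n, \qquad \frac{m}{\#\,\text{parameters}} = \frac{\rho\, n}{2p} \;\longrightarrow\; \infty \quad (n \to \infty,\ p\ \text{fixed}), $$

where $n$ is the number of variables, $m$ the number of constraints, and $\rho$ the problem density.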
We analyze the potential of the CERN Large Hadron Collider to study anomalous quartic vector-boson interactions through the production of vector-boson pairs accompanied by jets. In the framework of $\mathrm{SU}(2)_L \otimes \mathrm{U}(1)_Y$ chiral Lagrangians, we examine all effective operators of order $p^{4}$ that lead to new four-gauge-boson interactions but do not alter trilinear vertices. In our analyses, we perform the full tree-level calculation of the processes leading to two jets plus vector-boson pairs,...
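For orientation, the two genuinely quartic $\mathcal{O}(p^4)$ operators usually considered in this setting, which modify four-gauge-boson vertices while leaving trilinear couplings untouched, can be written in the standard electroweak chiral Lagrangian normalization (the paper's operator basis may differ in conventions):

$$ \mathcal{L}_4 = \alpha_4 \,\mathrm{Tr}\!\left(V_\mu V_\nu\right)\mathrm{Tr}\!\left(V^\mu V^\nu\right), \qquad \mathcal{L}_5 = \alpha_5 \,\mathrm{Tr}\!\left(V_\mu V^\mu\right)\mathrm{Tr}\!\left(V_\nu V^\nu\right), \qquad V_\mu \equiv \left(D_\mu \Sigma\right)\Sigma^\dagger . $$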
Abstract The LHC@Home BOINC project has provided computing capacity for numerical simulations to researchers at CERN since 2004, and since 2011 it has been expanded with a wider range of applications. The traditional accelerator physics simulation code SixTrack enjoys continuing support from volunteers, and thanks to virtualisation a number of applications from the LHC experiment collaborations and particle theory groups have joined the consolidated project. This paper addresses the challenges related to running virtualized applications in the BOINC environment, and how...
For the past couple of years, a team at CERN and partners from the Citizen Cyberscience Centre (CCC) have been working on a project that enables general physics simulation programs to run in a virtual machine on volunteer PCs around the world. The project uses the Berkeley Open Infrastructure for Network Computing (BOINC) framework. Based on the CERNVM virtual machine and the job management framework Co-Pilot, this service was made available for public beta-testing in August 2011, with Monte Carlo simulations of LHC physics, under the name "LHC@home 2.0" as a BOINC project:...
Abstract Gibbs sampling is fundamental to a wide range of computer algorithms. Such algorithms are set to be replaced by physics-based processors, be it quantum or stochastic annealing devices, which embed problem instances and evolve a physical system into a low-energy ensemble in order to recover a probability distribution. At a critical constraint-to-variable ratio, satisfiability (SAT) instances exhibit a SAT-UNSAT transition (frustrated versus frustration-free). Algorithms require increasing computational resources from this point...
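A minimal sketch of the kind of sampler the abstract has in mind: single-variable heat-bath (Gibbs) updates targeting the Boltzmann distribution over assignments of a random 3-SAT instance, with energy defined as the number of violated clauses. The instance size, clause-to-variable ratio, inverse temperature, and sweep count are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random 3-SAT instance with clause-to-variable ratio alpha.
n, alpha = 30, 4.0          # the SAT-UNSAT transition sits near alpha ~ 4.27
m = int(alpha * n)
clauses = [(rng.choice(n, size=3, replace=False),
            rng.choice([1, -1], size=3))        # -1 marks a negated literal
           for _ in range(m)]

def energy(x):
    """Energy = number of violated clauses for assignment x in {0,1}^n."""
    e = 0
    for idx, sign in clauses:
        lits = np.where(sign == 1, x[idx], 1 - x[idx])
        e += int(not lits.any())                # clause violated: no literal true
    return e

def gibbs_sweep(x, beta):
    """Heat-bath sweep: resample each bit from its exact conditional
    under the Boltzmann distribution exp(-beta * energy)."""
    for i in range(n):
        x[i] = 0
        e0 = energy(x)
        x[i] = 1
        e1 = energy(x)
        x[i] = int(rng.random() < 1.0 / (1.0 + np.exp(beta * (e1 - e0))))
    return x

x = rng.integers(0, 2, size=n)
for _ in range(200):
    x = gibbs_sweep(x, beta=2.0)
print("violated clauses after 200 sweeps:", energy(x))
```

Raising `alpha` past the transition is exactly the regime where such samplers, and their physical-annealer replacements, need rapidly growing resources.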
SixTrack is a single-particle tracking code for high-energy circular accelerators routinely used at CERN for Large Hadron Collider (LHC), luminosity-upgrade LHC (HL-LHC), Future Circular Collider (FCC) and Super Proton Synchrotron (SPS) simulations. The code is based on a 6D symplectic tracking engine, which is optimized for long-term tracking simulations and delivers fully reproducible results on several platforms. It also includes multiple scattering engines for beam–matter interaction studies, as well as facilities to run integrated simulations with...
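To illustrate what symplectic single-particle tracking means in miniature, here is a toy 2D (horizontal plane only) drift-kick tracker through thin-lens FODO cells; each map preserves phase-space area, so the motion stays bounded over many turns. This is a didactic sketch, not SixTrack's actual 6D engine, and all lattice parameters are made up:

```python
import numpy as np

def drift(x, xp, length):
    """Field-free drift: position advances, angle x' is unchanged."""
    return x + length * xp, xp

def thin_quad(x, xp, kl):
    """Thin-lens quadrupole kick of integrated strength kl = k1*L."""
    return x, xp - kl * x

def track_fodo(x, xp, kl=0.5, l_drift=1.0, n_cells=1000):
    """Track one particle through n_cells FODO cells (QF, drift, QD, drift)."""
    xs = []
    for _ in range(n_cells):
        x, xp = thin_quad(x, xp, +kl)       # focusing quad
        x, xp = drift(x, xp, l_drift)
        x, xp = thin_quad(x, xp, -kl)       # defocusing quad
        x, xp = drift(x, xp, l_drift)
        xs.append(x)
    return np.array(xs)

orbit = track_fodo(x=1e-3, xp=0.0)
print("max |x| over 1000 cells:", np.abs(orbit).max())
```

Because every sub-map here is exactly symplectic, the oscillation amplitude does not drift numerically; that property is what makes long-term tracking results reproducible and physically meaningful.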
Abstract SixTrack Version 5 is a major release that introduces new features, improved integration of the existing ones, and extensive code restructuring. New features include dynamic-memory management, scattering-routine integration, a new initial-condition module, and reviewed post-processing methods. Existing features like on-line aperture checking and Fluka-coupling are now enabled by default. Extensive performance and regression tests have been developed and deployed as part of the new release generation. The tracking...
The construction of quantum circuits to simulate Hamiltonian evolution is central to many quantum algorithms. State-of-the-art circuits are based on oracles whose implementation is often omitted, and the complexity of the algorithm is estimated by counting oracle queries. However, in practical applications, an oracle implementation contributes a large constant factor to the overall algorithm complexity. The key finding of this work is an efficient procedure for the representation of a tridiagonal matrix in the Pauli basis, which allows one to construct a Hamiltonian evolution circuit without the use of oracles. The procedure represents...
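The paper's contribution is an efficient, structure-exploiting construction; the NumPy sketch below shows only the naive baseline it improves upon: computing every Pauli coefficient by a trace inner product, $c_P = \mathrm{Tr}(P H)/2^n$, which enumerates all $4^n$ strings and is practical only for tiny matrices. The test matrix (a discrete 1D Laplacian) is an illustrative choice:

```python
import itertools
import numpy as np

# Single-qubit Pauli matrices.
PAULIS = {
    "I": np.eye(2, dtype=complex),
    "X": np.array([[0, 1], [1, 0]], dtype=complex),
    "Y": np.array([[0, -1j], [1j, 0]], dtype=complex),
    "Z": np.array([[1, 0], [0, -1]], dtype=complex),
}

def pauli_decompose(H):
    """Expand a 2^n x 2^n Hermitian matrix as sum_P c_P * P,
    with c_P = Tr(P H) / 2^n.  Brute force over all 4^n strings."""
    n = int(np.log2(H.shape[0]))
    terms = {}
    for label in itertools.product("IXYZ", repeat=n):
        P = PAULIS[label[0]]
        for ch in label[1:]:
            P = np.kron(P, PAULIS[ch])
        c = np.trace(P @ H) / 2 ** n
        if abs(c) > 1e-12:
            terms["".join(label)] = complex(c)
    return terms

# Tridiagonal test matrix: discrete 1D Laplacian on 4 points (n = 2 qubits).
H = 2 * np.eye(4) - np.eye(4, k=1) - np.eye(4, k=-1)
for label, c in pauli_decompose(H).items():
    print(f"{c.real:+.3f} * {label}")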
We explore the utilization of higher-order discretization techniques in optimizing the gate count needed for quantum-computer-based solutions of partial differential equations. To accomplish this, we present an efficient approach for decomposing $d$-band diagonal matrices into Pauli strings that are grouped into mutually commuting sets. Using numerical simulations of the one-dimensional wave equation, we show that these methods can reduce the number of qubits necessary for discretization, similarly to the classical case, although they do not...
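The grouping step can be sketched with a common simplification: greedy packing under qubit-wise commutation, a sufficient (not necessary) condition for two Pauli strings to commute. The paper's grouping may use full commutation or exploit the band structure directly; the example strings here are illustrative:

```python
def qubitwise_commute(p, q):
    """True if two Pauli strings commute qubit-wise:
    on every qubit the letters are equal or one of them is I."""
    return all(a == b or a == "I" or b == "I" for a, b in zip(p, q))

def greedy_commuting_groups(strings):
    """Greedily pack Pauli strings into mutually qubit-wise-commuting sets."""
    groups = []
    for s in strings:
        for g in groups:
            if all(qubitwise_commute(s, t) for t in g):
                g.append(s)
                break
        else:
            groups.append([s])      # no compatible group: open a new one
    return groups

# Strings typical of a band-diagonal (here: tridiagonal) operator on 3 qubits.
strings = ["IIZ", "IZI", "ZII", "IIX", "IXX", "XXX", "IYY", "YYX"]
for i, g in enumerate(greedy_commuting_groups(strings)):
    print(f"group {i}: {g}")
```

All strings within one group can be measured (or exponentiated) simultaneously, which is precisely where the gate-count savings come from.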
A new numerical approach for the treatment of quantum dynamics and the calculation of average values of quantum operators and time correlation functions in the Wigner representation of quantum statistical mechanics has been developed. The time correlation functions are presented in the form of an integral of the Weyl symbol of the considered operators and the Fourier transform of the product of matrix elements of the dynamic propagators. For the latter, an integral Wigner–Liouville-type equation has been derived. The initial condition for this equation is obtained as the Wigner transform of the Wiener path-integral representation of the propagator matrix elements at the initial time. A procedure for solving this equation combining both...
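For orientation, the standard one-dimensional textbook forms of the objects the abstract refers to are (the paper's notation and normalization may differ):

$$ A_W(p,q) = \int d\xi\; e^{-i p \xi/\hbar} \left\langle q + \tfrac{\xi}{2} \right| \hat{A} \left| q - \tfrac{\xi}{2} \right\rangle, \qquad C_{AB}(t) = \frac{1}{2\pi\hbar} \int dp\, dq\; A_W(p,q)\, B_W(p,q;t), $$

where $B_W(p,q;t)$ evolves under a Wigner–Liouville-type equation whose $\hbar \to 0$ limit is the classical Liouville equation $\partial_t B_W + \frac{p}{m}\,\partial_q B_W - V'(q)\,\partial_p B_W = 0$.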
We describe the implementation of monitoring for the IT systems at the core of an autonomous driving vehicle. The role of the monitoring is to assist in the decision to start the drive cycle and in the continuous assessment of fitness to drive. The requirements on a system with increased resiliency and data replication make it sufficiently different from standard monitoring to warrant a uniquely tuned implementation. The system combines OS events with real-time measurements of sensor data. The information is stored in flat files for emergency access as well as in a Time Series Data Base (TSDB).
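A toy sketch of the described dual-path storage pattern: every reading is appended to a flat JSON-lines file (readable even if everything else is down) and buffered for a time-series database. All names, metrics, and thresholds here are hypothetical, and the TSDB client itself is deliberately omitted:

```python
import json
import time
from pathlib import Path

FLAT_LOG = Path("monitor_events.jsonl")   # hypothetical file name
TSDB_BUFFER = []                          # stand-in for a real TSDB client queue

def record(measurement, tags, value, ts=None):
    """Persist one reading twice: flat file for emergency access,
    plus a buffer destined for a time-series database."""
    point = {
        "measurement": measurement,
        "tags": tags,
        "value": value,
        "ts": ts if ts is not None else time.time_ns(),
    }
    # 1) Flat file: append-only, survives a TSDB outage.
    with FLAT_LOG.open("a") as f:
        f.write(json.dumps(point) + "\n")
    # 2) TSDB path: a real system would flush batches to its database here.
    TSDB_BUFFER.append(point)

def fit_to_drive(latest, limits):
    """Toy start-of-drive check: every monitored value within its limit."""
    return all(latest[k] <= limits[k] for k in limits)

record("cpu_temp_c", {"node": "compute-0"}, 61.5)
record("sensor_lag_ms", {"sensor": "lidar-front"}, 12.0)
print(fit_to_drive({"cpu_temp_c": 61.5, "sensor_lag_ms": 12.0},
                   {"cpu_temp_c": 85.0, "sensor_lag_ms": 50.0}))
```

The flat file is the degraded-mode path; the TSDB side exists for the continuous fitness-to-drive assessment, where query speed over recent history matters.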