- Theoretical and Experimental Particle Physics
- High-Energy Particle Collisions Research
- Quantum Chromodynamics and Particle Interactions
- Particle Detector Development and Performance
- Dark Matter and Cosmic Phenomena
- Cosmology and Gravitation Theories
- Computational Physics and Python Applications
- Crystallization and Solubility Studies
- X-ray Diffraction in Crystallography
- Neutrino Physics Research
- Astrophysics and Cosmic Phenomena
- Distributed and Parallel Computing Systems
- Black Holes and Theoretical Physics
- Medical Imaging Techniques and Applications
- Particle Accelerators and Free-Electron Lasers
- Superconducting Materials and Applications
- Advanced Data Storage Technologies
- Scientific Computing and Data Management
- Nuclear Reactor Physics and Engineering
- Gamma-Ray Bursts and Supernovae
- Radiation Detection and Scintillator Technologies
- Radiation Therapy and Dosimetry
- Gas Dynamics and Kinetic Theory
- Cold Atom Physics and Bose-Einstein Condensates
- International Science and Diplomacy
International Institute of Information Technology
2024
Institute of High Energy Physics
2013-2023
University of California, San Diego
2014-2023
A. Alikhanyan National Laboratory
2023
University of Wisconsin–Madison
2008-2020
University of Iowa
2014-2015
University of California, Riverside
2012-2015
UC San Diego Health System
2014
University of Illinois Chicago
2014
University of California, Santa Barbara
2014
This document proposes a collection of simplified models relevant to the design of new-physics searches at the LHC and the characterization of their results. Both ATLAS and CMS have already presented some results in terms of simplified models, and we encourage them to continue and expand this effort, which supplements both signature-based results and benchmark model interpretations. A simplified model is defined by an effective Lagrangian describing the interactions of a small number of new particles. Simplified models can equally well be described in terms of a small number of masses and cross-sections. These...
The discovery by the ATLAS and CMS experiments of a new boson with a mass around 125 GeV and measured properties compatible with those of a Standard-Model Higgs boson, coupled with the absence of discoveries of phenomena beyond the Standard Model at the TeV scale, has triggered interest in ideas for future Higgs factories. A circular e+e- collider hosted in an 80 to 100 km tunnel, TLEP, is among the most attractive solutions proposed so far. It offers a clean experimental environment, produces high luminosity for top-quark, W and Z boson studies, and accommodates multiple...
This Report summarizes the proceedings of the 2015 Les Houches workshop on Physics at TeV Colliders. Session 1 dealt with (I) new developments relevant for high-precision Standard Model calculations, (II) the PDF4LHC parton distributions, (III) issues in the theoretical description of the production of Higgs bosons and how to relate them to experimental measurements, (IV) a host of phenomenological studies essential for comparing LHC data from Run I with theoretical predictions and projections for future measurements in Run II, and (V) Monte Carlo event generators.
Results are presented for a variety of SUSY Simplified Models at the 14 TeV LHC, as well as at a 33 TeV and a 100 TeV proton collider. Our focus is on models whose signals are driven by colored production. We present projections of the upper limit and discovery reach in the gluino-neutralino (for both light and heavy flavor decays), squark-neutralino, and gluino-squark Simplified Model planes. Depending on the model, a jets + $E_T^{\mathrm{miss}}$, mono-jet, or same-sign di-lepton search is applied. The impact of pileup is explored. This study utilizes the Snowmass...
We interpret within the phenomenological MSSM (pMSSM) the results of SUSY searches published by the CMS collaboration, based on the first ~1 fb^-1 of data taken during the 2011 LHC run at 7 TeV. The pMSSM is a 19-dimensional parametrization of the MSSM that captures most of its phenomenological features. It encompasses, and goes beyond, a broad range of more constrained models. Performing a global Bayesian analysis, we obtain posterior probability densities of parameters, masses and derived observables. In contrast to constraints derived for particular breaking...
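The global Bayesian analysis mentioned above combines a likelihood with a prior to yield posterior densities over the parameter space. A toy one-dimensional sketch of that procedure (illustrative only; the actual study scans 19 pMSSM parameters with full experimental likelihoods, and the Gaussian pseudo-measurement below is an invented stand-in):

```python
import numpy as np

def grid_posterior(grid, log_likelihood, log_prior):
    """Normalized posterior density on a 1D grid: posterior ~ likelihood * prior."""
    logp = log_likelihood(grid) + log_prior(grid)
    p = np.exp(logp - logp.max())                 # subtract max for numerical stability
    return p / (p.sum() * (grid[1] - grid[0]))    # normalize to unit area

grid = np.linspace(0.0, 1000.0, 2001)             # e.g. a hypothetical mass parameter in GeV
loglike = lambda m: -0.5 * ((m - 400.0) / 50.0) ** 2   # toy Gaussian pseudo-measurement
logprior = lambda m: np.zeros_like(m)                  # flat prior on the scanned range
post = grid_posterior(grid, loglike, logprior)
print(grid[np.argmax(post)])                      # posterior mode, here 400.0
```

Derived observables are then summarized by marginalizing or locating modes of such densities, rather than by quoting a single best-fit point.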
Given the increasingly stringent bounds on Supersymmetry (SUSY) from LHC searches, we are motivated to explore the situation in which the only accessible SUSY states are the electroweakinos (charginos and neutralinos). In a minimal framework, we systematically study three general scenarios classified by the relative size of the gaugino mass parameters M_1, M_2 and the Higgsino mass parameter \mu, with six distinctive cases, four of which would naturally result in a compressed spectrum of nearly degenerate LSPs. We present the relevant decay...
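The case counting above can be made concrete: the six orderings of M_1, M_2 and \mu determine the LSP nature, and a Wino- or Higgsino-like LSP comes with nearly degenerate chargino/neutralino partners. A minimal sketch, assuming tree-level intuition only (the function name and simplification are ours, not the paper's):

```python
from itertools import permutations

def classify_lsp(m1, m2, mu):
    """Return (lsp_nature, compressed) from the smallest of |M_1|, |M_2|, |mu|.

    Tree-level rule of thumb: the smallest parameter sets the LSP nature;
    Wino- and Higgsino-like LSPs imply a compressed (nearly degenerate) spectrum.
    """
    params = {"Bino": abs(m1), "Wino": abs(m2), "Higgsino": abs(mu)}
    nature = min(params, key=params.get)
    return nature, nature in ("Wino", "Higgsino")

# Four of the six orderings give a compressed spectrum, as stated above.
cases = [classify_lsp(*p) for p in permutations((100.0, 300.0, 500.0))]
print(sum(compressed for _, compressed in cases))   # → 4
```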
This report summarizes the work of the Energy Frontier New Physics working group of the 2013 Community Summer Study (Snowmass).
We make a systematic and comparative study of the LHC and ILC electroweakino searches in the Minimal Supersymmetric Standard Model. We adopt a general bottom-up approach and scan over the parameter regions for all three cases of the lightest supersymmetric particle being Bino-, Wino-, or Higgsino-like. The signal from pair production and subsequent decay to the Wh (h \to b\bar{b}) final state may yield a sensitivity of 95% C.L. exclusion (5 sigma discovery) up to a mass scale M_2, \mu ~ 250-400 GeV (200-250 GeV) at 14 TeV with an integrated luminosity of 300...
During the first run, CMS collected and processed more than 10 billion data events and simulated 15 billion events. Up to 100,000 processor cores were used simultaneously and 100 PB of storage was managed. Each month petabytes of data were moved and hundreds of users accessed data samples. In this document we discuss the operational experience from the first run. We present the workflows and data flows that were executed, the tools and services developed, and the operations and shift models used to sustain the system. Many of the techniques followed the original computing planning, but some were reactions to difficulties...
With the evolution of various grid federations, Condor glide-ins represent a key feature in providing a homogeneous pool of resources using late-binding technology. The CMS collaboration uses a glide-in based Workload Management System, glideinWMS, for production (ProdAgent) and distributed analysis (CRAB) of data. Glide-in daemons traverse to the worker nodes, submitted via Condor-G. Once activated, they preserve the Master-Worker relationship, first validating the execution environment on the node before pulling jobs...
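The late-binding pattern described above, where a pilot lands on a worker node, validates its environment, and only then pulls real work from a central queue, can be sketched as follows (hypothetical names and a trivial environment check; this illustrates the pattern only and is not the glideinWMS implementation):

```python
import os
import queue
import sys

def validate_environment():
    # Stand-in check; real pilots verify disk space, OS release,
    # experiment software availability, and so on.
    return os.path.exists(sys.executable)

def pilot(job_queue):
    """Pilot process: validate the node, then pull jobs until the queue is empty."""
    if not validate_environment():
        return []                          # node rejected: no user job is ever bound to it
    done = []
    while True:
        try:
            job = job_queue.get_nowait()   # late binding: job chosen only at run time
        except queue.Empty:
            return done
        done.append(job())

jobs = queue.Queue()
for n in (1, 2, 3):
    jobs.put(lambda n=n: n * n)
print(pilot(jobs))                         # → [1, 4, 9]
```

The payoff of the pattern is that jobs are matched to resources only after a node has proven itself usable, which shields users from misconfigured sites.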
During normal data taking CMS expects to support potentially as many as 2000 analysis users. Since the beginning of 2008 there have been more than 800 individuals who submitted a remote analysis job to the CMS computing infrastructure. The bulk of these users will be supported at over 40 Tier-2 centres. Supporting a globally distributed community of users on a globally distributed set of clusters is a task that requires reconsidering the methods of user support for Analysis Operations. CMS formed an Analysis Support Task Force in preparation for large-scale physics activities. The charge...
After many years of preparation the CMS computing system has reached a situation where stability in operations limits the possibility to introduce innovative features. Nevertheless, it is this same need for stability and smooth operations that requires the introduction of features that were considered not strategic in previous phases. Examples are: adequate authorization controls to prioritize access to storage resources; improved monitoring to investigate problems and identify bottlenecks in the infrastructure; increased automation to reduce the manpower needed for...
We present a summary of results for SUSY Simplified Model searches at future proton colliders: the 14 TeV LHC, as well as a 33 TeV collider and a 100 TeV collider. Upper limits and discovery significances are provided for the gluino-neutralino (for both light and heavy flavor decays), squark-neutralino, and gluino-squark planes. Events are processed with the Snowmass combined detector, and Standard Model backgrounds are computed using the Snowmass samples. We place an emphasis on comparisons between the different collider scenarios, along with lessons learned regarding the impact...
Significant funds are expended in order to make CMS data analysis possible across Tier-2 and Tier-3 resources worldwide. Here we review how CMS monitors operational success in using those resources, identifies and understands problems and trends, provides feedback to site operators and software developers, and generally accumulates quantitative information on all aspects of analysis. This includes data transfers and distribution, the use of software releases for analysis, job failures, and more.
In the framework of the semi-hard ($k_t$-factorization) approach, we analyze various charm production processes in the kinematic region covered by the HERA experiments.