- Scientific Computing and Data Management
- Research Data Management Practices
- Advanced Data Storage Technologies
- Distributed and Parallel Computing Systems
- Data Quality and Management
- Cell Image Analysis Techniques
- Airway Management and Intubation Techniques
- Tracheal and airway disorders
- Cloud Computing and Resource Management
- Congenital Diaphragmatic Hernia Studies
- Semantic Web and Ontologies
- Injury Epidemiology and Prevention
- Seismic Performance and Analysis
- Human Mobility and Location-Based Analysis
- Structural Health Monitoring Techniques
- Seismology and Earthquake Studies
- Social Robot Interaction and HRI
- Spectroscopy and Quantum Chemical Studies
- Health disparities and outcomes
- Infrastructure Resilience and Vulnerability Analysis
- Hepatitis C virus research
- Medical Imaging Techniques and Applications
- Hydrology and Drought Analysis
- Traffic Prediction and Management Techniques
- Public Relations and Crisis Communication
- Oak Ridge National Laboratory (2020-2025)
- Newcastle University (2011-2023)
- Naval Research Laboratory Information Technology Division (2023)
- Oak Ridge Leadership Computing Facility (2022)
- University College London Hospitals NHS Foundation Trust (2020)
- University College London (2020)
- Leicester Royal Infirmary (2020)
- Université de Bordeaux (2019)
- Rutgers, The State University of New Jersey (2019)
- The University of Texas at Arlington (2018)
The 2015 Gorkha Nepal earthquake caused tremendous damage and loss. To gain valuable lessons from this tragic event, an investigation team was dispatched to Nepal from 1 to 7 May 2015. A unique aspect of the investigation is that first-hand data were obtained 6 to 11 days after the mainshock. To develop a deeper understanding of the damage observed in Nepal, this paper reviews the seismotectonic setting and regional seismicity and analyzes the available aftershock and ground motion data. The observations indicate that the majority of damaged buildings were stone/brick masonry structures with no...
Recent trends within the computational and data sciences show an increasing recognition and adoption of workflows as tools for productivity and reproducibility that also democratize access to platforms and processing know-how. As digital objects to be shared, discovered, and reused, workflows benefit from the FAIR principles, which stand for Findable, Accessible, Interoperable, and Reusable. The Workflows Community Initiative's FAIR Workflows Working Group (WCI-FW), a global and open community of researchers and developers working with workflows across disciplines...
Short-duration, high-intensity rainfall causes significant disruption to transport operations, and climate change is projected to increase the frequency and intensity of these events. Disruption costs from flooding are currently calculated using crude approaches. To support improved business cases for adapting urban infrastructure to climate change, this paper presents an integrated framework that couples flood simulations with transport models to calculate the impacts of disruption. A function, constructed from a range of observational and experimental data...
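To make the idea of a flood-to-disruption function concrete, here is a minimal Python sketch of a depth-speed curve and the delay cost it implies. The 300 mm impassable depth, the quadratic decay, the 48 km/h free-flow speed, and the value-of-time figure are all illustrative assumptions, not the function fitted in the paper.

```python
def safe_speed_kph(depth_mm: float, free_flow_kph: float = 48.0) -> float:
    """Illustrative depth-disruption function: safe vehicle speed as a
    function of standing-water depth. Coefficients are assumptions for
    demonstration, not the published fit."""
    if depth_mm <= 0:
        return free_flow_kph
    if depth_mm >= 300:          # assume vehicles cannot pass ~300 mm
        return 0.0
    # quadratic decay toward zero at the assumed impassable depth
    return free_flow_kph * (1.0 - (depth_mm / 300.0) ** 2)

def delay_cost(depth_mm: float, link_km: float, vehicles_per_hr: float,
               value_of_time: float = 15.0) -> float:
    """Extra travel time over a flooded link, priced at an assumed
    value of time (currency units per vehicle-hour)."""
    v = safe_speed_kph(depth_mm)
    if v == 0.0:
        return float("inf")      # link closed: would be handled by rerouting
    base_h = link_km / 48.0      # travel time at free-flow speed
    flood_h = link_km / v        # travel time at the reduced speed
    return (flood_h - base_h) * vehicles_per_hr * value_of_time
```

A transport model would evaluate this per link and timestep of the flood simulation, then aggregate the costs network-wide.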
Memorial days of disasters represent an opportunity to evaluate the progress of recovery. This article uses sentiment analysis (SA) to assess post-disaster recovery on the 10th anniversary of the L’Aquila earthquake using Twitter data. We analyzed 4349 tweets posted from 4 to 10 April 2019 with the hashtag #L’Aquila, which we obtained from a third-party vendor. The polarity is first defined by supervised classification based on experts’ rules about reconstruction and on Grammarly tones. Then, this is compared with the outcome of unsupervised...
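The supervised, expert-rule step can be sketched as a keyword-based polarity classifier. The keyword sets below are invented stand-ins for the experts' reconstruction rules, not the rules used in the study.

```python
# Hypothetical keyword sets standing in for expert-defined rules about
# reconstruction progress; the real study used domain experts' rules.
POSITIVE = {"rebuilt", "reopened", "recovery", "hope", "restored"}
NEGATIVE = {"rubble", "abandoned", "delay", "ruins", "forgotten"}

def polarity(tweet: str) -> str:
    """Classify a tweet's polarity by counting rule-keyword hits."""
    tokens = {t.strip("#.,!?").lower() for t in tweet.split()}
    score = len(tokens & POSITIVE) - len(tokens & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

An unsupervised baseline (e.g. a pretrained sentiment model) would then be run over the same tweets and its labels compared against these rule-based ones.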
Image bioinformatics infrastructure typically relies on a combination of server-side high-performance computing and client desktop applications tailored for graphic rendering. On the server side, matrix manipulation environments are often used as the back-end where deployment of specialized analytical workflows takes place. However, neither the server-side nor the client-side solution, by itself or combined with the other, is conducive to the emergence of open, collaborative, computational ecosystems for image analysis that are both...
In Chile, the Metropolitan Region of Santiago (RMS) is exposed to several natural and anthropogenic hazards. This means that not only is there a constant need for healthcare, but also a significant increase in that need whenever its inhabitants are affected by disasters. The RMS problem is not a lack of healthcare infrastructure; rather, it is the inequality in its spatial distribution, which does not consider the location of the most vulnerable population, who may have greater needs. In this paper, we performed Pearson's correlation and multicollinearity...
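The correlation step is the standard Pearson statistic; here is a minimal self-contained sketch. The per-commune vulnerability and distance figures are fabricated for illustration only.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson's correlation coefficient between two equal-length
    sequences; the same statistic used in the spatial analysis."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative (invented) data: per-commune vulnerability index vs.
# distance in km to the nearest hospital.
vulnerability = [0.2, 0.4, 0.5, 0.7, 0.9]
distance_km   = [1.0, 2.5, 3.0, 6.0, 8.5]
```

A multicollinearity check would additionally compute variance inflation factors across the candidate explanatory variables before any regression.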
COVIDTrach is a UK multicentre prospective cohort study project that aims to evaluate the outcomes of tracheostomy in patients with COVID-19 receiving mechanical ventilation and to record the incidence of SARS-CoV-2 infection among healthcare workers involved in the procedure. Data on patient demographics and clinical history were entered prospectively and updated over time via an online database (REDCap). Clinical variables were compared with outcomes, and logistic regression was used to develop a model for mortality. Participants...
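The logistic-regression step can be sketched with a toy gradient-descent fit. This is a didactic stand-in: a real cohort analysis would use a vetted statistics package, and the single feature here is a placeholder, not a COVIDTrach variable.

```python
import math

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Tiny logistic regression fit by stochastic gradient descent.
    X: list of feature rows, y: list of 0/1 outcomes (e.g. mortality).
    Returns learned weights and intercept."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for row, target in zip(X, y):
            z = b + sum(wi * xi for wi, xi in zip(w, row))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability
            err = p - target                  # gradient of log-loss
            b -= lr * err
            w = [wi - lr * err * xi for wi, xi in zip(w, row)]
    return w, b

def predict(w, b, row):
    """Predicted probability of the outcome for one feature row."""
    z = b + sum(wi * xi for wi, xi in zip(w, row))
    return 1.0 / (1.0 + math.exp(-z))
```

The fitted coefficients, exponentiated, give odds ratios per unit of each clinical variable, which is how such a mortality model is usually reported.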
COVID-19 has claimed more than 2.7 × 10⁶ lives and resulted in over 124 × 10⁶ infections. There is an urgent need to identify drugs that can inhibit SARS-CoV-2. We discuss innovations in computational infrastructure and methods that are accelerating and advancing drug design. Specifically, we describe several methods that integrate artificial intelligence and simulation-based approaches, and the design of the computational infrastructure needed to support these methods at scale. We discuss their implementation, characterize their performance, and highlight the science advances that these capabilities have enabled.
Experimental and observational instruments for scientific research (such as light sources, genome sequencers, accelerators, telescopes and electron microscopes) increasingly require High Performance Computing (HPC) scale capabilities for data analysis and workflow processing. Next-generation instruments are being deployed with higher resolutions and faster capture rates, creating a big data crunch that cannot be handled by modest institutional computing resources. Often these pipelines also have near real-time and resilience...
Ongoing advancements in cloud computing provide novel opportunities for scientific computing, especially for distributed workflows. Modern web browsers can now be used as high-performance workstations for querying, processing, and visualizing genomics' "Big Data" from sources like The Cancer Genome Atlas (TCGA) and the International Cancer Genome Consortium (ICGC) without local software installation or configuration. The design of QMachine (QM) was driven by the opportunity to use this pervasive computing model in the context of the Web of Linked Data...
The FAIR principles for scientific data (Findable, Accessible, Interoperable, Reusable) are also relevant to other digital objects such as research software and workflows that operate on data. The principles can be applied to the data being handled by a workflow as well as to the processes, software, and infrastructure which are necessary to specify and execute a workflow. They were designed as guidelines, rather than rules, that would allow for differences in the standards of different communities and in degrees of compliance. There are many practical considerations that impact...
The rising popularity of computational workflows is driven by the need for repetitive and scalable data processing, sharing of processing know-how, and transparent methods. As both combined records of analysis and descriptions of processing steps, workflows should be reproducible, reusable, adaptable, and available. Workflow sharing presents opportunities to reduce unnecessary reinvention, promote reuse, increase access to best-practice analyses for non-experts, and increase productivity. In reality, workflows are scattered and difficult to find, in part due to the diversity...
Modern large-scale scientific discovery requires multidisciplinary collaboration across diverse computing facilities, including High Performance Computing (HPC) machines and the Edge-to-Cloud continuum. Integrated data analysis plays a crucial role in such discovery, especially in the current AI era, by enabling Responsible AI development, FAIR, Reproducibility, and User Steering. However, the heterogeneous nature of science poses challenges such as dealing with multiple supporting tools, cross-facility...
The prevalence of scientific workflows with high computational demands calls for their execution on various distributed computing platforms, including large-scale leadership-class high-performance computing (HPC) clusters. To handle the deployment, monitoring, and optimization of workflow executions, many workflow management systems have been developed over the past decade. There is a need for benchmarks that can be used to evaluate the performance of current and future software stacks and hardware platforms. We present a generator of realistic...
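The core of such a benchmark generator is producing synthetic workflows as directed acyclic graphs of tasks. Below is a minimal sketch under stated assumptions: uniform random runtimes, an edge probability parameter, and a critical-path metric; the actual generator derives its task statistics from real workflow traces.

```python
import random

def random_workflow(n_tasks: int, p_edge: float = 0.3, seed: int = 0):
    """Generate a synthetic workflow as a random DAG. Nodes are tasks
    with an illustrative runtime; edges are data dependencies. Acyclicity
    is guaranteed by only adding edges from lower- to higher-numbered
    tasks (an implicit topological order)."""
    rng = random.Random(seed)
    tasks = {i: {"runtime_s": rng.uniform(1, 100)} for i in range(n_tasks)}
    edges = [(i, j)
             for i in range(n_tasks)
             for j in range(i + 1, n_tasks)
             if rng.random() < p_edge]
    return tasks, edges

def critical_path(tasks, edges):
    """Length of the longest path through the DAG: a lower bound on
    makespan given unlimited parallelism."""
    finish = {}
    for j in sorted(tasks):          # sorted order is topological here
        preds = [finish[i] for i, k in edges if k == j]
        finish[j] = (max(preds) if preds else 0.0) + tasks[j]["runtime_s"]
    return max(finish.values())
```

Generated DAGs like this can be replayed on different workflow systems and platforms, with the critical path serving as one baseline for comparing achieved makespans.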
COVID-19 has claimed more than 1 million lives and resulted in over 40 million infections. There is an urgent need to identify drugs that can inhibit SARS-CoV-2. In response, the DOE recently established the Medical Therapeutics project as part of the National Virtual Biotechnology Laboratory and tasked it with creating the computational infrastructure and methods necessary to advance therapeutics development. We discuss innovations that are accelerating and advancing drug design. Specifically, we describe several methods that integrate artificial...
The UK Earthquake Engineering Field Investigation Team (Eefit) was established as an independent society in 1982. Between 1984 and 2011, it carried out field missions to 29 earthquake zones, with reports on all of them freely available online. Over a hundred UK-based engineers have participated, split almost equally between industry and academia. There have been a number of significant benefits, including training through observations of the practical effects of ground shaking, and fostering strong links with practising...
Quantum computers offer an intriguing challenge in modern computer science. With the inevitable physical limitations to Moore's Law, quantum hardware provides avenues to solve grander problems faster by utilizing quantum mechanical properties at subatomic scales. These futuristic devices will likely never replace traditional HPC, but rather work alongside it to perform complex tasks, drawing on the best of decades of HPC and quantum computing research. We leverage the capabilities of scientific workflows to make the two work together. To demonstrate...
The FAIR Principles, originally introduced as guiding principles for scientific data management and stewardship, also apply abstractly to other digital objects such as research software and workflows. When introduced to the principles, most scientists can see that the concepts behind them (namely, making artifacts Findable, Accessible, Interoperable, and Reusable) will improve the quality of their artifacts. It is less common, however, to immediately recognize the ways in which incorporating FAIR methods into their work will enable them to tackle problems...