Richard McClatchey

ORCID: 0000-0002-0042-5960
Research Areas
  • Distributed and Parallel Computing Systems
  • Scientific Computing and Data Management
  • Semantic Web and Ontologies
  • Service-Oriented Architecture and Web Services
  • Business Process Modeling and Analysis
  • Particle physics theoretical and experimental studies
  • Advanced Database Systems and Queries
  • Quantum Chromodynamics and Particle Interactions
  • Advanced Data Storage Technologies
  • High-Energy Particle Collisions Research
  • Research Data Management Practices
  • Biomedical Text Mining and Ontologies
  • Parallel Computing and Optimization Techniques
  • Data Quality and Management
  • Medical Imaging Techniques and Applications
  • Cloud Computing and Resource Management
  • Advanced Software Engineering Methodologies
  • Particle Detector Development and Performance
  • Electronic Health Records Systems
  • Manufacturing Process and Optimization
  • Digital Radiography and Breast Imaging
  • AI in cancer detection
  • Distributed systems and fault tolerance
  • IoT and Edge/Fog Computing
  • Context-Aware Activity Recognition Systems

University of the West of England
2012-2021

European Organization for Nuclear Research
1989-2014

Austrian Academy of Sciences
2012

Institute of High Energy Physics
2012

Co-operative College
2009

National University of Sciences and Technology
2005

Bristol Clinical Commissioning Group
2004

Frenchay Hospital
2004

California Institute of Technology
2004

University West
2001

With the emergence of new methodologies and technologies it has now become possible to manage large amounts of environmental sensing data, to apply integrated computing models and to acquire information intelligence. This paper advocates the application of cloud capability to support the information, communication and decision-making needs of a wide variety of stakeholders in complex business and the management of urban and regional development. The complexity is evident in the socio-economic interactions and impacts embodied in the concept of the urban-ecosystem. It highlights...

10.1186/2192-113x-1-1 article EN cc-by Journal of Cloud Computing: Advances, Systems and Applications 2012-01-01

Abstract This paper investigates the utility of unsupervised machine learning and data visualisation for tracking changes in user activity over time. This is done through analysing unlabelled data generated from passive ambient smart home sensors, such as motion sensors, which are considered less intrusive than video cameras or wearables. The challenge in using such sensors for activity recognition is to find practical methods that can provide meaningful information to support timely interventions based on changing needs, without...

10.1007/s00521-020-04737-6 article EN cc-by Neural Computing and Applications 2020-01-25
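The idea of flagging routine change from unlabelled ambient-sensor data can be illustrated with a minimal sketch. This is not the paper's method; the feature (hourly motion-event counts), the drift measure (distance between mean daily profiles), and all data below are invented for illustration.

```python
from collections import Counter
from math import dist  # Euclidean distance, Python 3.8+

def daily_profile(events, bucket_hours=24):
    """Turn one day of motion events (hour, sensor_id) into an hourly
    activity-count vector -- a simple unlabelled feature."""
    counts = Counter(hour for hour, _ in events)
    return [counts.get(h, 0) for h in range(bucket_hours)]

def drift_score(baseline_days, recent_days):
    """Distance between the mean daily profile of a baseline window and
    a recent window; a large value hints at a changed routine."""
    def mean_profile(days):
        profiles = [daily_profile(d) for d in days]
        n = len(profiles)
        return [sum(col) / n for col in zip(*profiles)]
    return dist(mean_profile(baseline_days), mean_profile(recent_days))

# Hypothetical data: active mornings at baseline, active nights recently.
baseline = [[(7, "hall"), (8, "kitchen"), (8, "kitchen")] for _ in range(7)]
recent = [[(2, "hall"), (3, "kitchen"), (3, "hall")] for _ in range(7)]
print(drift_score(baseline, baseline[:3]))  # 0.0: same routine
print(drift_score(baseline, recent))        # larger: routine has shifted
```

A real system would use richer features and clustering, but the principle is the same: compare unlabelled behaviour summaries over time rather than classify labelled activities.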

Neuroscience is increasingly making use of statistical and mathematical tools to extract information from images of biological tissues. Computational neuroimaging applications require substantial computational resources, and the increasing availability of large image datasets will further enhance this need. Many efforts have been directed towards creating brain image repositories, including the recent US Alzheimer's Disease Neuroimaging Initiative. Multisite-distributed computing infrastructures have been launched with the goal of fostering...

10.2217/fnl.09.53 article EN Future Neurology 2009-11-12

10.1016/s0951-5240(98)00026-3 article EN Computer Integrated Manufacturing Systems 1998-10-01

E-learning can be loosely defined as a wide set of applications and processes which use available electronic media (and tools) to deliver vocational education and training. With its increasing recognition as a ubiquitous mode of instruction and interaction in the academic as well as the corporate world, the need for a scalable and realistic model is becoming important. In this paper, we introduce SELF, a semantic grid-based e-learning framework. SELF aims to identify the key enablers of a practical environment and minimize technological...

10.1109/ccgrid.2005.1558528 article EN 2005-01-01

Complex scientific workflows can process large amounts of data using thousands of tasks. The turnaround times of these workflows are often affected by various latencies, such as resource discovery, scheduling and data access, for the individual workflow processes or actors. Minimizing these latencies will improve the overall execution time of a workflow and thus lead to a more efficient and robust processing environment. In this paper, we propose a pilot job concept that has intelligent reuse strategies to minimize scheduling, queuing and access latencies. The results have...

10.1109/tns.2011.2146276 article EN IEEE Transactions on Nuclear Science 2011-05-27
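The latency-amortization argument behind pilot jobs can be sketched in a few lines. This is an illustrative model only, not the paper's implementation; the latency and runtime figures are invented for the example.

```python
import queue

SCHEDULING_LATENCY = 30.0  # assumed cost (s) to place one job on a resource
TASK_RUNTIME = 5.0         # assumed runtime (s) per workflow task

def naive_turnaround(n_tasks):
    # One grid job per task: every task pays the full scheduling latency.
    return n_tasks * (SCHEDULING_LATENCY + TASK_RUNTIME)

def pilot_turnaround(n_tasks):
    """A pilot job is scheduled once, then reuses the acquired slot by
    pulling tasks from a queue until it is empty -- no rescheduling."""
    tasks = queue.Queue()
    for t in range(n_tasks):
        tasks.put(t)
    elapsed = SCHEDULING_LATENCY          # paid once, for the pilot itself
    while not tasks.empty():
        tasks.get()                       # pull the next task directly
        elapsed += TASK_RUNTIME
    return elapsed

print(naive_turnaround(100))  # 3500.0 s
print(pilot_turnaround(100))  # 530.0 s -- latency amortized by reuse
```

Reuse turns an O(n) scheduling cost into a constant one, which is the core of the turnaround improvement the abstract describes.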

Scientific workflows have become the primary mechanism for conducting analyses on distributed computing infrastructures such as grids and clouds. In recent years, the focus of optimization within scientific workflows has primarily been on computational tasks and workflow makespan. However, as workflow-based analysis becomes ever more data intensive, data optimization is becoming a prime concern. Moreover, workflows can scale along several dimensions: (i) the number of tasks, (ii) the heterogeneity of resources, and (iii) the size and type (static versus streamed) of data...

10.1145/2451248.2451252 article EN ACM Transactions on Autonomous and Adaptive Systems 2013-04-01

Grid computing aims to provide an infrastructure for distributed problem solving in dynamic virtual organizations. It is gaining interest among many scientific disciplines as well as in the industrial community. However, current grid solutions still require highly trained programmers with expertise in networking, high-performance computing and operating systems. One of the big issues in the full-scale usage of a grid is matching the resource requirements of a job submission to the resources available on the grid. Resource brokers...

10.5170/cern-2005-002.750 article EN 2004-10-01
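The matchmaking step a resource broker performs can be sketched as a simple filter over advertised resources. This is a toy sketch, not the broker described in the paper; the resource attributes and site names are invented.

```python
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    cpus: int
    memory_gb: int
    os: str

def match_resources(requirements, resources):
    """Minimal broker: return the resources satisfying every stated
    requirement, so the user need not inspect the grid by hand."""
    return [
        r for r in resources
        if r.cpus >= requirements.get("cpus", 0)
        and r.memory_gb >= requirements.get("memory_gb", 0)
        and r.os == requirements.get("os", r.os)
    ]

sites = [
    Resource("site-a", cpus=8, memory_gb=16, os="linux"),
    Resource("site-b", cpus=2, memory_gb=4, os="linux"),
    Resource("site-c", cpus=16, memory_gb=64, os="windows"),
]
job = {"cpus": 4, "memory_gb": 8, "os": "linux"}
print([r.name for r in match_resources(job, sites)])  # ['site-a']
```

Real brokers add ranking (e.g. by load or locality) on top of this hard-requirement filter, but the filter is the part that hides networking and OS detail from the end user.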

Evidence-based medicine is critically dependent on three sources of information: a medical knowledge base, the patient's record and available resources, including, where appropriate, clinical protocols. Patient data are often scattered in a variety of databases and may, in a distributed model, be held across several disparate repositories. Consequently, addressing the needs of an evidence-based community presents issues of biomedical data integration, interpretation and management. This paper outlines how the Health-e-Child project...

10.1109/ideas.2007.45 article EN International Database Engineering and Applications Symposium 2007-09-06

Results from the research and development of a Data Intensive and Network Aware (DIANA) scheduling engine, to be used primarily for data intensive sciences such as physics analysis, are described. In Grid analyses, tasks can involve thousands of computing, data handling and network resources. The central problem in scheduling these resources is the coordinated management of computation and data at multiple locations, not just data replication or movement. However, this can prove to be a rather costly operation and an efficiency challenge if compute...

10.1109/tns.2006.886047 article EN IEEE Transactions on Nuclear Science 2006-12-01
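The trade-off DIANA weighs, moving data to computation versus computation to data, can be illustrated with a toy cost model. This is not the DIANA engine's actual cost function; the sites, bandwidths and queue times below are invented for the example.

```python
def site_cost(site, input_gb):
    """Combined cost (s): estimated queue wait plus the time to move the
    job's input data to the site (GB * 8 bits / Gbps). Sending the job to
    the data can beat bulk replication when bandwidth is the bottleneck."""
    transfer = 0.0 if site["has_data"] else input_gb * 8 / site["bandwidth_gbps"]
    return site["queue_wait_s"] + transfer

def choose_site(sites, input_gb):
    # A data-aware scheduler picks the site with the lowest combined cost.
    return min(sites, key=lambda s: site_cost(s, input_gb))

sites = [
    {"name": "cern",    "has_data": True,  "bandwidth_gbps": 10.0, "queue_wait_s": 600.0},
    {"name": "fast-t2", "has_data": False, "bandwidth_gbps": 1.0,  "queue_wait_s": 60.0},
]
# 500 GB of input: the busy site already holding the data still wins,
# while a small 10 GB job is cheaper to ship to the idle site.
print(choose_site(sites, 500.0)["name"])  # cern
print(choose_site(sites, 10.0)["name"])   # fast-t2
```

The point of the example is that neither "always replicate" nor "always co-locate" is optimal; the decision depends on data volume, network capacity and queue state together.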

The integration of heterogeneous biomedical information is one important step towards providing the level of personalization required in next-generation healthcare provision. In order to provide the computer-based decision support systems needed to access this integrated information, it will be necessary to handle the semantics (amongst other things) of medical protocols. The EC FP6 Health-e-Child project aims to develop an integrated platform for European paediatrics and tools for personalized health information. This paper introduces both the data...

10.1109/cbms.2008.90 article EN 2008-06-01