David Griffin

ORCID: 0000-0002-4077-0005
Research Areas
  • Real-Time Systems Scheduling
  • Embedded Systems Design Techniques
  • Neural Networks and Reservoir Computing
  • Advanced Memory and Neural Computing
  • Parallel Computing and Optimization Techniques
  • Neural Networks and Applications
  • Formal Methods in Verification
  • Petri Nets in System Modeling
  • Distributed systems and fault tolerance
  • Fault Detection and Control Systems
  • Software Reliability and Analysis Research
  • Water Quality Monitoring Technologies
  • DNA and Biological Computing
  • Ferroelectric and Negative Capacitance Devices
  • Opinion Dynamics and Social Influence
  • Artificial Intelligence in Games
  • Network Time Synchronization Technologies
  • Distributed and Parallel Computing Systems
  • Chemical synthesis and alkaloids
  • Blind Source Separation Techniques
  • Magnetic properties of thin films
  • Advanced Data Processing Techniques
  • Advanced Software Engineering Methodologies
  • Advanced Chemical Sensor Technologies
  • Hydrogels: synthesis, properties, applications

University of York
2014-2025

University of Virginia
2025

Neural networks have revolutionized the field of artificial intelligence and introduced transformative applications to almost every scientific field and industry. However, this success comes at a great price; the energy requirements for training advanced models are unsustainable. One promising way to address this pressing issue is by developing low-energy neuromorphic hardware that directly supports the algorithm's requirements. The intrinsic non-volatility, non-linearity, and memory of spintronic devices make them...

10.1063/5.0119040 article EN cc-by Applied Physics Letters 2023-01-23

This paper introduces the Dirichlet-Rescale (DRS) algorithm. The DRS algorithm provides an efficient general-purpose method of generating n-dimensional vectors of components (e.g. task utilizations), where the components sum to a specified total, each component conforms to individual constraints on the maximum and minimum values it can take, and the vectors are uniformly distributed over the valid region of the domain of all possible vectors, bounded by the constraints. The algorithm can be used to improve the nuance and quality of empirical studies into the effectiveness...

10.1109/rtss49844.2020.00018 article EN 2020-12-01
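
As a rough illustration of the problem DRS solves, the sketch below draws utilization vectors that sum to a fixed total while respecting per-component bounds. It uses naive rejection sampling over a flat Dirichlet distribution rather than DRS's rescaling step, so it reproduces the target distribution but not the algorithm's efficiency; the function name and parameters are illustrative.

```python
import numpy as np

def uniform_utilizations(n, total, u_min=0.0, u_max=1.0, rng=None, max_tries=100_000):
    """Draw n task utilizations summing to `total`, uniformly distributed
    over the region where each component lies in [u_min, u_max].

    Rejection-sampling stand-in for DRS: a flat Dirichlet gives points
    uniform on the scaled simplex, and points violating the per-component
    bounds are rejected. DRS achieves the same distribution efficiently
    via iterative rescaling rather than rejection.
    """
    rng = rng or np.random.default_rng()
    for _ in range(max_tries):
        u = rng.dirichlet(np.ones(n)) * total   # uniform on the simplex sum(u) == total
        if np.all(u >= u_min) and np.all(u <= u_max):
            return u
    raise RuntimeError("feasible region too small for naive rejection sampling")

# Example: 5 task utilizations summing to 2.0, each at most 0.8
print(uniform_utilizations(5, 2.0, u_max=0.8))
```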

Granular hydrogel scaffolds hold significant potential in regenerative medicine, functioning either as carriers for cell delivery or as interfaces for tissue integration. This article introduces two novel approaches for quantifying cell migration within and into granular hydrogels, highlighting the distinct applications of these scaffolds. First, a monolayer interface assay that simulates growth into hydrogels for integration purposes is presented. Second, a spheroid-based assay is described, designed to track cell movement within the matrix,...

10.3791/67627 article EN Journal of Visualized Experiments 2025-03-07

We describe “RingSim,” a phenomenological agent-based model that allows numerical simulation of magnetic nanowire networks with areas of hundreds of micrometers squared for durations of seconds, a practical impossibility for general-purpose micromagnetic tools. In RingSim, domain walls (DWs) are instanced as mobile agents, which respond to external fields, and their stochastic interactions with pinning sites and other DWs are described via simple rules. We first present a detailed description of the model and its algorithmic...

10.1063/5.0251692 article EN cc-by Journal of Applied Physics 2025-04-01
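
A heavily simplified sketch of the agent-based idea (not the published RingSim model): domain walls as mobile agents on a one-dimensional ring, advanced by a rotating field and pinning/depinning stochastically at junction sites. All rules, probabilities, and geometry here are hypothetical.

```python
import random
from dataclasses import dataclass

@dataclass
class DomainWall:
    position: float          # angular position on the ring, in [0, 1)
    pinned: bool = False

def ring_dist(a: float, b: float) -> float:
    d = abs(a - b) % 1.0
    return min(d, 1.0 - d)  # shortest distance around the ring

def step(walls, junctions, field_step=0.02, p_pin=0.3, p_depin=0.1):
    for dw in walls:
        if dw.pinned:
            if random.random() < p_depin:      # stochastic depinning rule
                dw.pinned = False
            continue
        dw.position = (dw.position + field_step) % 1.0   # driven by the field
        if any(ring_dist(dw.position, j) < field_step for j in junctions):
            if random.random() < p_pin:        # stochastic pinning at a junction
                dw.pinned = True

walls = [DomainWall(random.random()) for _ in range(4)]
junctions = [0.0, 0.25, 0.5, 0.75]             # hypothetical junction sites
for _ in range(100):
    step(walls, junctions)
print([(round(dw.position, 2), dw.pinned) for dw in walls])
```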

This paper considers the use of Extreme Value Theory (EVT) to model worst-case execution times. In particular, it considers the sacrifice that statistical methods make in the realism of their models in order to provide generality and precision, and whether this can impact the safety of the model. The Gumbel distribution is assessed in terms of its assumption of continuous behaviour and its need for independent and identically distributed data. To ensure that predictions made by EVT estimations are safe, additional restrictions on its use are proposed and justified.

10.4230/oasics.wcet.2010.44 article EN Worst-Case Execution Time Analysis 2010-01-01
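
A minimal block-maxima Gumbel fit of the kind the paper scrutinises, assuming SciPy and synthetic stand-in measurements; the block size is arbitrary, and the i.i.d. checks and safety restrictions that are the paper's actual subject are deliberately omitted.

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(1)
times = rng.gamma(shape=9.0, scale=10.0, size=100_000)  # hypothetical execution times

block = 100
maxima = times.reshape(-1, block).max(axis=1)   # worst case observed per block
loc, scale = gumbel_r.fit(maxima)               # fit a Gumbel to the block maxima

# pWCET-style query: execution time exceeded with probability 1e-9 per block
print(gumbel_r.isf(1e-9, loc=loc, scale=scale))
```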

Abstract Devices based on arrays of interconnected magnetic nano-rings with emergent magnetization dynamics have recently been proposed for use in reservoir computing applications, but for them to be computationally useful it must be possible to optimise their dynamical responses. Here, we use a phenomenological model to demonstrate that such reservoirs can be optimised for classification tasks by tuning hyperparameters that control the scaling and input-rate of data into the system using rotating fields. We use task-independent...

10.1088/1361-6528/ac87b5 article EN cc-by Nanotechnology 2022-08-08

This paper describes the motivation, design, analysis and implementation of a new protocol for critical wireless communication called AirTight. Wireless communication has become a crucial part of the infrastructure of many cyber-physical applications. Many of these applications are real-time and also mixed-criticality, in that they have components/subsystems with different consequences of failure. Wireless communication is inevitably subject to levels of external interference. In this paper we represent interference using a criticality-aware fault model; each...

10.1109/rtcsa.2018.00017 article EN 2018-08-01

This paper introduces an effective Static Probabilistic Timing Analysis (SPTA) for multi-path programs. The analysis estimates the temporal contribution of an evict-on-miss, random replacement cache to the probabilistic Worst-Case Execution Time (pWCET) distribution. It uses a conservative join function that provides a proper overapproximation of the possible cache contents and of the pWCET on path convergence, irrespective of the actual path followed during execution. Simple program transformations are introduced to reduce the impact...

10.1109/rtss.2015.41 preprint EN 2015-12-01
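
The basic probabilistic argument underlying such analyses can be sketched as follows: in an N-way evict-on-miss random replacement cache, a cached block survives each subsequent miss with probability 1 - 1/N. The snippet shows only this elementary per-access bound, not the paper's multi-path analysis or join function.

```python
from fractions import Fraction

def hit_probability_lower_bound(ways: int, misses_since_load: int) -> Fraction:
    """Probability a block loaded into an N-way random replacement cache
    is still resident after k intervening misses: (1 - 1/N) ** k."""
    return Fraction(ways - 1, ways) ** misses_since_load

for k in range(5):
    print(k, float(hit_probability_lower_bound(4, k)))
```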

The analysis of random replacement caches is an area that has recently attracted considerable attention in the field of probabilistic real-time systems. A major problem with performing static analysis on such a cache is that the relatively large number of successor states on a miss (equal to the associativity) renders approaches such as Collecting Semantics intractable. Other approaches must contend with non-trivial behaviours, such as the non-independence of accesses to the cache, which tends to lead to overly pessimistic or computationally expensive analyses.

10.1145/2659787.2659809 article EN 2014-10-01

A key issue with Worst-Case Execution Time (WCET) analyses is the evaluation of the tightness and soundness of the results produced. In the absence of a ground truth, i.e. the Actual WCET (AWCET), such evaluations rely on comparisons between different estimates or observed values.

10.1145/2834848.2834858 preprint EN 2015-11-04

While there is significant interest in the use of COTS multicore platforms for Real-Time Systems, there has been very little in terms of practical methods to calculate the interference multiplier (i.e. the increase in execution time due to interference) between tasks on such systems. Such platforms present two distinct challenges: firstly, the variable demand of competing tasks on shared resources such as the cache, and secondly the complexity of the hardware mechanisms and policies used, which may result in a system that is difficult if not impossible to analyse - assuming that the exact details...

10.1145/3139258.3139275 article EN 2017-10-04
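
For concreteness, the interference multiplier itself is just the ratio of contended to solo execution time; the numbers below are hypothetical measurements, not results from the paper.

```python
solo_time = 10.0        # ms, task measured running alone
contended_time = 17.5   # ms, same task with cache-thrashing co-runners

multiplier = contended_time / solo_time
print(f"interference multiplier: {multiplier:.2f}x")   # 1.75x
print(f"inflated WCET budget: {solo_time * multiplier:.1f} ms")
```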

Abstract Physical reservoir computing (RC) is a machine learning technique that is ideal for the processing of time-dependent data series. It is also uniquely well-aligned to in materio realisations that allow the inherent memory and non-linear responses of functional materials to be directly exploited for computation. We have previously shown that square arrays of interconnected magnetic nanorings are attractive candidates for RC, and experimentally demonstrated their strong performance in a range of benchmark tasks (Dawidek et al 2021...

10.1088/2634-4386/ad53f9 article EN cc-by Neuromorphic Computing and Engineering 2024-06-01
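
A minimal software echo-state sketch of the RC paradigm the nanoring arrays realise physically: a fixed random reservoir supplies memory and non-linearity, and only a linear readout is trained (here by ridge regression). This is a generic illustration of the technique, not the paper's model of the magnetic system; all sizes and constants are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0, 1, (n_res, n_res))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()   # spectral radius below 1 for fading memory

def run_reservoir(u):
    """Drive the fixed random reservoir with input series u, collect states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy memory benchmark: reproduce the input delayed by 5 steps
u = rng.uniform(-1, 1, 500)
y = np.roll(u, 5)
X = run_reservoir(u)
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)  # trained readout
print("train MSE:", np.mean((X @ W_out - y) ** 2))
```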

Probabilistic hard real-time systems, based on hardware architectures that use a random replacement cache, provide a potential means of reducing the over-provision required to accommodate the pathological scenarios and associated extremely rare, but excessively long, worst-case execution times that can occur in deterministic systems. Timing analysis for probabilistic systems requires the provision of probabilistic worst-case execution time (pWCET) estimates. The pWCET distribution can be described as an exceedance function which gives an upper bound...

10.1007/s11241-017-9295-2 article EN cc-by Real-Time Systems 2017-12-18
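
A small sketch of how a pWCET estimate is used as an exceedance function (1 - CDF), assuming SciPy; the Gumbel parameters are hypothetical stand-ins for a fitted distribution.

```python
from scipy.stats import gumbel_r

pwcet = gumbel_r(loc=120.0, scale=8.0)   # hypothetical fitted pWCET distribution

def exceedance(t):
    """Upper bound on the probability that a single run exceeds time t."""
    return pwcet.sf(t)                   # survival function = 1 - CDF

print(exceedance(150.0))   # probability of exceeding a 150-unit budget
print(pwcet.isf(1e-12))    # budget whose exceedance probability is 1e-12
```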

This article describes the motivation, design, analysis, and configuration of the criticality-aware multi-hop wireless communication protocol AirTight. Wireless communication has become a crucial part of the infrastructure of many cyber-physical applications. Many of these applications are real-time and also mixed-criticality, in that they have components/subsystems with different consequences of failure. Wireless communication is inevitably subject to levels of external interference. In this article, we represent interference using a fault model; for each...

10.1145/3362987 article EN ACM Transactions on Cyber-Physical Systems 2019-12-12
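
A toy rendering of the criticality-aware fault-model idea: each criticality level assumes a worst-case number of corrupted slots, and slot reservations must cover the retransmissions that assumption implies. The fault loads and slot accounting below are invented for illustration and are not AirTight's actual configuration.

```python
# Hypothetical per-window fault assumptions by criticality level
FAULT_MODEL = {"LO": 1, "HI": 3}   # max corrupted slots assumed per window

def slots_needed(payload_slots: int, criticality: str) -> int:
    """Slots to reserve so the payload still arrives under the fault load
    assumed at this criticality level (one retransmission per lost slot)."""
    return payload_slots + FAULT_MODEL[criticality]

print(slots_needed(2, "LO"))  # 3
print(slots_needed(2, "HI"))  # 5
```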

This paper outlines how Lossy Compression, a branch of Information Theory relating to the compact representation of data while retaining important information, can be applied to the Worst Case Execution Time analysis problem. In particular, we show that by applying lossy compression to the structures involved in the collecting semantics of a given component, for example a PLRU cache, a useful analysis can be derived. While such an analysis could be found via other means, the application of Lossy Compression provides a formal method and eases the process of discovering...

10.1145/2659787.2659807 article EN 2014-10-01
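
For context, the sketch below simulates one 4-way tree-PLRU cache set, the kind of concrete component whose collecting semantics (sets of reachable states like these) grows intractably and motivates compression. It is a standard textbook tree-PLRU, not the paper's abstraction.

```python
class TreePLRU4:
    """One 4-way tree-PLRU cache set: ways 0..3 and three tree bits.
    Each bit points toward the pseudo-least-recently-used side (0 = left)."""

    def __init__(self):
        self.ways = [None] * 4
        self.bits = [0, 0, 0]   # [root, left subtree, right subtree]

    def _touch(self, i):
        # Flip bits along the path so they point AWAY from way i
        self.bits[0] = 1 if i < 2 else 0
        if i < 2:
            self.bits[1] = 1 if i == 0 else 0
        else:
            self.bits[2] = 1 if i == 2 else 0

    def access(self, block):
        if block in self.ways:               # hit: refresh tree bits
            self._touch(self.ways.index(block))
            return True
        # Miss: follow the bits to the pseudo-LRU way and replace it
        if self.bits[0] == 0:
            victim = 0 if self.bits[1] == 0 else 1
        else:
            victim = 2 if self.bits[2] == 0 else 3
        self.ways[victim] = block
        self._touch(victim)
        return False

set0 = TreePLRU4()
for addr in ["a", "b", "c", "d", "a", "e"]:
    print(addr, "hit" if set0.access(addr) else "miss")
```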

Abstract Timing verification of multi-core systems is complicated by contention for shared hardware resources between co-running tasks on different cores. This paper introduces the Multi-core Resource Stress and Sensitivity (MRSS) task model, which characterizes how much stress each task places on shared resources and how sensitive it is to such resource stress. This model facilitates a separation of concerns, thus retaining the advantages of the traditional two-step approach to timing verification (i.e. timing analysis followed by schedulability analysis). Response time analysis is derived...

10.1007/s11241-022-09377-8 article EN cc-by Real-Time Systems 2022-02-19
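
A simplified sketch of how a stress/sensitivity term can slot into classical fixed-priority response-time analysis, in the spirit of the model rather than the paper's exact equations: each task's execution time is inflated by its sensitivity times the worst-case stress from co-running cores. All field names and numbers are illustrative.

```python
import math

def response_time(task, higher_prio, S):
    """Fixed-point response-time recurrence with a stress-inflation term.
    S is the assumed worst-case resource stress from co-running cores;
    each task dict has WCET C, period T, and sensitivity x."""
    def c(t):
        return t["C"] + t["x"] * S           # stress-inflated execution time

    R = c(task)
    while True:
        R_next = c(task) + sum(math.ceil(R / t["T"]) * c(t) for t in higher_prio)
        if R_next == R:
            return R
        R = R_next

hp = [{"C": 2, "T": 10, "x": 0.5}, {"C": 3, "T": 20, "x": 1.0}]
print(response_time({"C": 5, "T": 50, "x": 2.0}, hp, S=1.0))  # 16.0
```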

A benefit of traditional static analysis approaches to single-criticality hard real-time systems is that the uncertainties, and hence the confidence, associated with timing requirements being met are better understood than with Measurement-Based Timing Analysis (MBTA) approaches. In brief, failures are mostly accounted for by human errors or random hardware failures. With the introduction of measurement-based approaches to help deal with more advanced processors, the situation is much more complex. The complexity comes from new sources...

10.1145/3394810.3394816 article EN 2020-06-09

This paper investigates an edge computing system where requests are processed by a set of replicated servers. We investigate a class of applications in which similar queries produce identical results. To reduce the processing overhead on the servers, we store the results of previous computations and return them when new queries are sufficiently similar to earlier ones that produced those results, avoiding the necessity of processing every query. We implement a similarity-based data classification system, which we evaluate based on real-world datasets of images and voice...

10.48550/arxiv.2405.17263 preprint EN arXiv (Cornell University) 2024-05-27
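
A minimal sketch of the reuse idea, assuming a cosine-similarity threshold over query feature vectors; the class, threshold, and data below are illustrative choices, not the paper's implementation.

```python
import numpy as np

class SimilarityCache:
    """Return a stored result when a new query's feature vector is close
    enough (cosine similarity above a threshold) to a cached one."""

    def __init__(self, threshold=0.95):
        self.threshold = threshold
        self.keys, self.values = [], []

    def lookup(self, vec):
        for k, v in zip(self.keys, self.values):
            sim = np.dot(k, vec) / (np.linalg.norm(k) * np.linalg.norm(vec))
            if sim >= self.threshold:
                return v          # close enough: reuse the earlier result
        return None               # miss: caller must compute and insert

    def insert(self, vec, result):
        self.keys.append(vec)
        self.values.append(result)

cache = SimilarityCache()
q1 = np.array([1.0, 0.0, 0.2])
cache.insert(q1, "label: cat")
q2 = np.array([0.98, 0.02, 0.21])   # near-duplicate query
print(cache.lookup(q2))             # -> "label: cat", no recomputation
```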

We describe 'RingSim', a phenomenological agent-based model that allows numerical simulation of magnetic nanowire networks with areas of hundreds of micrometers squared for durations of seconds; a practical impossibility for general-purpose micromagnetic tools. In RingSim, domain walls (DWs) are instanced as mobile agents which respond to external fields, and their stochastic interactions with pinning sites and other DWs are described via simple rules. We first present a detailed description of the model and its algorithmic...

10.48550/arxiv.2410.22204 preprint EN arXiv (Cornell University) 2024-10-29

Given that real-time systems are specified to a degree of confidence, budget overruns should be expected to occur in a system at some point. When an overrun occurs, it is necessary to understand how long such a state persists, in order to determine if the fault tolerance of the system is adequate to handle the problem. However, given the rarity of overruns in testing, it cannot be assumed that sufficient data will be available to build an accurate model. Hence this paper presents a new application of Markov Chain based modelling techniques combined with forecasting...

10.1145/2834848.2834870 preprint EN 2015-11-04
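
The simplest instance of such a model is a two-state chain (nominal/overrun) whose overrun state persists geometrically; the transition probability below is hypothetical, and the paper's approach combines richer chains with forecasting rather than this toy.

```python
import numpy as np

p_stay_overrun = 0.6   # hypothetical per-step probability the overrun continues

# Persistence of a state in a Markov chain is geometric:
# expected number of consecutive steps spent in the overrun state
expected_persistence = 1.0 / (1.0 - p_stay_overrun)
print(expected_persistence)   # 2.5 steps on average

# Monte-Carlo check against the closed form
rng = np.random.default_rng(0)
print(rng.geometric(1.0 - p_stay_overrun, size=100_000).mean())
```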