Rob Brennan

ORCID: 0000-0001-8236-362X
Research Areas
  • Semantic Web and Ontologies
  • Data Quality and Management
  • Service-Oriented Architecture and Web Services
  • Biomedical Text Mining and Ontologies
  • Privacy-Preserving Technologies in Data
  • Privacy, Security, and Data Protection
  • Advanced Database Systems and Queries
  • Mobile Agent-Based Network Management
  • Big Data and Business Intelligence
  • Access Control and Trust
  • Natural Language Processing Techniques
  • Artificial Intelligence in Healthcare and Education
  • COVID-19 Digital Contact Tracing
  • Cloud Data Security Solutions
  • Context-Aware Activity Recognition Systems
  • Geographic Information Systems Studies
  • Scientific Computing and Data Management
  • Peer-to-Peer Network Technologies
  • Data Mining Algorithms and Applications
  • Blockchain Technology Applications and Security
  • Multi-Agent Systems and Negotiation
  • Digital and Cyber Forensics
  • Distributed systems and fault tolerance
  • COVID-19 diagnosis using AI
  • Imbalanced Data Classification Techniques

University College Dublin
2021-2024

Dublin City University
1999-2022

Trinity College Dublin
2011-2022

Science Foundation Ireland
2010-2022

Weatherford College
2021

Flint Institute Of Arts
2021

Data Fusion International (Ireland)
2021

Tokyo Electron (Ireland)
2000-2011

Ericsson (Ireland)
2005-2006

Orange (France)
2006

The vast amount of knowledge about past human societies has not been systematically organized and, therefore, remains inaccessible for empirically testing theories of cultural evolution and historical dynamics. For example, what evolutionary mechanisms were involved in the transition from the small-scale, uncentralized societies in which humans lived 10,000 years ago to the large-scale societies with an extensive division of labor, great differentials of wealth and power, and elaborate governance structures of today? Why do...

10.21237/c7clio6127917 article EN cc-by Cliodynamics: The Journal of Quantitative History and Cultural Evolution 2015-07-04

This paper presents a new method for data augmentation called Stride Random Erasing Augmentation (SREA) to improve classification performance. In SREA, probability-based strides of one image are pasted onto another image, and the labels of both images are mixed in the same proportion as the image mixing to generate an augmented label. SREA overcomes the limitations of the popular random erasing method, where a portion of an image is erased with 0, 255, or the mean of the dataset without considering the location of the important feature(s) within the image. A variety of experiments have...

10.5121/csit.2022.120201 article EN 2022-01-29
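The mechanics described above can be sketched in a few lines of NumPy. This is a hypothetical illustration of the stride-and-mix idea only: the paper's exact stride pattern, probability schedule, and hyperparameters are not reproduced here, and the function name and defaults are assumptions.

```python
import numpy as np

def srea_mix(img_a, label_a, img_b, label_b, stride=8, p=0.5, rng=None):
    """Sketch of stride-based erasing/mixing: with probability p, each
    horizontal stride of img_b is pasted onto img_a, and the labels are
    mixed in the same proportion as the pasted area."""
    rng = rng or np.random.default_rng()
    out = img_a.astype(float).copy()
    pasted_rows = 0
    h = out.shape[0]
    for top in range(0, h, stride):
        if rng.random() < p:
            bottom = min(top + stride, h)
            out[top:bottom] = img_b[top:bottom]   # paste stride from img_b
            pasted_rows += bottom - top
    lam = pasted_rows / h                         # fraction taken from img_b
    mixed_label = (1 - lam) * np.asarray(label_a) + lam * np.asarray(label_b)
    return out, mixed_label
```

Mixing the labels by the pasted-area fraction keeps the target consistent with the pixel content, unlike plain random erasing, which blanks regions while leaving the label untouched.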

Three key challenges to a whole-system approach to process improvement in health systems are the complexity of socio-technical activity, the capacity to change purposefully, and the consequent need to proactively manage and govern the system. The literature on healthcare improvement demonstrates the persistence of these problems. In this project, the Access-Risk-Knowledge (ARK) Platform, which supports the implementation of improvement projects, was deployed across three healthcare organisations to address risk management for the prevention and control of healthcare-associated...

10.3390/ijerph182312572 article EN International Journal of Environmental Research and Public Health 2021-11-29

Forged documents, specifically passports, driving licences, and VISA stickers, are used for fraud purposes including robbery, theft, and many more. Detecting forged characters from such documents is therefore a significantly important and challenging task in digital forensic imaging. Forged character detection has two big challenges. The first challenge is that the data are extremely difficult to obtain, due to several reasons: limited access to data, unlabeled data, or work done on private data. The second is that deep learning (DL) algorithms require labeled data, which poses a further challenge as...

10.5121/ijaia.2022.13202 article EN International Journal of Artificial Intelligence & Applications 2022-03-31

Deep learning (DL) algorithms have shown significant performance in various computer vision tasks. However, having limited labelled data can lead to a network overfitting problem, where performance is bad on unseen data as compared to training data. Consequently, it limits performance improvement. To cope with this problem, various techniques have been proposed, such as dropout, normalization, and advanced data augmentation. Among these, data augmentation, which aims to enlarge the dataset size by including sample diversity, has been a hot topic in recent times. In this article,...

10.48550/arxiv.2301.02830 preprint EN other-oa arXiv (Cornell University) 2023-01-01

This paper describes a semi-automated process, framework, and tools for harvesting, assessing, improving, and maintaining high-quality linked data. The framework, known as DaCura, provides dataset curators, who may not be knowledge engineers, with tools to collect and curate evolving linked data datasets that maintain quality over time. The framework encompasses a novel workflow and architecture. A working implementation has been produced and applied, firstly to the publication of an existing social-sciences dataset, and then to the harvesting...

10.4018/ijswis.2014040103 article EN International Journal on Semantic Web and Information Systems 2014-04-01

This paper analyses the requirements of a blockchain-based data governance model for COVID-19 digital health certificates. Recognizing a gap in the existing literature, it aims to answer the research question "To what extent does a blockchain-based data governance model for COVID-19 digital health certificates in the EU meet relevant legal, technical, ethical and security requirements?" It identifies the required standards and develops a novel framework to determine the viability of blockchain as a data governance model. The results of our evaluation indicate that while a private permissioned blockchain can meet these requirements to some degree, the element...

10.1016/j.procs.2021.12.303 article EN Procedia Computer Science 2022-01-01

This paper identifies management capabilities for data value chains as a gap in current research. It specifies a capability framework and a first data value chain monitoring capability maturity model (CMM). This CMM will enable organisations to identify and measure the state of their data value chain processes, and show how to take steps to enhance them in order to exploit the full potential of data for the organisation. This new approach is needed since, despite the success of Big Data and the appeal of the data-driven enterprise, there is little evidence-based guidance on maximising data value creation. To date, most...

10.5220/0006684805730584 article EN cc-by-nc-nd 2018-01-01

Over the years, the paradigm of medical image analysis has shifted from manual expertise to automated systems, often using deep learning (DL) systems. The performance of DL algorithms is highly dependent on data quality. Particularly for the medical domain, this is an important aspect, as the domain is very sensitive to data quality and poor quality can lead to misdiagnosis. To improve diagnostic performance, research has been done both in complex DL architectures and in improving dataset quality using static hyperparameters. However, performance is still constrained due to overfitting...

10.56541/fumf3414 article EN 2022-08-29

Data is central to modern decision making and value creation. Society creates, consumes, and collects data at an increasing pace. Despite advances in processing power, data is expensive to maintain and curate. So, it is imperative to have methods and tools to distinguish between data based on its value. Yet, there is no consensus on what characterises the value of data or how this value should be assessed. This results in heterogeneous value models and inconsistent measurement techniques that are siloed in specific application domains. This limits the formalisation...

10.1109/access.2023.3315588 article EN cc-by IEEE Access 2023-01-01

Regardless of which networking protocols or technologies form the core of the future Internet, it is clear that the environment as a whole will need to support a very broad range of business and user interaction modes. In today's Internet we observe a growing trend for services to be both provided and consumed by loosely coupled value networks of consumers, providers, and combined consumer/providers. In this paper we argue that this trend has major implications for network management in the future Internet. In particular, we discuss six research challenges we believe must be addressed...

10.1109/inmw.2009.5195942 article EN 2009-06-01

This paper describes the development and execution of a data value assessment survey of professionals and academics. Its purpose was to explore more effective data value assessment techniques and to better understand the perceived relative importance of data value dimensions for practitioners. This is important because, despite the current deep interest in data value, there is a lack of clear understanding of how individual dimensions contribute to a holistic model of data value. A total of 34 datasets were assessed in a field study of 20 organisations in a range of sectors from finance to aviation. It found...

10.5220/0007723402000207 article EN cc-by-nc-nd 2019-01-01

The GDPR requires Data Controllers and Data Protection Officers (DPO) to maintain a Register of Processing Activities (ROPA) as part of overseeing the organisation's compliance processes. A ROPA must include information from heterogeneous sources such as (internal) departments with varying IT systems and (external) data processors. Current practices use spreadsheets or proprietary systems that lack machine-readability and interoperability, presenting barriers to automation. We propose the Data Processing Catalogue (DPCat) for the representation,...

10.3390/info13050244 article EN cc-by Information 2022-05-10
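To make the machine-readability point concrete, a ROPA entry can be modelled as a typed record and serialized to an RDF catalogue form instead of a spreadsheet row. The field names, prefixes, and serialization below are illustrative assumptions for a DCAT-style record; they are not the published DPCat terms.

```python
from dataclasses import dataclass

@dataclass
class RopaEntry:
    """One Register of Processing Activities record (GDPR Art. 30).
    Field names are illustrative, not the actual DPCat vocabulary."""
    activity_id: str
    purpose: str
    data_categories: list
    processor: str

    def to_turtle(self) -> str:
        # Emit a DCAT-style dataset record in Turtle; the ex:, dcat:,
        # and dct: prefixes are assumed to be bound elsewhere.
        cats = ", ".join(f'"{c}"' for c in self.data_categories)
        return (
            f"ex:{self.activity_id} a dcat:Dataset ;\n"
            f'    dct:title "{self.purpose}" ;\n'
            f"    ex:dataCategory {cats} ;\n"
            f'    ex:processor "{self.processor}" .'
        )

entry = RopaEntry("payroll-001", "Payroll processing",
                  ["salary", "bank account"], "ACME Payroll Ltd")
print(entry.to_turtle())
```

Once entries from departments and external processors share one such schema, aggregation and compliance checks become straightforward queries rather than manual spreadsheet merges.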

In this paper we define DaVe: a data value vocabulary that allows for the comprehensive representation of data value. This enables users to extend it using data value dimensions as required in the context at hand. DaVe caters for the lack of consensus on what characterises data value, and also on how to model it. The vocabulary will allow users to monitor and assess data value throughout any value creating or exploitation efforts, therefore laying the basis for effective data value management and efficient exploitation. It enables the integration of diverse value metrics that span many dimensions, most of which likely pertain to a range...

10.5220/0006777701330144 article EN cc-by-nc-nd 2018-01-01
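The extension mechanism described above can be illustrated with a small registry: a core model that users extend with whatever value dimensions and metrics their context requires. The class, dimension names, and metric functions here are hypothetical stand-ins for the actual DaVe vocabulary terms.

```python
# Illustrative sketch of DaVe's extension idea: a core model extended
# with context-specific value dimensions and metrics.
class DataValueModel:
    def __init__(self):
        self.dimensions = {}      # dimension name -> {metric name: fn}

    def add_metric(self, dimension, metric, fn):
        """Register a metric function under a value dimension."""
        self.dimensions.setdefault(dimension, {})[metric] = fn

    def assess(self, dataset):
        """Return a value profile: dimension -> metric -> score."""
        return {dim: {m: fn(dataset) for m, fn in metrics.items()}
                for dim, metrics in self.dimensions.items()}

model = DataValueModel()
model.add_metric("usage", "monthly_queries", lambda d: d["queries"])
model.add_metric("quality", "completeness",
                 lambda d: 1 - d["null_rows"] / d["rows"])

profile = model.assess({"queries": 120, "rows": 1000, "null_rows": 50})
print(profile)
```

Because metrics are pluggable, heterogeneous measures from different domains can be integrated into one value profile and monitored over time, which is the role the vocabulary plays for RDF data.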