Axel Oehmichen

ORCID: 0000-0003-0224-5709
Research Areas
  • Scientific Computing and Data Management
  • Misinformation and Its Impacts
  • Gene expression and cancer classification
  • European and International Law Studies
  • Bioinformatics and Genomic Networks
  • Anomaly Detection Techniques and Applications
  • Social Media and Politics
  • Research Data Management Practices
  • Media Influence and Politics
  • Advanced Neural Network Applications
  • Spam and Phishing Detection
  • Explainable Artificial Intelligence (XAI)
  • Opinion Dynamics and Social Influence
  • Complex Network Analysis Techniques
  • Human Mobility and Location-Based Analysis
  • Data Visualization and Analytics
  • Computational Physics and Python Applications
  • COVID-19 diagnosis using AI
  • Artificial Intelligence in Healthcare
  • Distributed and Parallel Computing Systems
  • Generative Adversarial Networks and Image Synthesis
  • COVID-19 epidemiological studies
  • COVID-19 Digital Contact Tracing
  • Mobile Crowdsensing and Crowdsourcing
  • Hong Kong and Taiwan Politics

Imperial College London
2014-2019

Deep learning has enabled major advances in the fields of computer vision, natural language processing, and multimedia, among many others. Developing a deep learning system is arduous and complex, as it involves constructing neural network architectures, managing training/trained models, tuning the optimization process, preprocessing and organizing data, etc. TensorLayer is a versatile Python library that aims at helping researchers and engineers efficiently develop deep learning systems. It offers rich abstractions for neural networks, model...

10.1145/3123266.3129391 preprint EN Proceedings of the 30th ACM International Conference on Multimedia 2017-10-19
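The boilerplate the abstract refers to can be seen in a minimal hand-rolled example. The sketch below (illustrative only, plain NumPy, not TensorLayer's actual API) trains a tiny two-layer network end to end; libraries like TensorLayer wrap exactly this kind of parameter initialization, forward pass, gradient computation, and update loop behind layer and model abstractions.

```python
import numpy as np

# Toy problem: classify whether the coordinates of a point sum to > 0.
rng = np.random.default_rng(42)
X = rng.normal(size=(64, 3))
y = (X.sum(axis=1, keepdims=True) > 0).astype(float)

# Manual parameter initialization: 3 -> 8 (tanh) -> 1 (sigmoid).
W1 = rng.normal(scale=0.5, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

lr = 0.5
for _ in range(500):
    h = np.tanh(X @ W1 + b1)                 # hidden layer
    p = 1 / (1 + np.exp(-(h @ W2 + b2)))     # sigmoid output
    grad_out = (p - y) / len(X)              # dLoss/dlogit for cross-entropy
    grad_h = (grad_out @ W2.T) * (1 - h**2)  # backprop through tanh
    W2 -= lr * (h.T @ grad_out); b2 -= lr * grad_out.sum(axis=0)
    W1 -= lr * (X.T @ grad_h);   b1 -= lr * grad_h.sum(axis=0)

acc = ((p > 0.5) == (y > 0.5)).mean()
```

Every line here is something a deep-learning library would otherwise manage: the layer shapes, the activation choices, the gradient bookkeeping, and the optimizer step.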

Knowledge graphs are an increasingly important source of data and context information in Data Science. A first step in their analysis is exploration, in which visualization plays a key role. Currently, Semantic Web technologies are prevalent for modeling and querying knowledge graphs; however, most approaches in this area tend to be overly simplified and targeted at small-sized representations. In this work, we describe and evaluate the performance of a Big Data architecture applied to large-scale knowledge graph visualization. To do so, we have...

10.1016/j.future.2018.06.015 article EN cc-by Future Generation Computer Systems 2018-06-30
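The subject-predicate-object model the abstract refers to can be illustrated with a toy in-memory triple store and a SPARQL-style pattern match. This is a didactic sketch only; the example triples and the `match` helper are invented for illustration and are not from the paper or any Semantic Web library.

```python
# A knowledge graph as a set of (subject, predicate, object) triples.
triples = {
    ("TensorLayer", "type", "Library"),
    ("TensorLayer", "writtenIn", "Python"),
    ("Imperial", "type", "University"),
}

def match(pattern):
    """Match an (s, p, o) pattern against the store; None is a wildcard,
    mirroring a variable in a SPARQL basic graph pattern."""
    s, p, o = pattern
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Roughly: SELECT ?s WHERE { ?s type Library }
libs = match((None, "type", "Library"))
```

Real triple stores answer such patterns over billions of triples with indexes and joins; the visualization challenge the paper addresses starts where result sets grow far beyond what a naive scan or a small force-directed layout can handle.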

High-throughput molecular profiling data has been used to improve clinical decision making by stratifying subjects based on their molecular profiles. Unsupervised clustering algorithms can be used for stratification purposes. However, the current speed of the clustering algorithms cannot meet the requirements of large-scale applications due to the poor performance of the correlation matrix calculation. With high-throughput sequencing technologies promising to produce even larger datasets per subject, we expect the state-of-the-art statistical algorithms to be further impacted unless...

10.1186/s12859-014-0351-9 article EN cc-by BMC Bioinformatics 2014-11-04
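The correlation matrix calculation that the abstract identifies as the bottleneck reduces to a single dense matrix product on standardized data, which is why it benefits so strongly from optimized (and parallel) linear algebra. The sketch below is a generic illustration of that identity, not the paper's implementation.

```python
import numpy as np

def correlation_matrix(X):
    """Pearson correlation between the rows of X (e.g. genes x samples):
    center each row, scale to unit norm, then one matrix multiply."""
    Z = X - X.mean(axis=1, keepdims=True)
    Z /= np.linalg.norm(Z, axis=1, keepdims=True)
    return Z @ Z.T

# Example: 4 expression profiles measured over 6 samples.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 6))
C = correlation_matrix(X)
```

Because the whole computation is one GEMM call, its cost scales cubically with profile count but maps directly onto multithreaded BLAS or GPU kernels, which is the kind of speedup large-scale stratification needs.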

We investigated whether and how political misinformation is engineered, using a dataset of four months' worth of tweets related to the 2016 presidential election in the United States. The data contained tweets that achieved a significant level of exposure and was manually labelled into misinformation and regular information. We found that misinformation is produced by accounts that exhibit different characteristics and behaviour from regular accounts. Moreover, misinformation content is more novel, more polarised, and appears to change through coordination. Our findings suggest that misinformation engineering seems to exploit...

10.1109/access.2019.2938389 article EN cc-by IEEE Access 2019-01-01

Data is the foundation of any scientific, industrial, or commercial process. Its journey flows from collection to transport, storage, and processing. While best practices and regulations guide its management and protection, recent events have underscored their vulnerabilities. Academic research data handling has been marred by scandals, revealing the brittleness of data management. Data is susceptible to undue disclosures, leaks, losses, manipulation, and fabrication. These incidents often occur without visibility...

10.1017/dap.2024.68 article EN cc-by-nc-nd Data & Policy 2024-01-01

Mobile phones and other ubiquitous technologies are generating vast amounts of high-resolution location data. This data has been shown to have great potential for the public good, e.g. to monitor human migration during crises or predict the spread of epidemic diseases. Location data is, however, considered one of the most sensitive types of data, and a large body of research has shown the limits of traditional anonymization methods for big data. Privacy concerns have so far strongly limited the use of location data collected by telcos, especially in developing countries. In...

10.1109/bigdata47090.2019.9006389 article EN 2019 IEEE International Conference on Big Data (Big Data) 2019-12-01
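What "traditional anonymization" of location data means can be made concrete with spatial generalization: snap GPS points to a coarse grid and check k-anonymity of the resulting cells. This is a minimal illustrative sketch with invented coordinates; the body of work the abstract cites shows that for mobility traces such coarsening is typically insufficient to prevent re-identification.

```python
from collections import Counter

def generalize(points, cell=0.01):
    """Round (lat, lon) pairs to a grid of `cell` degrees (~1 km)."""
    return [(round(lat / cell) * cell, round(lon / cell) * cell)
            for lat, lon in points]

def is_k_anonymous(cells, k):
    """Every generalized cell must be shared by at least k records."""
    return all(c >= k for c in Counter(cells).values())

# Two nearby points fall into the same cell; the third is isolated.
pts = [(51.4986, -0.1749), (51.4991, -0.1741), (51.5074, -0.1278)]
cells = generalize(pts)
```

The weakness the research highlights is that a *sequence* of such cells per person is itself highly identifying, so per-point generalization alone does not protect mobility traces.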

Translational research is quickly becoming a science driven by big data. Improving patient care and developing personalized therapies and new drugs depend increasingly on an organization's ability to rapidly and intelligently leverage complex molecular and clinical data from a variety of large-scale partner and public sources. As analysing these datasets becomes computationally expensive, traditional analytical engines are struggling to provide a timely answer to the questions that biomedical scientists are asking....

10.1109/bigdata.2017.8257945 article EN 2017 IEEE International Conference on Big Data (Big Data) 2017-12-01

This article presents a preliminary approach towards characterizing political fake news on Twitter through the analysis of their meta-data. In particular, we focus on more than 1.5M tweets collected on the day of the election of Donald Trump as 45th president of the United States of America. We use the meta-data embedded within those tweets in order to look for differences between tweets containing fake news and tweets not containing them. Specifically, we perform our analysis only on tweets that went viral, by studying proxies for users' exposure to the tweets, the accounts spreading the fake news, and looking at...

10.48550/arxiv.1712.05999 preprint EN other-oa arXiv (Cornell University) 2017-01-01

Data is the foundation of any scientific, industrial or commercial process. Its journey typically flows from collection to transport, storage, management and processing. While best practices and regulations guide data protection, recent events have underscored its vulnerability. Academic research data handling has been marred by scandals, revealing the brittleness of data management. Data, despite its importance, is susceptible to undue disclosures, leaks, losses, manipulation and fabrication. These incidents often occur without...

10.48550/arxiv.2407.14390 preprint EN arXiv (Cornell University) 2024-07-19

Translational biomedical research has become a science driven by big data. Improving patient care and developing personalized therapies and new drugs depend increasingly on an organization's ability to rapidly and intelligently leverage complex molecular and clinical data from a variety of large-scale internal and external, partner and public, sources. As analysing these datasets is computationally expensive, it is of paramount importance to enable researchers to seamlessly scale up their computation platform while being...

10.1109/icdcs.2018.00167 article EN 2018-07-01

Epidemiology models play a key role in understanding and responding to the COVID-19 pandemic. In order to build those models, scientists need to understand the contributing factors and their relative importance. A large strand of literature has identified the importance of airflow to mitigate droplet and far-field aerosol transmission risks. However, the specific factors leading to higher or lower contamination in various settings have not been clearly defined and quantified. As part of the MOAI project (https://moaiapp.com), we are developing...

10.48550/arxiv.2103.17096 preprint EN cc-by-nc-sa arXiv (Cornell University) 2021-01-01