Othmane Marfoq

ORCID: 0000-0002-0542-8925
Publications
Research Areas
  • Privacy-Preserving Technologies in Data
  • Stochastic Gradient Optimization Techniques
  • Internet Traffic Analysis and Secure E-voting
  • Adversarial Robustness in Machine Learning
  • Cryptography and Data Security
  • Open Education and E-Learning
  • Data-Driven Disease Surveillance
  • Big Data Technologies and Applications
  • Data Stream Mining Techniques
  • Ferroelectric and Negative Capacitance Devices
  • Mobile Crowdsensing and Crowdsourcing
  • Advanced Graph Neural Networks
  • Online Learning and Analytics
  • Access Control and Trust
  • Distributed Sensor Networks and Detection Algorithms

Institut national de recherche en informatique et en automatique
2023

Accenture (Switzerland)
2021-2023

Observatoire de la Côte d’Azur
2021

Université Côte d'Azur
2021

The increasing size of data generated by smartphones and IoT devices motivated the development of Federated Learning (FL), a framework for on-device collaborative training of machine learning models. First efforts in FL focused on learning a single global model with good average performance across clients, but such a model may be arbitrarily bad for a given client, due to the inherent heterogeneity of local data distributions. Federated multi-task learning (MTL) approaches can learn personalized models by formulating an opportune penalized optimization...

10.48550/arxiv.2108.10252 preprint EN other-oa arXiv (Cornell University) 2021-01-01
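The penalized multi-task formulation alluded to above can be illustrated with a generic federated MTL objective; the notation below (local risks $F_i$, personalized models $\theta_i$, similarity weights $w_{ij}$) is illustrative and not taken from the paper:

\[
\min_{\theta_1,\dots,\theta_n} \; \sum_{i=1}^{n} F_i(\theta_i) \;+\; \frac{\lambda}{2} \sum_{i \neq j} w_{ij}\, \lVert \theta_i - \theta_j \rVert^2
\]

Each client $i$ minimizes its local empirical risk $F_i$ over its own model $\theta_i$, while the penalty term (weighted by $\lambda$) couples clients whose local distributions are believed to be similar, so that personalization and collaboration are balanced in a single objective.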

Federated Learning (FL) is a novel approach enabling several clients holding sensitive data to collaboratively train machine learning models without centralizing their data. The cross-silo FL setting corresponds to the case of few ($2$--$50$) reliable clients, each holding medium to large datasets, and is typically found in applications such as healthcare, finance, or industry. While previous works have proposed representative datasets for cross-device FL, few realistic healthcare cross-silo FL datasets exist, thereby slowing algorithmic...

10.48550/arxiv.2210.04620 preprint EN other-oa arXiv (Cornell University) 2022-01-01

Federated learning usually employs a client-server architecture where an orchestrator iteratively aggregates model updates from remote clients and pushes back a refined model. This approach may be inefficient in cross-silo settings, as close-by data silos with high-speed access links can exchange information faster than with the orchestrator, which may become a communication bottleneck. In this paper we define the problem of topology design for cross-silo federated learning, using the theory of max-plus linear systems to compute the system...

10.48550/arxiv.2010.12229 preprint EN other-oa arXiv (Cornell University) 2020-01-01

Federated Learning (FL) enables multiple clients, such as mobile phones and IoT devices, to collaboratively train a global machine learning model while keeping their data localized. However, recent studies have revealed that the training phase of FL is vulnerable to reconstruction attacks, such as attribute inference attacks (AIA), where adversaries exploit exchanged messages and auxiliary public information to uncover the sensitive attributes of targeted clients. While these attacks have been extensively studied in the context...

10.1609/aaai.v39i15.33787 article EN Proceedings of the AAAI Conference on Artificial Intelligence 2025-04-11

The enormous amount of data produced by mobile and IoT devices has motivated the development of federated learning (FL), a framework allowing such devices (or clients) to collaboratively train machine learning models without sharing their local data. FL algorithms (like FedAvg) iteratively aggregate model updates computed by clients on their own datasets. Clients may exhibit different levels of participation, often correlated over time and with other clients. This paper presents the first convergence analysis for FedAvg-like...

10.1109/infocom53939.2023.10228876 article EN IEEE INFOCOM 2022 - IEEE Conference on Computer Communications 2023-05-17
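The FedAvg-style aggregation step described in the abstract above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function name and the representation of a model as a flat list of parameters are assumptions for clarity.

```python
# Hypothetical sketch of FedAvg-style server aggregation: the global model is a
# weighted average of client models, weighted by local dataset sizes.

def fedavg_aggregate(client_models, client_sizes):
    """Weighted average of client models (each a flat list of float parameters)."""
    total = sum(client_sizes)
    num_params = len(client_models[0])
    global_model = [0.0] * num_params
    for model, size in zip(client_models, client_sizes):
        weight = size / total  # client's share of the total data
        for j, param in enumerate(model):
            global_model[j] += weight * param
    return global_model

# Two clients with equally sized datasets: the aggregate is the plain average.
print(fedavg_aggregate([[1.0, 2.0], [3.0, 4.0]], [10, 10]))  # [2.0, 3.0]
```

In a real FL round the server would broadcast the aggregated model back to (a sample of) clients, who then compute new local updates; the heterogeneous, correlated participation studied in the paper corresponds to which clients contribute to each round.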

Federated learning allows clients to collaboratively learn statistical models while keeping their data local. Federated learning was originally used to train a unique global model to be served to all clients, but this approach might be sub-optimal when clients' local distributions are heterogeneous. In order to tackle this limitation, recent personalized federated learning methods train a separate model for each client while still leveraging the knowledge available at other clients. In this work, we exploit the ability of deep neural networks to extract high quality...

10.48550/arxiv.2111.09360 preprint EN other-oa arXiv (Cornell University) 2021-01-01

The enormous amount of data produced by mobile and IoT devices has motivated the development of federated learning (FL), a framework allowing such devices (or clients) to collaboratively train machine learning models without sharing their local data. FL algorithms (like FedAvg) iteratively aggregate model updates computed by clients on their own datasets. Clients may exhibit different levels of participation, often correlated over time and with other clients. This paper presents the first convergence analysis for FedAvg-like FL...

10.1109/tnet.2023.3324257 article EN IEEE/ACM Transactions on Networking 2023-10-23

Within the realm of privacy-preserving machine learning, empirical privacy defenses have been proposed as a solution to achieve satisfactory levels of training data privacy without a significant drop in model utility. Most existing defenses against membership inference attacks assume access to reference data, defined as an additional dataset coming from the same (or a similar) underlying distribution as the training data. Despite this common use, previous works are notably reticent about defining and evaluating reference data privacy. As gains in utility and/or...

10.56553/popets-2024-0031 article EN cc-by Proceedings on Privacy Enhancing Technologies 2023-10-22

Within the realm of privacy-preserving machine learning, empirical privacy defenses have been proposed as a solution to achieve satisfactory levels of training data privacy without a significant drop in model utility. Most existing defenses against membership inference attacks assume access to reference data, defined as an additional dataset coming from the same (or a similar) underlying distribution as the training data. Despite this common use, previous works are notably reticent about defining and evaluating reference data privacy. As gains in utility and/or...

10.48550/arxiv.2310.12112 preprint EN cc-by arXiv (Cornell University) 2023-01-01

The enormous amount of data produced by mobile and IoT devices has motivated the development of federated learning (FL), a framework allowing such devices (or clients) to collaboratively train machine learning models without sharing their local data. FL algorithms (like FedAvg) iteratively aggregate model updates computed by clients on their own datasets. Clients may exhibit different levels of participation, often correlated over time and with other clients. This paper presents the first convergence analysis for FedAvg-like...

10.48550/arxiv.2301.04632 preprint EN other-oa arXiv (Cornell University) 2023-01-01

Federated learning (FL) is an effective solution to train machine learning models on the increasing amount of data generated by IoT devices and smartphones while keeping such data localized. Most previous work on federated learning assumes that clients operate on static datasets collected before training starts. This approach may be inefficient because 1) it ignores new samples clients collect during training, and 2) it may require a potentially long preparatory phase for clients to collect enough data. Moreover, learning on static datasets may be simply impossible in scenarios with small...

10.48550/arxiv.2301.01542 preprint EN other-oa arXiv (Cornell University) 2023-01-01