- Privacy-Preserving Technologies in Data
- Adversarial Robustness in Machine Learning
- Domain Adaptation and Few-Shot Learning
- Antenna Design and Analysis
- Laser-induced spectroscopy and plasma
- Stochastic Gradient Optimization Techniques
- Topic Modeling
- Advanced Neural Network Applications
- Cryptography and Data Security
- Advanced Antenna and Metasurface Technologies
- Energy Harvesting in Wireless Networks
- Geophysical Methods and Applications
- Multimodal Machine Learning Applications
- Conducting polymers and applications
- Advanced Image and Video Retrieval Techniques
- Advanced Sensor and Energy Harvesting Materials
- Spinal Cord Injury Research
- Age of Information Optimization
- Mobile Crowdsensing and Crowdsourcing
- Recommender Systems and Techniques
- Music and Audio Processing
- Explainable Artificial Intelligence (XAI)
- 3D Printing in Biomedical Research
- Access Control and Trust
- Distributed and Parallel Computing Systems
- Guangzhou University (2024)
- King Abdullah University of Science and Technology (2023-2024)
- Zhejiang Sci-Tech University (2021-2023)
- Jiangsu Maritime Institute (2023)
- Hong Kong Polytechnic University (2022-2023)
- Kootenay Association for Science & Technology (2023)
- Wuhan University (2023)
- Renmin Hospital of Wuhan University (2023)
- Xinjiang University (2023)
- Beijing Sport University (2023)
We explore the problem of selectively forgetting categories from a trained CNN classification model in federated learning (FL). Given that the data used for training cannot be accessed globally in FL, our insights probe deep into the internal influence of each channel. By visualizing the feature maps activated by different channels, we observe that channels make varying contributions to image classification.
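As a sketch of the channel-contribution idea above (not necessarily the paper's actual method), the snippet below ranks a convolutional layer's channels by their mean activation on images of one class; top-ranked channels would be the natural candidates to ablate when that class must be forgotten. The function name and the simple mean-activation score are illustrative assumptions.

```python
import numpy as np

def channel_contribution(feature_maps, labels, target_class):
    """Score each channel by its mean activation on images of one class.

    feature_maps: array of shape (N, C, H, W) -- activations of one CNN layer
    labels:       array of shape (N,)         -- image labels
    Returns channel indices ordered from most to least influential for
    target_class; high-scoring channels are pruning candidates when
    forgetting that class.
    """
    mask = labels == target_class
    # Average activation per channel over the target class's images
    per_channel = feature_maps[mask].mean(axis=(0, 2, 3))  # shape (C,)
    # Rank channels from most to least influential
    return np.argsort(per_channel)[::-1]
```

A usage note: in practice the feature maps would be collected by a forward hook on the chosen layer; here any (N, C, H, W) array works.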
Federated Learning (FL) is susceptible to gradient leakage attacks, as recent studies show the feasibility of obtaining private training data on clients from publicly shared gradients. Existing work solves this problem by incorporating a series of privacy protection mechanisms, such as homomorphic encryption and local differential privacy, to prevent leakage. However, these solutions either incur significant communication and computation costs or suffer accuracy loss. In this paper, we show that the sensitivity changes w.r.t. an...
Quick global aggregation of effective distributed parameters is crucial to federated learning (FL), which requires adequate bandwidth for communication and sufficient user data for local training. Otherwise, FL may cost excessive training time to converge and produce inaccurate models. In this paper, we propose a brand-new framework, PromptFL, that replaces federated model training with federated prompt training, i.e., letting participants train prompts instead of a shared model, to simultaneously achieve efficient training on insufficient data by...
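A minimal sketch of the server side of a PromptFL-style round, assuming each participant uploads only its learned prompt vectors (e.g. a few context vectors for a frozen CLIP text encoder) and the server performs FedAvg-style weighted averaging. Function name and shapes are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def aggregate_prompts(client_prompts, client_sizes):
    """FedAvg-style server aggregation over client prompts.

    client_prompts: list of arrays, all of shape (n_ctx, dim) -- one
                    learned prompt per client (the frozen backbone is
                    never communicated)
    client_sizes:   list of ints -- local sample counts, used as weights
    Returns the new global prompt of shape (n_ctx, dim).
    """
    weights = np.asarray(client_sizes, dtype=float)
    weights /= weights.sum()                      # normalize to a convex combination
    stacked = np.stack(client_prompts)            # (K, n_ctx, dim)
    return np.tensordot(weights, stacked, axes=1) # weighted average, (n_ctx, dim)
```

Because only the prompt (a few KB) crosses the network each round, this is where the bandwidth savings over full-model FL come from.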
Multimodal learning (MML) aims to jointly exploit the common priors of different modalities to compensate for their inherent limitations. However, existing MML methods often optimize a uniform objective across modalities, leading to the notorious "modality imbalance" problem and counterproductive performance. To address this problem, some methods modulate the learning pace based on the fused modality, which is dominated by the better modality and eventually results in limited improvement of the worse modality. To better exploit the features of multimodal data, we propose Prototypical...
Pre-trained vision-language models like CLIP show great potential in learning representations that capture latent characteristics of users. A recently proposed method called Context Optimization (CoOp) introduces the concept of training prompts for adapting pre-trained models. Given the lightweight nature of this method, researchers have migrated the paradigm from centralized to decentralized systems to innovate the collaborative framework of Federated Learning (FL). However, current FL methods mainly focus on modeling...
The application of wearable intelligent systems to human-computer interaction has received widespread attention. It remains desirable to conveniently promote health and monitor sports skills for disabled people. Here, a wireless sensing system (WISS) has been developed, which includes two parts: a wireless flexible triboelectric nanogenerator (WF-TENG) and an upper computer for digital signal receiving and processing. The WF-TENG is connected by the sensor and a flexible printed circuit (FPC). Due to its flexibility, it can be freely...
The Right to be Forgotten gives a data owner the right to revoke their data from an entity storing it. In the context of federated learning (FL), this requires that, in addition to the data itself, any influence of the data on the FL model must disappear, a process we call "federated unlearning." The most straightforward and legitimate way to implement unlearning is to remove the revoked data and retrain from scratch. Yet the computation and time overhead associated with fully retraining models can be prohibitively expensive. In this article, we take the first step to comprehensively...
The advent of self-powered arrays of triboelectric nanogenerators (TENGs) that harvest mechanical energy for data collection has ushered in a promising avenue for human motion monitoring. This emerging trend is poised to shape the future landscape of biomechanical study. However, monitoring various regions of the foot across disparate environments simultaneously poses a number of problems, such as a lack of satisfactory waterproofing, suboptimal heat resistance, inaccurate monitoring capacity, and...
Federated Learning (FL) has gained considerable attention recently, as it allows clients to cooperatively train a global machine learning model without sharing raw data. However, its performance can be compromised by the high heterogeneity in clients' local data distributions, commonly known as Non-IID (non-independent and identically distributed) data. Moreover, collaboration among highly dissimilar clients exacerbates this degradation. Personalized FL seeks to mitigate this by enabling clients to collaborate primarily with...
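One common way personalized FL realizes "collaborate primarily with similar clients" is to weight peers' contributions by similarity. The sketch below weights flattened model updates by the cosine similarity to a client's own update, passed through a softmax; this is an illustrative assumption, not necessarily the mechanism of the method abstracted above.

```python
import numpy as np

def personalized_aggregate(updates, me, temperature=1.0):
    """Similarity-weighted aggregation for one client.

    updates: (K, D) array of flattened model updates from K clients
    me:      index of the client computing its personalized aggregate
    Clients whose updates point in a similar direction (a proxy for
    similar data distributions) receive larger weights, softening the
    Non-IID degradation caused by averaging with dissimilar peers.
    """
    norms = np.linalg.norm(updates, axis=1, keepdims=True)
    unit = updates / np.clip(norms, 1e-12, None)   # guard zero updates
    sims = unit @ unit[me]                          # cosine similarity to own update
    weights = np.exp(sims / temperature)
    weights /= weights.sum()                        # softmax over peers
    return weights @ updates                        # personalized aggregate for `me`
```

Lowering `temperature` concentrates the weights on the most similar peers; raising it recovers plain uniform averaging.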
Recent studies have shown that training samples can be recovered from gradients through what are called Gradient Inversion (GradInv) attacks. However, there remains a lack of extensive surveys covering recent advances and providing thorough analysis of this issue. In this paper, we present a comprehensive survey on GradInv, aiming to summarize cutting-edge research and broaden horizons for different domains. Firstly, we propose a taxonomy of GradInv attacks by characterizing existing attacks into two paradigms: iteration-...
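The iteration-based paradigm mentioned above (e.g., DLG-style attacks) optimizes a dummy sample until its gradient matches the shared one. The toy sketch below does this for a one-sample linear model with loss 0.5*(w.x - y)^2, whose gradient w.r.t. w is (w.x - y)*x; hand-derived gradient-matching updates stand in for an autograd framework, and all names and step sizes are illustrative.

```python
import numpy as np

def gradinv_attack(w, g_true, steps=2000, lr=0.005, seed=0):
    """Recover a training sample from its shared gradient (toy sketch).

    Model: loss 0.5*(w.x - y)^2, so the client's gradient is
    g = (w.x - y) * x. The attacker knows w and g and descends on the
    gradient-matching objective D = ||g_hat - g||^2 over a dummy
    sample (x_hat, y_hat), the core loop of iteration-based GradInv.
    """
    rng = np.random.default_rng(seed)
    x_hat = rng.normal(size=w.shape)     # random dummy input
    y_hat = 0.0                          # dummy label
    for _ in range(steps):
        r = w @ x_hat - y_hat            # dummy residual
        e = r * x_hat - g_true           # gradient-matching error
        # hand-derived gradients of D = ||r*x_hat - g_true||^2
        grad_x = 2 * (w * (e @ x_hat) + r * e)
        grad_y = -2 * (e @ x_hat)
        x_hat -= lr * grad_x
        y_hat -= lr * grad_y
    r = w @ x_hat - y_hat
    e = r * x_hat - g_true
    return x_hat, y_hat, float(e @ e)    # dummy sample + final matching loss
```

Real attacks replace this toy model with a deep network and autograd, but the structure (descend on a gradient-matching distance) is the same.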
Federated Learning (FL) has emerged as a privacy-preserving paradigm enabling collaborative model training among distributed clients. However, current FL methods operate under a closed-world assumption, i.e., all local data originates from a global labeled dataset that is balanced across classes, which is often invalid in practical scenarios. In contrast, in many open-world settings, data have been observed to exhibit heavy-tailed distributions, particularly in the realm of mobile computing and the Internet of Things...
Federated Learning (FL) is an emerging paradigm that enables distributed users to collaboratively and iteratively train machine learning models without sharing their private data. Motivated by the effectiveness and robustness of self-attention-based architectures, researchers are turning to pre-trained Transformers (i.e., foundation models) instead of traditional convolutional neural networks in FL to leverage their excellent transfer capabilities. Despite recent progress, how Transformers play a role...
Theoretical hydrodynamic simulation of the orthogonal double pulse shows that the signal enhancement of the reheating mode is better when the inter-pulse interval is short, whereas, in contrast, the pre-ablation mode is better at longer intervals.
We proposed a theoretical spatio-temporal imaging method based on the thermal model of laser ablation and a two-dimensional axisymmetric multi-species hydrodynamics model. By using the intensity formula, integral spectral lines could be calculated and the corresponding distribution images drawn. Through further image processing, such as normalization, determination of a minimum intensity, and a combination of color filtering, relatively clear images of the species in the plasma were obtained. Using the above, we simulated...
The recent success of pre-trained language models (PLMs) such as BERT has resulted in the development of various beneficial database middlewares, including natural language query interfaces and entity matching. This shift has been greatly facilitated by the extensive external knowledge of PLMs. However, PLMs are often provided by untrusted third parties, and their lack of standardization and regulation poses significant security risks that have yet to be fully explored. This paper investigates the threats posed by malicious PLMs to these emerging...
Major manufacturers and retailers are increasingly using RFID systems in supply-chain scenarios, where theft of goods during transport typically causes significant economic losses for the consumer. This paper studies how to achieve time-efficient and secure integrity authentication in RFID systems. We start with a straightforward solution called SecAuth, which uses identity information stored in the tags' reserved memory to authenticate them. We then propose the time-efficient KTAuth protocol, which designs a verification chain...
The large language models (LLMs) that everyone uses are not deployed locally; users must send relatively private and important data to the LLM when using it. Handing over such data causes people to worry, especially now that many have begun to use LLMs to deal with life and work affairs, and such concerns cannot be easily dispelled by guarantees or agreements. However, LLMs are often resource-intensive and computationally demanding, making the transition from server-side to device-side deployment difficult because the LLM's self-attention module...
Using pre-trained vision-language models like CLIP with federated prompt training has shown great potential in federated learning (FL), offering significant benefits in computation, communication, and privacy over existing frameworks. However, existing research overlooks the internal mechanisms underlying prompt tuning and complies with the traditional context-unaware paradigm. Our experiments, on the other hand, demonstrate that prompting is a data-efficient but data-sensitive paradigm; therefore, the samples involved in the process...