Tamasha Malepathirana

ORCID: 0000-0001-7958-5184
Research Areas
  • Domain Adaptation and Few-Shot Learning
  • Generative Adversarial Networks and Image Synthesis
  • Single-cell and spatial transcriptomics
  • Digital Media Forensic Detection
  • COVID-19 diagnosis using AI
  • Advanced Image Processing Techniques
  • Multimodal Machine Learning Applications
  • Advanced Text Analysis Techniques
  • Cell Image Analysis Techniques
  • Software Testing and Debugging Techniques
  • Neural dynamics and brain function
  • Sentiment Analysis and Opinion Mining
  • Text and Document Classification Technologies
  • Face recognition and analysis
  • Stochastic Gradient Optimization Techniques
  • Fault Detection and Control Systems
  • Software System Performance and Reliability
  • Gene expression and cancer classification
  • Advanced Neural Network Applications
  • Data Visualization and Analytics
  • Rough Sets and Fuzzy Logic
  • Explainable Artificial Intelligence (XAI)
  • Web Data Mining and Analysis
  • Facial Nerve Paralysis Treatment and Research
  • Bioinformatics and Genomic Networks

The University of Melbourne
2021-2024

University of Moratuwa
2019

Catastrophic forgetting, the loss of old knowledge upon acquiring new knowledge, is a pitfall faced by deep neural networks in real-world applications. Many prevailing solutions to this problem rely on storing exemplars (previously encountered data), which may not be feasible in applications with memory limitations or privacy constraints. Therefore, the recent focus has been on Non-Exemplar based Class Incremental Learning (NECIL), where a model incrementally learns about classes without using any past...

10.1109/iccv51070.2023.01072 article EN 2021 IEEE/CVF International Conference on Computer Vision (ICCV) 2023-10-01

Longitudinal studies that continuously generate data enable the capture of temporal variations in experimentally observed parameters, facilitating the interpretation of results in a time-aware manner. We propose IL-VIS (incrementally learned visualizer), a new machine learning pipeline that incrementally learns and visualizes a progression trajectory representing longitudinal changes in such studies. At each sampling time point of an experiment, it generates a snapshot of the process on the data gathered thus far, a feature that is beyond the reach...

10.1038/s41598-024-63511-z article EN cc-by Scientific Reports 2024-06-12

The recent renaissance in generative models, driven primarily by the advent of diffusion models and iterative improvements in GAN methods, has enabled many creative applications. However, each advancement is also accompanied by a rise in the potential for misuse. In the arena of deepfake generation, this is a key societal issue. In particular, the ability to modify segments of videos using such techniques creates a new paradigm of deepfakes which are mostly real videos altered slightly to distort the truth. This has been under-explored by current...

10.1109/iccvw60793.2023.00048 article EN 2023-10-02

Aspect aggregation, or the grouping of similar aspects, is a vital phase in review summarization. Hierarchical aspect aggregation is needed due to the multi-granular nature of aspects. This paper presents a novel approach for hierarchical aspect aggregation, employing an amalgamation of domain-specific and domain-independent word embeddings along with agglomerative clustering to output a hierarchical structure. We evaluate our approach using both internal and external measures. Our results outperform the state of the art in aspect aggregation.

10.1109/icosc.2019.8665518 article EN 2019-01-01
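A toy sketch of the clustering step described above, under stated assumptions: the hypothetical aspect terms and their 2-D vectors stand in for the paper's combined domain-specific and domain-independent embeddings, and scikit-learn's `AgglomerativeClustering` plays the role of the hierarchical grouping.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering

# Hypothetical aspect terms with toy 2-D "embeddings"; a real system would
# derive these from pretrained word vectors.
aspects = ["battery", "charge", "screen", "display", "price", "cost"]
vectors = np.array([
    [0.90, 0.10], [0.85, 0.15],   # power-related terms
    [0.10, 0.90], [0.15, 0.85],   # screen-related terms
    [0.50, 0.50], [0.52, 0.48],   # price-related terms
])

# Average-linkage agglomerative clustering into three aspect groups.
model = AgglomerativeClustering(n_clusters=3, linkage="average")
labels = model.fit_predict(vectors)

# Collect terms by cluster label (labels themselves are arbitrary).
groups = {}
for term, lab in zip(aspects, labels):
    groups.setdefault(lab, []).append(term)
print(sorted(map(sorted, groups.values())))
```

Cutting the same dendrogram at different levels yields the multi-granular hierarchy the paper describes, e.g. merging the three groups into coarser super-aspects.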

Contemporary single-cell technologies produce data with a vast number of variables at a rapid pace, making large volumes of high-dimensional data available. The exploratory analysis of such high-dimensional data can be aided by intuitive low-dimensional visualizations. In this work, we investigate how both discrete and continuous structures in single-cell data are captured using the recently proposed dimensionality reduction method SONG, and compare the results with the commonly used methods UMAP and PHATE. Using simulated and real-world datasets,...

10.1109/cibcb49929.2021.9562805 article EN 2021-10-13
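A minimal sketch of the comparison workflow, with PCA standing in for SONG, UMAP, and PHATE (each of which requires its own package): simulate "cells" with a simple branching structure and project them to 2-D for visual inspection. The simulation parameters are illustrative assumptions, not the paper's data.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Simulate 300 cells x 50 genes: two branches diverging from a shared root,
# a crude stand-in for continuous differentiation trajectories.
t = rng.uniform(0, 1, size=300)              # pseudotime along the branch
branch = rng.integers(0, 2, size=300)        # which branch each cell is on
directions = rng.normal(size=(2, 50))        # one expression direction per branch
X = t[:, None] * directions[branch] + rng.normal(scale=0.05, size=(300, 50))

# Any of SONG / UMAP / PHATE would slot in here in place of PCA.
embedding = PCA(n_components=2).fit_transform(X)
print(embedding.shape)
```

In the 2-D embedding the two branches separate while the continuous pseudotime gradient is preserved along each, which is the kind of discrete-plus-continuous structure the paper evaluates.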

Neural growth is the process of growing a small neural network into a large one, and has been utilized to accelerate the training of deep networks. One crucial aspect is determining the optimal growth timing. However, few studies have investigated this systematically. Our study reveals that neural growth inherently exhibits a regularization effect, whose intensity is influenced by the chosen policy for growth timing. While this effect may mitigate the overfitting risk of a model, it may lead to a notable accuracy drop when the model underfits. Yet, current approaches have not addressed this issue...

10.48550/arxiv.2401.03104 preprint EN cc-by arXiv (Cornell University) 2024-01-01
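A sketch of a single growth step in the Net2WiderNet style, not necessarily this paper's scheme: widen a hidden layer by duplicating a unit and halving its outgoing weights, so the network computes the same function immediately after growing. The tiny two-layer net is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-layer net: x -> ReLU(W1 @ x) -> W2 @ h
W1 = rng.normal(size=(3, 4))   # 3 hidden units, 4 inputs
W2 = rng.normal(size=(2, 3))   # 2 outputs

def forward(W1, W2, x):
    return W2 @ np.maximum(W1 @ x, 0.0)

# Grow: duplicate hidden unit 0 and split its outgoing weights in half,
# so relu(h0)*c0 becomes relu(h0)*c0/2 + relu(h0)*c0/2 (function-preserving).
W1_big = np.vstack([W1, W1[0:1]])        # 4 hidden units now
W2_big = np.hstack([W2, W2[:, 0:1]])     # copy of unit 0's outgoing column
W2_big[:, 0] /= 2
W2_big[:, 3] /= 2

x = rng.normal(size=4)
same = np.allclose(forward(W1, W2, x), forward(W1_big, W2_big, x))
print(same)
```

Because the grown network starts out functionally identical, any later divergence in training dynamics comes from the added capacity and the growth timing, which is where the regularization effect the paper studies enters.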

Neural growth is the process of growing a small neural network into a large one, and has been utilized to accelerate the training of deep networks. One crucial aspect is determining the optimal growth timing. However, few studies have investigated this systematically. Our study reveals that neural growth inherently exhibits a regularization effect, whose intensity is influenced by the chosen policy for growth timing. While this effect may mitigate the overfitting risk of a model, it may lead to a notable accuracy drop when the model underfits. Yet, current approaches have not addressed this issue...

10.1609/aaai.v38i6.28414 article EN Proceedings of the AAAI Conference on Artificial Intelligence 2024-03-24

Face reenactment refers to the process of transferring the pose and facial expressions from a reference (driving) video onto a static facial (source) image while maintaining the original identity of the source image. Previous research in this domain has made significant progress by training controllable deep generative models to generate faces based on specific identity, pose and expression conditions. However, the mechanisms used in these methods to control pose and expression often inadvertently introduce identity information from the driving video, while also causing a loss...

10.48550/arxiv.2406.13272 preprint EN arXiv (Cornell University) 2024-06-19

Pruning can be an effective method of compressing large pre-trained models for inference speed acceleration. Previous pruning approaches rely on access to the original training dataset for both pruning and subsequent fine-tuning. However, access to training data can be limited due to concerns such as privacy and commercial confidentiality. Furthermore, with covariate shift (disparities between the test and training data distributions), pruning and fine-tuning with training datasets can hinder the generalization of the pruned model to test data. To address these issues, pruning and fine-tuning with test-time samples becomes essential....

10.48550/arxiv.2412.07114 preprint EN arXiv (Cornell University) 2024-12-09
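A hedged sketch of one ingredient mentioned above, plain magnitude pruning; the paper's test-time, data-free setting adds considerably more. The layer shape and sparsity level are illustrative assumptions.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Return a copy of `weights` with the smallest-magnitude entries zeroed.

    `sparsity` is the target fraction of weights to remove (0.0 to 1.0).
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))          # a toy weight matrix
W_pruned = magnitude_prune(W, 0.5)   # prune half the weights
print((W_pruned == 0).mean())
```

Fine-tuning after such a step normally needs data; the paper's question is what to do when only unlabeled test-time samples, possibly under covariate shift, are available.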

Artificial Intelligence (AI) and its data-centric branch of machine learning (ML) have greatly evolved over the last few decades. However, as AI is used increasingly in real-world use cases, the interpretability of and accessibility to AI systems have become major research areas. The lack of interpretability of ML-based systems is a hindrance to the widespread adoption of these powerful algorithms. This is due to many reasons, including ethical and regulatory concerns, which have resulted in poorer adoption in some domains. The recent past has seen a surge of research on interpretable ML. Generally,...

10.4038/jnsfsr.v50i0.11249 article EN cc-by-nd Journal of the National Science Foundation of Sri Lanka 2022-11-10

The recent renaissance in generative models, driven primarily by the advent of diffusion models and iterative improvements in GAN methods, has enabled many creative applications. However, each advancement is also accompanied by a rise in the potential for misuse. In the arena of deepfake generation, this is a key societal issue. In particular, the ability to modify segments of videos using such techniques creates a new paradigm of deepfakes which are mostly real videos altered slightly to distort the truth. This has been under-explored by current...

10.48550/arxiv.2305.06564 preprint EN cc-by arXiv (Cornell University) 2023-01-01

Catastrophic forgetting, the loss of old knowledge upon acquiring new knowledge, is a pitfall faced by deep neural networks in real-world applications. Many prevailing solutions to this problem rely on storing exemplars (previously encountered data), which may not be feasible in applications with memory limitations or privacy constraints. Therefore, the recent focus has been on Non-Exemplar based Class Incremental Learning (NECIL), where a model incrementally learns about classes without using any past...

10.48550/arxiv.2308.09297 preprint EN cc-by-nc-sa arXiv (Cornell University) 2023-01-01

Longitudinal studies that continuously generate data enable the capture of temporal variations in experimentally observed parameters, facilitating the interpretation of results in a time-aware manner. We propose IL-VIS (Incrementally Learned Visualizer), a new machine learning pipeline that incrementally learns and visualizes a progression trajectory representing longitudinal changes in such studies. At each sampling time point of an experiment, it generates a snapshot of the process on the data gathered thus far, a feature that is beyond the reach...

10.1101/2022.11.25.515889 preprint EN cc-by-nc-nd bioRxiv (Cold Spring Harbor Laboratory) 2022-11-25