Feature selection using differential evolution for microarray data classification
Keywords: Machine learning, Decision tree, Differential evolution, Classification, Microarray data, Random forest
DOI: 10.1007/s43926-023-00042-5
Publication Date: 2023-10-05
ABSTRACT
Microarray datasets are very high-dimensional and typically contain noise and redundancy. Their central difficulty is that the number of features far exceeds the number of samples (the number of columns exceeds the number of rows), which adversely affects classifier performance. A robust technique is therefore required to extract precise information from microarray data, which play a critical role in detecting diseases such as cancer and tumors. This is where feature selection comes into play. Feature selection (FS) has recently gained significant importance as a data-preparation step, particularly for high-dimensional data: it is preferable to solve a classification problem with fewer features while maintaining high accuracy, since not all features are needed to achieve that goal. The primary objective of feature selection is to identify an optimal subset of features. In this work we employ the Differential Evolution (DE) algorithm, a population-based stochastic search approach that is widely used across scientific and technical domains to solve optimization problems in continuous spaces. We combine DE with three classification algorithms: Random Forest (RF), Decision Tree (DT), and Logistic Regression (LR). Our analysis compares the accuracy achieved by each algorithmic model on each dataset, as well as the fitness error of each model. The results indicate that classification with feature selection outperformed classification without it.
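EXAMPLE
The abstract describes a wrapper approach: DE searches over continuous vectors, each vector is thresholded into a feature subset, and the subset's cross-validated classification error serves as the fitness. Below is a minimal sketch of such a loop, using the classic DE/rand/1/bin scheme with a Random Forest as the wrapped classifier. The synthetic dataset and all parameter values (population size, F, CR, number of generations, the 0.5 threshold) are illustrative assumptions, not the authors' settings.

```python
# Hypothetical sketch of DE-based wrapper feature selection (DE/rand/1/bin),
# not the paper's exact implementation. Dataset, DE parameters, and the
# RandomForest settings are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for a microarray dataset: few samples, many features.
X, y = make_classification(n_samples=60, n_features=500, n_informative=15,
                           random_state=0)

def fitness(vector):
    """Fitness error = 1 - CV accuracy on the selected features.
    A continuous gene > 0.5 means 'keep this feature'."""
    mask = vector > 0.5
    if not mask.any():                  # empty subsets are infeasible
        return 1.0
    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    acc = cross_val_score(clf, X[:, mask], y, cv=3).mean()
    return 1.0 - acc

# DE over continuous vectors in [0, 1]^dim.
pop_size, n_gen, F, CR = 20, 30, 0.8, 0.9
dim = X.shape[1]
pop = rng.random((pop_size, dim))
errs = np.array([fitness(ind) for ind in pop])

for gen in range(n_gen):
    for i in range(pop_size):
        # Pick three distinct individuals other than i.
        a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                 3, replace=False)]
        mutant = np.clip(a + F * (b - c), 0.0, 1.0)   # mutation
        cross = rng.random(dim) < CR                  # binomial crossover
        cross[rng.integers(dim)] = True               # ensure one gene crosses
        trial = np.where(cross, mutant, pop[i])
        e = fitness(trial)
        if e <= errs[i]:                              # greedy selection
            pop[i], errs[i] = trial, e

best = pop[errs.argmin()]
print(f"best fitness error: {errs.min():.3f}, "
      f"features selected: {(best > 0.5).sum()} / {dim}")
```

Thresholding at 0.5 is one common way to map DE's continuous search space onto a binary feature mask; the greedy selection step keeps a trial vector only when its fitness error does not worsen, so the best-found subset never degrades across generations.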