Optimal feature aggregation and combination for two-dimensional ensemble feature selection

Machmud Roby Alhamidi, Wisnu Jatmiko

Research output: Contribution to journal › Article › peer-review

12 Citations (Scopus)


Feature selection reduces the features of data so that, when a classification algorithm runs, it produces better accuracy. In general, conventional feature selection is quite unstable when faced with changing data characteristics, and implementing an individual feature selection method can be inefficient in some cases. Ensemble feature selection exists to overcome this problem. However, despite its advantages, ensemble feature selection still faces issues such as stability, thresholding, and feature aggregation. We propose a new framework to deal with stability and feature aggregation, and we also applied an automatic threshold to evaluate its efficiency. The results showed that the proposed method consistently produced the best performance in both accuracy and feature reduction: it improved accuracy over other methods by 0.5-14% and reduced 50% more features than other methods. The stability of the proposed method was also excellent, with an average of 0.9. However, applying the automatic threshold brought no beneficial improvement over omitting it. Overall, the proposed method showed excellent performance compared with previous work and standard ReliefF.
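To illustrate the general idea behind ensemble feature selection with rank aggregation, the following is a minimal sketch, not the paper's actual method: features are scored on bootstrap samples by a simple filter (absolute Pearson correlation with the label stands in here for ReliefF, which the paper builds on), the per-bootstrap rankings are aggregated by mean rank, and a fixed top-k cutoff plays the role of the threshold. All function and parameter names are illustrative.

```python
import numpy as np

def ensemble_feature_selection(X, y, n_bootstraps=10, top_k=5, seed=0):
    """Homogeneous ensemble feature selection sketch:
    score features on bootstrap samples, then aggregate by mean rank."""
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    ranks = np.zeros((n_bootstraps, n_features))
    for b in range(n_bootstraps):
        idx = rng.integers(0, n_samples, n_samples)  # bootstrap resample
        Xb, yb = X[idx], y[idx]
        # Simple filter score: |Pearson correlation| with the label
        # (an illustrative stand-in for ReliefF).
        scores = np.abs([np.corrcoef(Xb[:, j], yb)[0, 1]
                         for j in range(n_features)])
        scores = np.nan_to_num(scores)
        # Rank 0 = best-scoring feature in this bootstrap.
        ranks[b] = np.argsort(np.argsort(-scores))
    mean_rank = ranks.mean(axis=0)        # aggregation step
    return np.argsort(mean_rank)[:top_k]  # fixed top-k threshold

# Toy data: only features 0 and 1 carry the class signal.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
selected = ensemble_feature_selection(X, y, top_k=2)
```

Aggregating ranks across bootstraps, rather than trusting a single run of the filter, is what gives ensemble feature selection its stability under changing data characteristics.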

Original language: English
Article number: 38
Journal: Information (Switzerland)
Issue number: 1
Publication status: Published - 1 Jan 2020


Keywords

  • Ensemble feature selection
  • Feature aggregation
  • Stability
  • Threshold


