Research Article

Particle swarm optimization based feature selection using factorial design

Year 2024, Volume: 53 Issue: 3, 879 - 896, 27.06.2024
https://doi.org/10.15672/hujms.1346686

Abstract

Feature selection, a common and central problem in current scientific research, is both a key data preprocessing technique and a combinatorial optimization task. Its goal is to select a subset of informative and relevant features from the original feature set; processing the data with a feature selection strategy before the learning process therefore improves performance on the classification task. Particle swarm optimization, a metaheuristic algorithm that keeps computational complexity in check and includes strategies for escaping local optima, can solve the feature selection problem quickly and with satisfactory classification accuracy. The critical parameters of binary particle swarm optimization (the inertia weight, the transfer function, the threshold value, and the swarm size) directly affect the algorithm's performance in feature selection, yet the literature determines them through separate, ad hoc trial-and-error approaches. Unlike these approaches, this paper evaluates all binary particle swarm optimization parameters jointly with the help of a statistically based factorial design approach and thereby obtains scientifically grounded findings. The results show that the threshold and the transfer function have a statistically significant effect on the performance of the binary particle swarm optimization algorithm.
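The parameters the abstract names can be made concrete with a minimal sketch, assuming illustrative values and a toy objective that are not taken from the paper: a binary PSO with an S-shaped (sigmoid) transfer function and the standard stochastic bit-update rule, followed by a 2^4 full-factorial grid over the four parameters under study.

```python
import itertools
import math
import random

def sigmoid(v):
    # S-shaped transfer function: maps a velocity to a probability in (0, 1).
    return 1.0 / (1.0 + math.exp(-v))

def bpso_feature_selection(fitness, n_features, swarm_size=20, iters=50,
                           w=0.9, c1=2.0, c2=2.0, seed=0):
    """Minimal binary PSO sketch: each particle is a 0/1 mask over the features."""
    rng = random.Random(seed)
    pos = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(swarm_size)]
    vel = [[0.0] * n_features for _ in range(swarm_size)]
    pbest = [p[:] for p in pos]                   # personal best positions
    pbest_fit = [fitness(p) for p in pos]
    g = max(range(swarm_size), key=lambda i: pbest_fit[i])
    gbest, gbest_fit = pbest[g][:], pbest_fit[g]  # global best
    for _ in range(iters):
        for i in range(swarm_size):
            for d in range(n_features):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Stochastic update: the bit is set when the transfer-function
                # output exceeds a uniform random draw; threshold-based variants
                # compare it to a fixed cut-off instead.
                pos[i][d] = 1 if sigmoid(vel[i][d]) > rng.random() else 0
            f = fitness(pos[i])
            if f > pbest_fit[i]:
                pbest[i], pbest_fit[i] = pos[i][:], f
                if f > gbest_fit:
                    gbest, gbest_fit = pos[i][:], f
    return gbest, gbest_fit

# Toy objective (hypothetical): reward masks that match a known-relevant subset.
target = [1] * 5 + [0] * 5
best, score = bpso_feature_selection(
    lambda m: sum(a == b for a, b in zip(m, target)), n_features=10)

# 2^4 full-factorial grid over the four parameters the paper studies
# (the levels here are illustrative, not the paper's); each row would
# parameterize one experimental run of the algorithm above.
levels = {"w": [0.4, 0.9], "transfer": ["S-shaped", "V-shaped"],
          "threshold": [0.5, 0.6], "swarm_size": [10, 40]}
runs = [dict(zip(levels, combo)) for combo in itertools.product(*levels.values())]
```

The factorial grid is what lets main effects and interactions of all four parameters be estimated jointly, instead of tuning one parameter at a time.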

References

  • [1] B. Alatas, E. Akin and A.B. Ozer, Chaos embedded particle swarm optimization algorithms, Chaos, Solitons Fractals 40 (4), 1715-1734, 2009.
  • [2] B. Alptekin, S. Acitas, B. Senoglu and C.H. Aladag, Statistical determination of significant particle swarm optimization parameters: the case of Weibull distribution, Soft Comput. 26, 12623-12634, 2022.
  • [3] H.M. Alshamlan, G.H. Badr and Y.A. Alohali, Genetic Bee Colony (GBC) algorithm: A new gene selection method for microarray cancer classification, Comput. Biol. Chem. 56, 49-60, 2015.
  • [4] J.C. Bansal and K. Deep, A modified binary particle swarm optimization for Knapsack problems, Appl. Math. Comput. 22, 11042-11061, 2012.
  • [5] R.E. Bellman, Dynamic Programming, Princeton University Press, Princeton, NJ, 1957.
  • [6] R.E. Bellman, Dynamic Programming, Science 153 (3731), 34-37, 1966.
  • [7] B. Bonev, F. Escolano, D. Giorgi and S. Biasotti, Hybrid variable neighborhood search and simulated annealing algorithm to estimate the three parameters of the Weibull distribution, Comput. Vision Image Understanding 117 (3), 214-228, 2013.
  • [8] R. Brits, A.P. Engelbrecht and F. Van Den Bergh, A niching particle swarm optimizer, Proceedings of the Conference on Simulated Evolution and Learning, 692-696, 2002.
  • [9] K. Chen, F.Y. Zhou and X.F. Yuan, Hybrid particle swarm optimization with spiral-shaped mechanism for feature selection, Expert Syst. Appl. 128, 140-156, 2019.
  • [10] Y. Chen, J. Liu, J. Zhu and Z. Wang, An improved binary particle swarm optimization combining V-shaped and U-shaped transfer function, Evol. Intell. 16, 1653-1666, 2023.
  • [11] L.Y. Chuang, C.H. Yang and J.C. Li, Chaotic maps based on binary particle swarm optimization for feature selection, Appl. Soft Comput. 11 (1), 239-248, 2011.
  • [12] M. Clerc and J. Kennedy, The particle swarm - explosion, stability, and convergence in a multidimensional complex space, IEEE Trans. Evol. Comput. 6 (1), 58-73, 2002.
  • [13] R.C. Eberhart and Y. Shi, Particle swarm optimization: Developments, applications and resources, Proceedings of the 2001 Congress on Evolutionary Computation 1, 81-86, 2001.
  • [14] A.E. Eiben, R. Hinterding and Z. Michalewicz, Parameter control in evolutionary algorithms, IEEE Trans. Evol. Comput. 3 (2), 124-141, 1999.
  • [15] P.A. Estévez, M. Tesmer, C.A. Perez and J.M. Zurada, Normalized mutual information feature selection, IEEE Trans. Neural Networks 20 (2), 189-201, 2009.
  • [16] A.A. Ewees, M.A. El Aziz and A.E. Hassanien, Chaotic multi-verse optimizer-based feature selection, Neural Comput. Appl. 31, 991-1006, 2019.
  • [17] A.J. Ferreira and M.A.T. Figueiredo, An unsupervised approach to feature discretization and selection, Pattern Recognit. 45 (9), 3048-3060, 2012.
  • [18] R.A. Fisher, The design of experiments, Oliver and Boyd, 1935.
  • [19] Q. Gu, Z. Li and J. Han, Generalized fisher score for feature selection, UAI’11: Proceedings of the Twenty-Seventh Conference on Uncertainty in Artificial Intelligence, 266-273, 2011.
  • [20] S.S. Guo, J.S. Wang and M.W. Guo, Z-Shaped Transfer Functions for Binary Particle Swarm Optimization Algorithm, Comput. Intell. Neurosci. 2020, 6502807, 2020.
  • [21] X. He, D. Cai and P. Niyogi, Laplacian Score for feature selection, NIPS’05: Proceedings of the 18th International Conference on Neural Information Processing Systems 18, 507-514, 2005.
  • [22] A.E. Hegazy, M.A. Makhlouf and G.S. El-Tawel, Feature Selection Using Chaotic Salp Swarm Algorithm for Data Classification, Arabian J. Sci. Eng. 44, 3801-3816, 2019.
  • [23] C.L. Huang and J.F. Dun, A distributed PSO-SVM hybrid system with feature selection and parameter optimization, Appl. Soft Comput. 8 (4), 1381-1391, 2008.
  • [24] M.J. Islam, X. Li and Y. Mei, A time-varying transfer function for balancing the exploration and exploitation ability of a binary PSO, Appl. Soft Comput. 59, 182-196, 2017.
  • [25] I. Jain, V.K. Jain and R. Jain, Correlation feature selection based improved-Binary Particle Swarm Optimization for gene selection and cancer classification, Appl. Soft Comput. 62, 203-215, 2018.
  • [26] H. Jiang, C. Kwong, Z. Chen and Y.C. Ysim, Chaos particle swarm optimization and TS fuzzy modeling approaches to constrained predictive control, Expert Syst. Appl. 39 (1), 194-201, 2012.
  • [27] J. Kennedy and R. Eberhart, Particle swarm optimization, Proceedings of ICNN’95 - International Conference on Neural Networks 4, 1942-1948, 1995.
  • [28] J. Kennedy and R.C. Eberhart, A discrete binary version of the Particle Swarm Optimization, 1997 IEEE International Conference on Systems, Man, and Cybernetics. Computational Cybernetics and Simulation 5, 4104-4108, 1997.
  • [29] H. Khosravi, B. Amiri, N. Yazdanjue and V. Babaiyan, An improved group teaching optimization algorithm based on local search and chaotic map for feature selection in high-dimensional data, Expert Syst. Appl. 204, 117493, 2022.
  • [30] M. Labani, P. Moradi, F. Ahmadizar and M. Jalili, A novel multivariate filter method for feature selection in text classification problems, Eng. Appl. Artif. Intell. 70, 25-37, 2018.
  • [31] S. Lee, S. Soak, S. Oh, W. Pedrycz and M. Jeon, Modified binary particle swarm optimization, Prog. Nat. Sci. 18 (9), 1161-1166, 2008.
  • [32] M. Mafarja, I. Aljarah, A.A. Heidari, H. Faris, P. Fournier-Viger, X. Li and S. Mirjalili, Binary dragonfly optimization for feature selection using time-varying transfer functions, Knowledge-Based Syst. 161, 185-204, 2018.
  • [33] S. Mirjalili and A. Lewis, S-shaped versus V-shaped transfer functions for binary Particle Swarm Optimization, Swarm Evol. Comput. 9, 1-14, 2013.
  • [34] S. Mirjalili, H. Zhang, S. Mirjalili, S. Chalup and N. Nomani, A Novel U-Shaped Transfer Function for Binary Particle Swarm Optimisation, In Soft Computing for Problem Solving 2019: Proceedings of SocProS 2019 1, 241-259, 2020.
  • [35] D.C. Montgomery, Design and analysis of experiments, Wiley, New Jersey, 2013.
  • [36] P. Moradi and M. Gholampour, A hybrid particle swarm optimization for feature subset selection by integrating a novel local search strategy, Appl. Soft Comput. 43, 117-130, 2016.
  • [37] P. Moradi and M. Gholampour, A novel particle swarm optimization algorithm with adaptive inertia weight, Appl. Soft Comput. 11 (4), 3658-3670, 2011.
  • [38] H. Peng, F. Long and C. Ding, Feature selection based on mutual information: Criteria of Max-Dependency, Max-Relevance, and Min-Redundancy, IEEE Trans. Pattern Anal. Mach. Intell. 27 (8), 1226-1238, 2005.
  • [39] Y. Prasad, K.K. Biswas and M. Hanmandlu, A recursive PSO scheme for gene selection in microarray data, Appl. Soft Comput. 71, 213-225, 2018.
  • [40] O.S. Qasim and Z.Y. Algamal, Feature selection using particle swarm optimization-based logistic regression model, Chemom. Intell. Lab. Syst. 182, 41-46, 2018.
  • [41] O.S. Qasim, N.A. Al-Thanoon and Z.Y. Algamal, Feature selection based on chaotic binary black hole algorithm for data classification, Chemom. Intell. Lab. Syst. 204, 104104, 2020.
  • [42] L.E. Raileanu and K. Stoffel, Theoretical comparison between the Gini Index and Information Gain criteria, Ann. Math. Artif. Intell. 41, 77-93, 2004.
  • [43] E. Rashedi, H. Nezamabadi-Pour and S. Saryazdi, A simultaneous feature adaptation and feature selection method for content-based image retrieval systems, Knowledge-Based Syst. 39, 85-94, 2013.
  • [44] M. Rostami, S. Forouzandeh, K. Berahmand and M. Soltani, Integration of multiobjective PSO based feature selection and node centrality for medical datasets, Genomics 112 (6), 4370-4384, 2020.
  • [45] M. Rostami, K. Berahmand, E. Nasiri and S. Forouzandeh, Review of swarm intelligence-based feature selection methods, Eng. Appl. Artif. Intell. 100, 104210, 2021.
  • [46] S. Saremi, S. Mirjalili and A. Lewis, Biogeography-based optimisation with chaos, Neural Comput. Appl. 25, 1077-1097, 2014.
  • [47] G.I. Sayed, A. Darwish and A.E. Hassanien, A New Chaotic Whale Optimization Algorithm for Features Selection, J. Classif. 35, 300-344, 2018.
  • [48] G.I. Sayed, G. Khoriba and M.H. Haggag, A novel chaotic salp swarm algorithm for global optimization and feature selection, Appl. Intell. 48, 3462-3481, 2018.
  • [49] G.I. Sayed, A. Tharwat and A.E. Hassanien, Chaotic dragonfly algorithm: an improved metaheuristic algorithm for feature selection, Appl. Intell. 49, 188-205, 2019.
  • [50] M. Shafipour, A. Rashno and S. Fadaeii, Particle distance rank feature selection by particle swarm optimization, Expert Syst. Appl. 185, 115620, 2021.
  • [51] Y. Shi and R. Eberhart, A modified particle swarm optimizer, 1998 IEEE International Conference on Evolutionary Computation Proceedings. IEEE World Congress on Computational Intelligence, 69-73, 1998.
  • [52] U. Singh and S.N. Singh, A new optimal feature selection scheme for classification of power quality disturbances based on ant colony framework, Appl. Soft Comput. 74, 216-225, 2019.
  • [53] M. Tahir, A. Tubaishat, F. Al-Obeidat, B. Shah, Z. Halim and M. Waqas, A novel binary chaotic genetic algorithm for feature selection and its utility in affective computing and healthcare, Neural Comput. Appl. 34, 11453-11474, 2022.
  • [54] X. Tang, Y. Dai and Y. Xiang, Feature selection based on feature interactions with application to text categorization, Expert Syst. Appl. 120, 207-216, 2019.
  • [55] A. Tharwat and A.E. Hassanien, Chaotic antlion algorithm for parameter optimization of support vector machine, Appl. Intell. 48, 670-686, 2018.
  • [56] T.O. Ting, M.V.C. Rao and C.K. Loo, A novel approach for unit commitment problem via an effective hybrid particle swarm optimization, IEEE Trans. Power Syst. 21 (1), 411-418, 2006.
  • [57] J. Too and A.R. Abdullah, Chaotic Atom Search Optimization for Feature Selection, Arabian J. Sci. Eng. 45, 6063-6079, 2020.
  • [58] A.M. Unler and R.B. Chinnam, mr2PSO: A maximum relevance minimum redundancy feature selection method based on swarm intelligence for support vector machine classification, Inf. Sci. 181 (20), 4625-4641, 2011.
  • [59] F. Van Den Bergh and A.P. Engelbrecht, Effects of swarm size on cooperative particle swarm optimisers, GECCO’01: Proceedings of the 3rd Annual Conference on Genetic and Evolutionary Computation, 892-899, 2001.
  • [60] V.N. Vapnik, The Nature of Statistical Learning Theory, Springer, 1995.
  • [61] L. Wang, X. Wang, J. Fu and L. Zhen, A novel probability binary particle swarm optimization algorithm and its application, J. Softw. 3 (9), 28-35, 2008.
  • [62] H. Wang, Y. Zhang, J. Zhang, T. Li and L. Peng, A factor graph model for unsupervised feature selection, Inf. Sci. 480, 144-159, 2019.
  • [63] R.A. Welikala, M.M. Fraz, J. Dehmeshki, A. Hoppe, V. Tah, S. Mann, T.H. Williamson and S.A. Barman, Genetic algorithm based feature selection combined with dual classification for the automated detection of proliferative diabetic retinopathy, Comput. Med. Imaging Graphics 43, 64-77, 2015.
  • [64] Y. Xu, G. Jones, J. Li, B. Wang and C. Sun, Study on mutual information-based feature selection for text categorization, J. Comput. Inf. Syst. 3 (3), 1007-1012, 2007.
  • [65] X. Xu, H. Rong, M. Trovati, M. Liptrott and N. Bessis, CS-PSO: chaotic particle swarm optimization algorithm for solving combinatorial optimization problems, Soft Comput. 22, 783-795, 2018.
  • [66] B. Xue, M. Zhang and W.N. Browne, Particle swarm optimization for feature selection in classification: A multi-objective approach, IEEE Trans. Cybern. 43 (6), 1656-1671, 2013.
  • [67] B. Xue, M. Zhang and W.N. Browne, Particle swarm optimisation for feature selection in classification: Novel initialisation and updating mechanisms, Appl. Soft Comput. 18, 261-276, 2014.
  • [68] B. Xue, M. Zhang, W.N. Browne and X. Yao, A Survey on Evolutionary Computation Approaches to Feature Selection, IEEE Trans. Evol. Comput. 20 (4), 606-626, 2016.
  • [69] Y. Xue, T. Tang, W. Pang and A.X. Liu, Self-adaptive parameter and strategy based particle swarm optimization for large-scale feature selection problems with multiple classifiers, Appl. Soft Comput. 88, 106031, 2020.
  • [70] M. Yamada, W. Jitkrittum, L. Sigal, E.P. Xing and M. Sugiyama, High-Dimensional feature selection by feature-Wise kernelized lasso, Neural Comput. 26 (1), 185-207, 2014.
  • [71] F. Yates, The design and analysis of factorial experiments, Imperial Bureau of Soil Science, 1937.
There are 71 citations in total.

Details

Primary Language English
Subjects Statistical Experiment Design, Statistical Data Science, Operation, Statistics (Other)
Journal Section Statistics
Authors

Emre Koçak 0000-0001-6686-9671

H. Hasan Örkcü 0000-0002-2888-9580

Early Pub Date June 11, 2024
Publication Date June 27, 2024
Published in Issue Year 2024 Volume: 53 Issue: 3

Cite

APA Koçak, E., & Örkcü, H. H. (2024). Particle swarm optimization based feature selection using factorial design. Hacettepe Journal of Mathematics and Statistics, 53(3), 879-896. https://doi.org/10.15672/hujms.1346686
AMA Koçak E, Örkcü HH. Particle swarm optimization based feature selection using factorial design. Hacettepe Journal of Mathematics and Statistics. June 2024;53(3):879-896. doi:10.15672/hujms.1346686
Chicago Koçak, Emre, and H. Hasan Örkcü. “Particle Swarm Optimization Based Feature Selection Using Factorial Design”. Hacettepe Journal of Mathematics and Statistics 53, no. 3 (June 2024): 879-96. https://doi.org/10.15672/hujms.1346686.
EndNote Koçak E, Örkcü HH (June 1, 2024) Particle swarm optimization based feature selection using factorial design. Hacettepe Journal of Mathematics and Statistics 53 3 879–896.
IEEE E. Koçak and H. H. Örkcü, “Particle swarm optimization based feature selection using factorial design”, Hacettepe Journal of Mathematics and Statistics, vol. 53, no. 3, pp. 879–896, 2024, doi: 10.15672/hujms.1346686.
ISNAD Koçak, Emre - Örkcü, H. Hasan. “Particle Swarm Optimization Based Feature Selection Using Factorial Design”. Hacettepe Journal of Mathematics and Statistics 53/3 (June 2024), 879-896. https://doi.org/10.15672/hujms.1346686.
JAMA Koçak E, Örkcü HH. Particle swarm optimization based feature selection using factorial design. Hacettepe Journal of Mathematics and Statistics. 2024;53:879–896.
MLA Koçak, Emre and H. Hasan Örkcü. “Particle Swarm Optimization Based Feature Selection Using Factorial Design”. Hacettepe Journal of Mathematics and Statistics, vol. 53, no. 3, 2024, pp. 879-96, doi:10.15672/hujms.1346686.
Vancouver Koçak E, Örkcü HH. Particle swarm optimization based feature selection using factorial design. Hacettepe Journal of Mathematics and Statistics. 2024;53(3):879-96.