- # Jx-WFST : A Wrapper Feature Selection Toolbox
+ # A Newer Wrapper Feature Selection Toolbox, improved from Jx-WFST
- [![View Wrapper Feature Selection Toolbox on File Exchange](https://www.mathworks.com/matlabcentral/images/matlab-file-exchange.svg)](https://www.mathworks.com/matlabcentral/fileexchange/84139-wrapper-feature-selection-toolbox)
- [![License](https://img.shields.io/badge/license-BSD_3-yellow.svg)](https://github.com/JingweiToo/Wrapper-Feature-Selection-Toolbox/blob/main/LICENSE)
- [![GitHub release](https://img.shields.io/badge/release-1.1.1-green.svg)](https://github.com/JingweiToo/Wrapper-Feature-Selection-Toolbox)
+ [![View Wrapper Feature Selection Toolbox on File Exchange](https://www.mathworks.com/matlabcentral/images/matlab-file-exchange.svg)](https://ww2.mathworks.cn/matlabcentral/fileexchange/178129-newer-wrapper-feature-selection-toolbox)
+ [![License](https://img.shields.io/badge/license-BSD_3-yellow.svg)](https://github.com/Ruiruiz30/Newer-Wrapper-Feature-Selection-Toolbox/blob/main/LICENSE)
+ [![GitHub release](https://img.shields.io/badge/release-2.0.0-green.svg)](https://github.com/Ruiruiz30/Newer-Wrapper-Feature-Selection-Toolbox/blob/main/README.md)

---
> "Toward Talent Scientist: Sharing and Learning Together"
> --- [Jingwei Too](https://jingweitoo.wordpress.com/)
---
-
+ ---
+ > "The purpose of scientific research is development, not SCI."
+ > --- [Jinrui Zhang]()
+ ---
![Wheel](https://www.mathworks.com/matlabcentral/mlc-downloads/downloads/5dc2bdb4-ce4b-4e0e-bd6e-0237ff6ddde1/f9a9e760-64b9-4e31-9903-dffcabdf8be6/images/1607601518.JPG)

## Introduction
* This toolbox offers more than 40 wrapper feature selection methods
- * The `A_Main` file provides the examples of how to apply these methods on benchmark dataset
+ * In this toolbox, you can run multiple algorithms together and compare their performance
+ * Compared with v1.0, this toolbox adds the calculation of F-measure and Precision (see the sketch after this list)
* Source code of these methods is written based on pseudocode & paper

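+ For reference, here is a minimal sketch of how Precision and F-measure are typically computed from predicted and true class labels; the variables `pred` and `truth` are hypothetical placeholders, and the toolbox's own implementation may differ in detail.
+ ```code
+ % pred  : predicted labels from the classifier (hypothetical variable)
+ % truth : ground-truth labels of the validation set (hypothetical variable)
+ TP = sum(pred == 1 & truth == 1);   % true positives
+ FP = sum(pred == 1 & truth == 0);   % false positives
+ FN = sum(pred == 0 & truth == 1);   % false negatives
+ Precision = TP / (TP + FP);
+ Recall    = TP / (TP + FN);
+ Fmeasure  = 2 * Precision * Recall / (Precision + Recall);
+ ```
+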
@@ -33,7 +37,14 @@ FS = jfs('pso',feat,label,opts);
```code
FS = jfs('sma',feat,label,opts);
```
-
+ * `jfs` contains the names of all the algorithms available in the toolbox and points each name to its corresponding function (see the sketch after this list)
+ * You can specify which algorithms to run together in `get_Algorithm`
+ * You can specify which datasets to process in `get_Dataset`
+ * With `A_get_one_data`, you can run one algorithm for feature selection on multiple datasets and visualize results such as accuracy and feature size
+ * With `A_get_all_data`, you can run multiple algorithms simultaneously for feature selection on multiple datasets and compute fitness, accuracy, F-measure, Precision, etc., along with the Friedman ranking
+ * With `A_Fitness_Wilcoxon`, you can perform the Wilcoxon rank-sum test on the results of multiple algorithms
+ * The parameters in the code are documented with comments
+
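+ The driver scripts above all build on the same underlying call to `jfs`. As a quick reference, here is a minimal sketch of that call, adapted from the examples shipped with the original toolbox; the exact fields of `opts` you need depend on the chosen method (see Notation below).
+ ```code
+ % Common parameter settings (defaults are used for any field you omit)
+ opts.k = 5;      % number of k in K-nearest neighbor
+ opts.N = 10;     % number of solutions
+ opts.T = 100;    % maximum number of iterations
+
+ % Load a benchmark dataset and hold out 20% of it for validation
+ load ionosphere.mat;
+ HO = cvpartition(label,'HoldOut',0.2);
+ opts.Model = HO;
+
+ % Perform feature selection ( e.g., with PSO )
+ FS = jfs('pso',feat,label,opts);
+
+ % Selected feature indices, validation accuracy, and convergence curve
+ sf_idx = FS.sf;
+ Acc    = jknn(feat(:,sf_idx),label,opts);
+ plot(FS.c); grid on;
+ ```
+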
## Input
* *`feat`*  : feature vector matrix ( Instance *x* Features )
* *`label`* : label matrix ( Instance *x* 1 )
@@ -55,113 +66,7 @@ FS = jfs('sma',feat,label,opts);
## Notation
Some methods have their own specific parameters ( example: PSO, GA, DE ); if you do not set them, they will be defined by the default settings
- * you may open the *m.file* to view or change the parameters
* you may use *`opts`* to set the parameters of a method ( see the sketch below or refer [here](/Description.md) )
- * you may also change the fitness function in the `jFitnessFunction` file
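+ For instance, a minimal sketch of setting the PSO-specific parameters through *`opts`*, with values taken from the example that shipped with the original toolbox; this continues from the quick-start sketch above, and any field you omit falls back to its default.
+ ```code
+ % Parameters specific to PSO
+ opts.c1 = 2;    % cognitive factor
+ opts.c2 = 2;    % social factor
+ opts.w  = 0.9;  % inertia weight
+
+ FS = jfs('pso',feat,label,opts);
+ ```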
-
-
- ### Example 1 : Particle Swarm Optimization ( PSO )
- ```code
- % Common parameter settings
- opts.k = 5; % Number of k in K-nearest neighbor
- opts.N = 10; % number of solutions
- opts.T = 100; % maximum number of iterations
- % Parameters of PSO
- opts.c1 = 2;
- opts.c2 = 2;
- opts.w = 0.9;
-
- % Load dataset
- load ionosphere.mat;
-
- % Ratio of validation data
- ho = 0.2;
- % Divide data into training and validation sets
- HO = cvpartition(label,'HoldOut',ho);
- opts.Model = HO;
-
- % Perform feature selection
- FS = jfs('pso',feat,label,opts);
-
- % Define index of selected features
- sf_idx = FS.sf;
-
- % Accuracy
- Acc = jknn(feat(:,sf_idx),label,opts);
-
- % Plot convergence
- plot(FS.c); grid on;
- xlabel('Number of Iterations');
- ylabel('Fitness Value');
- title('PSO');
- ```
-
- ### Example 2 : Slime Mould Algorithm ( SMA )
- ```code
- % Common parameter settings
- opts.k = 5; % Number of k in K-nearest neighbor
- opts.N = 10; % number of solutions
- opts.T = 100; % maximum number of iterations
-
- % Load dataset
- load ionosphere.mat;
-
- % Ratio of validation data
- ho = 0.2;
- % Divide data into training and validation sets
- HO = cvpartition(label,'HoldOut',ho);
- opts.Model = HO;
-
- % Perform feature selection
- FS = jfs('sma',feat,label,opts);
-
- % Define index of selected features
- sf_idx = FS.sf;
-
- % Accuracy
- Acc = jknn(feat(:,sf_idx),label,opts);
-
- % Plot convergence
- plot(FS.c); grid on;
- xlabel('Number of Iterations');
- ylabel('Fitness Value');
- title('SMA');
- ```
-
- ### Example 3 : Whale Optimization Algorithm ( WOA )
- ```code
- % Common parameter settings
- opts.k = 5; % Number of k in K-nearest neighbor
- opts.N = 10; % number of solutions
- opts.T = 100; % maximum number of iterations
- % Parameter of WOA
- opts.b = 1;
-
- % Load dataset
- load ionosphere.mat;
-
- % Ratio of validation data
- ho = 0.2;
- % Divide data into training and validation sets
- HO = cvpartition(label,'HoldOut',ho);
- opts.Model = HO;
-
- % Perform feature selection
- FS = jfs('woa',feat,label,opts);
-
- % Define index of selected features
- sf_idx = FS.sf;
-
- % Accuracy
- Acc = jknn(feat(:,sf_idx),label,opts);
-
- % Plot convergence
- plot(FS.c); grid on;
- xlabel('Number of Iterations');
- ylabel('Fitness Value');
- title('WOA');
- ```
-

## Requirement
@@ -178,6 +83,9 @@ title('WOA');
| No. | Abbreviation | Name                                                                                          | Year | Extra Parameters |
|-----|--------------|-----------------------------------------------------------------------------------------------|------|------------------|
+ | 46  | `'plo'`      | Polar Lights Optimizer                                                                        | 2024 | Yes              |
+ | 45  | `'coa'`      | Crayfish Optimization Algorithm                                                               | 2024 | Yes              |
+ | 44  | `'rime'`     | RIME Optimization Algorithm                                                                   | 2023 | Yes              |
| 43  | `'mpa'`      | [Marine Predators Algorithm](/Description.md#marine-predators-algorithm-mpa)                  | 2020 | Yes              |
| 42  | `'gndo'`     | Generalized Normal Distribution Optimization                                                  | 2020 | No               |
| 41  | `'sma'`      | Slime Mould Algorithm                                                                         | 2020 | No               |