
Commit 072dc60

Merge tag '2.0.0' of github.com:Ruiruiz30/Newer-Wrapper-Feature-Selection-Toolbox
2 parents ad8b80c + 911c49d commit 072dc60

File tree: 2 files changed, +22 −113 lines

LICENSE

Lines changed: 1 addition & 0 deletions
@@ -1,5 +1,6 @@
 BSD 3-Clause License
 
+Copyright (c) 2024, Jinrui Zhang
 Copyright (c) 2020, Jingwei Too
 All rights reserved.
 

README.md

Lines changed: 21 additions & 113 deletions
@@ -1,21 +1,25 @@
-# Jx-WFST : A Wrapper Feature Selection Toolbox
+# A Newer Wrapper Feature Selection Toolbox, improved from Jx-WFST
 
-[![View Wrapper Feature Selection Toolbox on File Exchange](https://www.mathworks.com/matlabcentral/images/matlab-file-exchange.svg)](https://www.mathworks.com/matlabcentral/fileexchange/84139-wrapper-feature-selection-toolbox)
-[![License](https://img.shields.io/badge/license-BSD_3-yellow.svg)](https://github.com/JingweiToo/Wrapper-Feature-Selection-Toolbox/blob/main/LICENSE)
-[![GitHub release](https://img.shields.io/badge/release-1.1.1-green.svg)](https://github.com/JingweiToo/Wrapper-Feature-Selection-Toolbox)
+[![View Wrapper Feature Selection Toolbox on File Exchange](https://www.mathworks.com/matlabcentral/images/matlab-file-exchange.svg)](https://ww2.mathworks.cn/matlabcentral/fileexchange/178129-newer-wrapper-feature-selection-toolbox)
+[![License](https://img.shields.io/badge/license-BSD_3-yellow.svg)](https://github.com/Ruiruiz30/Newer-Wrapper-Feature-Selection-Toolbox/blob/main/LICENSE)
+[![GitHub release](https://img.shields.io/badge/release-2.0.0-green.svg)](https://github.com/Ruiruiz30/Newer-Wrapper-Feature-Selection-Toolbox/blob/main/README.md)
 
 ---
 > "Toward Talent Scientist: Sharing and Learning Together"
 > --- [Jingwei Too](https://jingweitoo.wordpress.com/)
 ---
-
+---
+> The purpose of scientific research is development, not SCI papers.
+> --- [Jinrui Zhang]()
+---
 ![Wheel](https://www.mathworks.com/matlabcentral/mlc-downloads/downloads/5dc2bdb4-ce4b-4e0e-bd6e-0237ff6ddde1/f9a9e760-64b9-4e31-9903-dffcabdf8be6/images/1607601518.JPG)
 
 
 ## Introduction
 
 * This toolbox offers more than 40 wrapper feature selection methods
-* The `A_Main` file provides examples of how to apply these methods to a benchmark dataset
+* In this toolbox, you can run multiple algorithms together and compare their performance
+* This toolbox adds the calculation of F-measure and Precision on top of v1.0
 * The source code of these methods is written based on the original pseudocode & papers
 
 
@@ -33,7 +37,14 @@ FS = jfs('pso',feat,label,opts);
 ```code
 FS = jfs('sma',feat,label,opts);
 ```
-
+* `jfs` contains the names of all the algorithms available in the toolbox and points to the corresponding functions
+* You can specify which algorithms to run at the same time in `get_Algorithm`
+* You can specify which datasets to process in `get_Dataset`
+* With `A_get_one_data`, you can use one algorithm to perform feature selection on multiple datasets and visualize data such as accuracy, feature size, etc.
+* With `A_get_all_data`, you can run multiple algorithms simultaneously for feature selection on multiple datasets and compute fitness, accuracy, F-measure, Precision, etc., along with the Friedman ranking
+* With `A_Fitness_Wilcoxon`, you can perform the Wilcoxon rank-sum test on multiple algorithms
+* The parameters in the code are commented
+
 ## Input
 * *`feat`* : feature vector matrix ( Instance *x* Features )
 * *`label`* : label matrix ( Instance *x* 1 )
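For context on the bullets added above, here is a minimal sketch (illustrative, not part of this commit's files) of running more than one algorithm and comparing the results. It only uses calls already documented in this README (`jfs`, `jknn`, `cvpartition`, the common `opts` fields); the batch drivers `get_Algorithm` and `A_get_all_data` are not shown in this diff, so the plain loop below simply stands in for what they automate.

```code
% Sketch: run two methods back-to-back and compare hold-out accuracy,
% mirroring the jfs/jknn pattern used elsewhere in this README.
algos = {'pso','sma'};            % abbreviations accepted by jfs
load ionosphere.mat;              % provides feat and label

opts.k = 5;                       % number of k in K-nearest neighbor
opts.N = 10;                      % number of solutions
opts.T = 100;                     % maximum number of iterations
HO = cvpartition(label,'HoldOut',0.2);
opts.Model = HO;                  % hold-out partition used by the wrappers

for i = 1:numel(algos)
    FS  = jfs(algos{i},feat,label,opts);      % perform feature selection
    Acc = jknn(feat(:,FS.sf),label,opts);     % accuracy on selected features
    fprintf('%s: accuracy = %.4f, features = %d\n',algos{i},Acc,numel(FS.sf));
end
```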
@@ -55,113 +66,7 @@ FS = jfs('sma',feat,label,opts);
 
 ## Notation
 Some methods have their own specific parameters ( e.g. PSO, GA, DE ); if you do not set them, the default settings are used
-* you may open the *m.file* to view or change the parameters
 * you may use *`opts`* to set the parameters of a method ( see example 1 or refer [here](/Description.md) )
-* you may also change the fitness function in the `jFitnessFunction` file
-
-
-### Example 1 : Particle Swarm Optimization ( PSO )
-```code
-% Common parameter settings
-opts.k = 5;     % Number of k in K-nearest neighbor
-opts.N = 10;    % number of solutions
-opts.T = 100;   % maximum number of iterations
-% Parameters of PSO
-opts.c1 = 2;
-opts.c2 = 2;
-opts.w  = 0.9;
-
-% Load dataset
-load ionosphere.mat;
-
-% Ratio of validation data
-ho = 0.2;
-% Divide data into training and validation sets
-HO = cvpartition(label,'HoldOut',ho);
-opts.Model = HO;
-
-% Perform feature selection
-FS = jfs('pso',feat,label,opts);
-
-% Define index of selected features
-sf_idx = FS.sf;
-
-% Accuracy
-Acc = jknn(feat(:,sf_idx),label,opts);
-
-% Plot convergence
-plot(FS.c); grid on;
-xlabel('Number of Iterations');
-ylabel('Fitness Value');
-title('PSO');
-```
-
-### Example 2 : Slime Mould Algorithm ( SMA )
-```code
-% Common parameter settings
-opts.k = 5;     % Number of k in K-nearest neighbor
-opts.N = 10;    % number of solutions
-opts.T = 100;   % maximum number of iterations
-
-% Load dataset
-load ionosphere.mat;
-
-% Ratio of validation data
-ho = 0.2;
-% Divide data into training and validation sets
-HO = cvpartition(label,'HoldOut',ho);
-opts.Model = HO;
-
-% Perform feature selection
-FS = jfs('sma',feat,label,opts);
-
-% Define index of selected features
-sf_idx = FS.sf;
-
-% Accuracy
-Acc = jknn(feat(:,sf_idx),label,opts);
-
-% Plot convergence
-plot(FS.c); grid on;
-xlabel('Number of Iterations');
-ylabel('Fitness Value');
-title('SMA');
-```
-
-### Example 3 : Whale Optimization Algorithm ( WOA )
-```code
-% Common parameter settings
-opts.k = 5;     % Number of k in K-nearest neighbor
-opts.N = 10;    % number of solutions
-opts.T = 100;   % maximum number of iterations
-% Parameter of WOA
-opts.b = 1;
-
-% Load dataset
-load ionosphere.mat;
-
-% Ratio of validation data
-ho = 0.2;
-% Divide data into training and validation sets
-HO = cvpartition(label,'HoldOut',ho);
-opts.Model = HO;
-
-% Perform feature selection
-FS = jfs('woa',feat,label,opts);
-
-% Define index of selected features
-sf_idx = FS.sf;
-
-% Accuracy
-Acc = jknn(feat(:,sf_idx),label,opts);
-
-% Plot convergence
-plot(FS.c); grid on;
-xlabel('Number of Iterations');
-ylabel('Fitness Value');
-title('WOA');
-```
-
 
 ## Requirement
 
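The `A_Fitness_Wilcoxon` script named in the bullets above is not included in this diff, so its interface is not shown here. As a rough sketch of the comparison it is described as performing, MATLAB's `ranksum` (Statistics and Machine Learning Toolbox) can test the final fitness values of two algorithms over repeated runs; the 30-run setup, the variable names, and the use of `FS.c(end)` as the final fitness (the README only plots `FS.c` as the convergence curve) are illustrative assumptions.

```code
% Illustrative only: Wilcoxon rank-sum test on final fitness values from
% repeated runs of two algorithms; feat, label and opts are assumed to be
% set up as in the examples above.
fitA = zeros(30,1);  fitB = zeros(30,1);
for r = 1:30
    FSa = jfs('pso',feat,label,opts);  fitA(r) = FSa.c(end);  % last point of convergence curve
    FSb = jfs('sma',feat,label,opts);  fitB(r) = FSb.c(end);
end
[p,h] = ranksum(fitA,fitB);   % h = 1 -> significant difference at the 5% level
fprintf('Wilcoxon rank-sum p-value: %.4g (h = %d)\n',p,h);
```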
@@ -178,6 +83,9 @@ title('WOA');
 
 | No. | Abbreviation | Name                                                                          | Year | Extra Parameters |
 |-----|--------------|-------------------------------------------------------------------------------|------|------------------|
+| 46  | `'plo'`      | Polar Lights Optimizer                                                        | 2024 | Yes              |
+| 45  | `'coa'`      | Crayfish Optimization Algorithm                                               | 2024 | Yes              |
+| 44  | `'rime'`     | RIME Optimization Algorithm                                                   | 2023 | Yes              |
 | 43  | `'mpa'`      | [Marine Predators Algorithm](/Description.md#marine-predators-algorithm-mpa) | 2020 | Yes              |
 | 42  | `'gndo'`     | Generalized Normal Distribution Optimization                                  | 2020 | No               |
 | 41  | `'sma'`      | Slime Mould Algorithm                                                         | 2020 | No               |
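The three rows added to the table register new algorithm keys for `jfs`. A minimal sketch of invoking one of them follows; per the Notation section, method-specific extra parameters (flagged "Yes" in the table) fall back to their defaults when not set in `opts`, and their names are not documented in this diff, so none are set here.

```code
% Sketch: the new 2023/2024 methods are called through the same jfs interface,
% using only the common opts fields; extra parameters are left at defaults.
load ionosphere.mat;                        % provides feat and label
opts.k = 5;  opts.N = 10;  opts.T = 100;
opts.Model = cvpartition(label,'HoldOut',0.2);
FS  = jfs('rime',feat,label,opts);          % RIME Optimization Algorithm
Acc = jknn(feat(:,FS.sf),label,opts);       % accuracy with selected features
```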
