Feature Selection with Exhaustive Search
Source: R/FSelectorBatchExhaustiveSearch.R
mlr_fselectors_exhaustive_search.Rd
Feature selection using the Exhaustive Search algorithm. Exhaustive Search generates and evaluates all possible feature sets.
Details
The feature selection terminates on its own once all feature sets have been evaluated. It is not necessary to set a termination criterion.
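Because every non-empty subset of the features is a candidate, the number of evaluations grows exponentially with the number of features p. A small base-R sketch of this count, using p = 7 (the number of features in the penguins task) as an illustrative value:

```r
# Number of candidate feature sets for p features:
# sum over subset sizes k = 1..p of choose(p, k), i.e. 2^p - 1.
p = 7
n_sets = sum(choose(p, seq_len(p)))
n_sets
#> [1] 127
```

Setting `max_features` truncates this sum at the given subset size, which is the main lever for keeping an exhaustive search tractable on wider tasks.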
Control Parameters
max_features
integer(1)
Maximum number of features per feature set. Defaults to the number of features in the mlr3::Task.
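The control parameter can be set when constructing the FSelector. A minimal sketch, assuming the mlr3fselect package is loaded; here the search is restricted to feature sets of at most two features:

```r
library(mlr3fselect)

# Restrict the exhaustive search to feature sets with at most 2 features
fselector = fs("exhaustive_search", max_features = 2)
fselector$param_set$values$max_features
#> [1] 2
```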
Super classes
mlr3fselect::FSelector
-> mlr3fselect::FSelectorBatch
-> FSelectorBatchExhaustiveSearch
Examples
# Feature Selection
# \donttest{
# retrieve task and load learner
task = tsk("penguins")
learner = lrn("classif.rpart")
# run feature selection on the Palmer Penguins data set
instance = fselect(
fselector = fs("exhaustive_search"),
task = task,
learner = learner,
resampling = rsmp("holdout"),
measure = msr("classif.ce"),
term_evals = 10
)
# best performing feature set
instance$result
#> bill_depth bill_length body_mass flipper_length island sex year
#> <lgcl> <lgcl> <lgcl> <lgcl> <lgcl> <lgcl> <lgcl>
#> 1: TRUE TRUE FALSE FALSE FALSE FALSE FALSE
#> features n_features classif.ce
#> <list> <int> <num>
#> 1: bill_depth,bill_length 2 0.08695652
# all evaluated feature sets
as.data.table(instance$archive)
#> bill_depth bill_length body_mass flipper_length island sex year
#> <lgcl> <lgcl> <lgcl> <lgcl> <lgcl> <lgcl> <lgcl>
#> 1: TRUE FALSE FALSE FALSE FALSE FALSE FALSE
#> 2: FALSE TRUE FALSE FALSE FALSE FALSE FALSE
#> 3: FALSE FALSE TRUE FALSE FALSE FALSE FALSE
#> 4: FALSE FALSE FALSE TRUE FALSE FALSE FALSE
#> 5: FALSE FALSE FALSE FALSE TRUE FALSE FALSE
#> 6: FALSE FALSE FALSE FALSE FALSE TRUE FALSE
#> 7: FALSE FALSE FALSE FALSE FALSE FALSE TRUE
#> 8: TRUE TRUE FALSE FALSE FALSE FALSE FALSE
#> 9: TRUE FALSE TRUE FALSE FALSE FALSE FALSE
#> 10: TRUE FALSE FALSE TRUE FALSE FALSE FALSE
#> classif.ce runtime_learners timestamp batch_nr warnings errors
#> <num> <num> <POSc> <int> <int> <int>
#> 1: 0.24347826 0.003 2024-11-07 21:50:18 1 0 0
#> 2: 0.28695652 0.004 2024-11-07 21:50:18 1 0 0
#> 3: 0.30434783 0.004 2024-11-07 21:50:18 1 0 0
#> 4: 0.19130435 0.004 2024-11-07 21:50:18 1 0 0
#> 5: 0.23478261 0.004 2024-11-07 21:50:18 1 0 0
#> 6: 0.63478261 0.003 2024-11-07 21:50:18 1 0 0
#> 7: 0.63478261 0.004 2024-11-07 21:50:18 1 0 0
#> 8: 0.08695652 0.005 2024-11-07 21:50:18 1 0 0
#> 9: 0.23478261 0.005 2024-11-07 21:50:18 1 0 0
#> 10: 0.19130435 0.005 2024-11-07 21:50:18 1 0 0
#> features n_features resample_result
#> <list> <list> <list>
#> 1: bill_depth 1 <ResampleResult>
#> 2: bill_length 1 <ResampleResult>
#> 3: body_mass 1 <ResampleResult>
#> 4: flipper_length 1 <ResampleResult>
#> 5: island 1 <ResampleResult>
#> 6: sex 1 <ResampleResult>
#> 7: year 1 <ResampleResult>
#> 8: bill_depth,bill_length 2 <ResampleResult>
#> 9: bill_depth,body_mass 2 <ResampleResult>
#> 10: bill_depth,flipper_length 2 <ResampleResult>
# subset the task and fit the final model
task$select(instance$result_feature_set)
learner$train(task)
# }