Hi
I am aware that filter-based feature selection measures the predictive power of each variable individually, before any model has been built.
I know that permutation feature importance measures, once the model has been built, how relevant each variable is to that model. Thanks to this we can prune the model and strike a good balance between accuracy and simplicity. Fair enough.
However, most of the time I get contradictory results. A variable with zero importance according to the filter-based selection ends up being one of the most relevant variables in my model according to its permutation feature importance ranking.
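For what it's worth, here is a minimal sketch of how this disagreement can arise. It assumes scikit-learn and uses synthetic XOR data (all names and data are illustrative): each feature is individually independent of the target, so a univariate filter scores it near zero, but the model uses the interaction, so permutation importance ranks it highly.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import f_classif
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
x1 = rng.integers(0, 2, n)
x2 = rng.integers(0, 2, n)
noise = rng.normal(size=n)                    # irrelevant feature
X = np.column_stack([x1, x2, noise]).astype(float)
y = x1 ^ x2                                   # target depends only on the interaction

# Univariate filter: x1 and x2 are each marginally independent of y,
# so their F-scores are tiny, just like the noise feature's.
scores, _ = f_classif(X, y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

# Permutation importance on held-out data: x1 and x2 dominate,
# because shuffling either one destroys the learned XOR pattern.
perm = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
print("filter F-scores:       ", scores.round(3))
print("permutation importance:", perm.importances_mean.round(3))
```

In this setup the filter scores for x1 and x2 are as low as for pure noise, yet the model is near-perfect and permutation importance correctly flags both as essential. This is exactly the kind of interaction effect a univariate filter cannot see.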
So I guess I cannot rely on what filter-based selection says as a preliminary assessment, and should treat it more as a theoretical exercise. Is that right?
Thank you