Your Features Are Important? It Doesn't Mean They Are Good

by Samuele Mazzanti | Aug 2023

"Feature Importance" is not enough. You also need to look at "Error Contribution" if you want to know which features are beneficial for your model. The concept of "feature importance" is widely used in machine learning as the most basic type of model explainability. For example, it is used in Recursive Feature Elimination (RFE), to iteratively drop the least important feature of the model. However, there is a misconception about it. The fact that a feature is important doesn't imply that it is beneficial for the model! Indeed, when we say that a feature is important, this simply means that the feature brings a high contribution to the predictions made by the model. But we should consider that such contribution may be wrong. Take a simple example: a data scientist accidentally forgets the Customer ID between its model's features. The model uses Customer ID as a highly predictive feature. As a consequence, this feature will have a high feature importance
