Feature-Aware Drop Layer (FADL): A Nonparametric Neural Network Layer for Feature Selection

17th International Conference on Soft Computing Models in Industrial and Environmental Applications (SOCO 2022)

Cited by: 2

Abstract
Neural networks have proven to be a good alternative in application fields such as healthcare, time-series forecasting, and artificial vision, among others, for tasks like regression and classification. Their potential has been particularly remarkable on unstructured data, but recently developed architectures, or their ensembles with classical methods, have also produced competitive results on structured data. Feature selection has several beneficial properties: it improves efficacy, performance, and problem understanding, and reduces data collection time. However, as new data sources become available and new features are generated through feature engineering, feature selection methods require ever more computational resources. On datasets with numerous features, feature selection can take an exorbitant amount of time, making it impractical or yielding suboptimal selections that do not reflect the underlying behavior of the problem. We propose a nonparametric neural network layer that provides the benefits of feature selection while requiring few changes to the architecture. Our method adds a novel layer at the beginning of the neural network that removes the influence of features during training, adding inherent interpretability to the model without extra parameterization. In contrast to other feature selection methods, ours is an efficient and model-aware way to select features without training the model several times. We compared our method against a variety of popular feature selection strategies and datasets, with remarkable results.
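The abstract's core mechanism — an input layer that zeroes out selected features so they contribute nothing downstream — can be sketched as follows. This is an illustrative NumPy sketch, not the paper's implementation: the class name `FeatureDropLayer` and the manual `drop` call are assumptions, since the criterion FADL uses to decide which features to drop during training is not given in the abstract.

```python
import numpy as np

class FeatureDropLayer:
    """Hypothetical sketch of a feature-masking input layer.

    Multiplies inputs by a binary mask so dropped features have no
    influence on later layers. The mask holds no trainable parameters,
    in the spirit of the nonparametric layer described in the abstract.
    """

    def __init__(self, n_features):
        # All features start active.
        self.mask = np.ones(n_features)

    def drop(self, indices):
        # Remove the influence of the given features.
        # (How FADL chooses these indices is defined in the paper,
        # not reproduced here.)
        self.mask[np.asarray(indices)] = 0.0

    def forward(self, x):
        # x: (batch, n_features); broadcasting applies the mask per sample.
        return x * self.mask

layer = FeatureDropLayer(4)
layer.drop([1, 3])
out = layer.forward(np.array([[1.0, 2.0, 3.0, 4.0]]))
# Dropped columns become zero: [[1., 0., 3., 0.]]
```

Because the mask is applied at the network input, the surviving nonzero entries directly identify the selected features, which is the source of the interpretability the abstract mentions.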
Keywords
Feature selection, Neural network, Classification, Regression