Feature extraction and selection using statistical dependence criteria

Bibliographic Details
Main Authors: Tomassi, Diego, Marx, Nicolás, Beauseroy, Pierre
Format: Conference object
Language: English
Published: 2016
Subjects:
Online access: http://sedici.unlp.edu.ar/handle/10915/56980
http://45jaiio.sadio.org.ar/sites/default/files/ASAI-13_0.pdf
Description
Summary: Dimensionality reduction using feature extraction and selection approaches is a common stage of many regression and classification tasks. In recent years there have been significant efforts to reduce the dimension of the feature space without losing information that is relevant for prediction. This objective can be cast as a conditional independence condition between the response or class labels and the transformed features. Building on this, in this work we use measures of statistical dependence to estimate a lower-dimensional linear subspace of the features that retains the sufficient information. Unlike likelihood-based and many moment-based methods, the proposed approach is semi-parametric and does not require model assumptions on the data. A regularized version that achieves simultaneous variable selection is also presented. Experiments with simulated data show that the performance of the proposed method compares favorably to well-known linear dimension reduction techniques.
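
The abstract does not name a specific dependence measure; a common choice in this line of work is the Hilbert-Schmidt Independence Criterion (HSIC). The following Python sketch is purely illustrative and not the authors' implementation: it estimates a linear projection B by maximizing an empirical HSIC between the projected features XB and the response, and uses an L1 penalty on B as a stand-in for the regularized variable-selection variant. Function names, kernel choices, and the optimizer are assumptions made for the example.

# Illustrative sketch only: HSIC-based linear dimension reduction with an
# optional L1 penalty for variable selection (assumed details, not the
# paper's exact method).
import numpy as np
from scipy.optimize import minimize

def _gaussian_kernel(Z, sigma=1.0):
    # Gaussian (RBF) kernel matrix for the rows of Z.
    sq = np.sum(Z**2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T, 0.0)
    return np.exp(-d2 / (2.0 * sigma**2))

def _hsic(K, L):
    # Biased empirical HSIC estimate: trace(K H L H) / (n - 1)^2.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1)**2

def fit_projection(X, y, d=1, lam=0.0, seed=0):
    # Find a p x d matrix B maximizing HSIC(XB, y) - lam * ||B||_1.
    n, p = X.shape
    Ly = _gaussian_kernel(y.reshape(n, -1))      # kernel on the response
    rng = np.random.default_rng(seed)

    def objective(b_flat):
        B = b_flat.reshape(p, d)
        Kx = _gaussian_kernel(X @ B)             # kernel on projected features
        return -_hsic(Kx, Ly) + lam * np.abs(B).sum()

    res = minimize(objective, rng.standard_normal(p * d), method="Nelder-Mead",
                   options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-9})
    B = res.x.reshape(p, d)
    return B / np.linalg.norm(B, axis=0)         # normalize columns

if __name__ == "__main__":
    # Toy data: y depends on X only through the direction e1 - e2.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 5))
    y = np.sin(X[:, 0] - X[:, 1]) + 0.1 * rng.standard_normal(200)
    print("estimated direction:", fit_projection(X, y, d=1, lam=0.01).ravel())

In this toy example the recovered direction should load mostly on the first two coordinates; setting lam > 0 shrinks the remaining coefficients toward zero, mimicking the simultaneous variable selection described in the abstract.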