Mezcla de expertos superpuestos con penalización entrópica


Bibliographic Details
Main authors: Peralta, Billy; Saavedra, Ariel; Caro, Luis
Format: Conference object
Language: Spanish
Published: 2017
Subjects:
Online access: http://sedici.unlp.edu.ar/handle/10915/63287
http://www.clei2017-46jaiio.sadio.org.ar/sites/default/files/Mem/SLMDI/SLMDI-14.pdf
Contributed by:
id I19-R120-10915-63287
record_format dspace
institution Universidad Nacional de La Plata
institution_str I-19
repository_str R-120
collection SEDICI (UNLP)
language Spanish
topic Ciencias Informáticas
mixture of experts model
Network Architecture and Design
spellingShingle Ciencias Informáticas
mixture of experts model
Network Architecture and Design
Peralta, Billy
Saavedra, Ariel
Caro, Luis
Mezcla de expertos superpuestos con penalización entrópica
topic_facet Ciencias Informáticas
mixture of experts model
Network Architecture and Design
description Nowadays there is growing interest in pattern recognition for tasks such as predicting weather events, recommending routes, intrusion detection, and face detection. These tasks can be modelled as classification problems, which are commonly addressed with an ensemble of classifiers. A usual ensemble model is the Mixture of Experts (MoE), which belongs to the family of modular artificial neural networks and consists of two types of subcomponents: expert networks and a gating network. Their combination creates competition among the experts, each seeking to capture patterns in the data source and specialize in a particular subtask, all supervised by the gating network, which acts as a mediator and weighs the quality of the solution delivered by each expert. We observe that this architecture assumes that a single gate output dominates each data point; consequently, training can be misleading on real datasets where the data are better explained by multiple experts. In this work we present a variant of the traditional MoE model that maximizes the entropy of the gating network's evaluation function in conjunction with standard error minimization. The results show the advantage of our approach on multiple datasets in terms of accuracy. As future work, we plan to apply this idea to Mixture of Experts with embedded feature selection.
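The objective described above can be sketched as a standard classification loss minus a weighted gating-entropy term, loss = NLL − λ·H(g(x)), so that minimizing the loss maximizes the entropy and allows several experts to share a sample. Below is a minimal illustration in PyTorch, assuming softmax gating over linear experts; the class name MoEWithEntropyBonus, the helper loss_fn, and the weight lam are illustrative choices of ours, not taken from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MoEWithEntropyBonus(nn.Module):
    """Mixture of linear experts combined by a softmax gating network."""
    def __init__(self, in_dim, n_classes, n_experts):
        super().__init__()
        self.experts = nn.ModuleList(
            [nn.Linear(in_dim, n_classes) for _ in range(n_experts)])
        self.gate = nn.Linear(in_dim, n_experts)

    def forward(self, x):
        # Gating weights: one probability per expert for each sample.
        g = F.softmax(self.gate(x), dim=-1)                   # (B, E)
        # Posterior of every expert, stacked along a new axis.
        p = torch.stack([F.softmax(e(x), dim=-1)
                         for e in self.experts], dim=1)       # (B, E, C)
        # Mixture output: gate-weighted average of expert posteriors.
        return (g.unsqueeze(-1) * p).sum(dim=1), g            # (B, C)

def loss_fn(y, g, target, lam=0.1):
    # Standard error term: negative log-likelihood of the mixture ...
    nll = F.nll_loss(torch.log(y + 1e-9), target)
    # ... minus lam times the gating entropy, rewarding gate
    # distributions that spread mass over multiple experts.
    entropy = -(g * torch.log(g + 1e-9)).sum(dim=-1).mean()
    return nll - lam * entropy

# Usage sketch: y, g = model(x); loss_fn(y, g, targets).backward().
# Setting lam = 0 recovers the purely competitive behaviour of the
# standard MoE that the abstract contrasts against.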
format Conference object
author Peralta, Billy
Saavedra, Ariel
Caro, Luis
author_facet Peralta, Billy
Saavedra, Ariel
Caro, Luis
author_sort Peralta, Billy
title Mezcla de expertos superpuestos con penalización entrópica
title_short Mezcla de expertos superpuestos con penalización entrópica
title_full Mezcla de expertos superpuestos con penalización entrópica
title_fullStr Mezcla de expertos superpuestos con penalización entrópica
title_full_unstemmed Mezcla de expertos superpuestos con penalización entrópica
title_sort mezcla de expertos superpuestos con penalización entrópica
publishDate 2017
url http://sedici.unlp.edu.ar/handle/10915/63287
http://www.clei2017-46jaiio.sadio.org.ar/sites/default/files/Mem/SLMDI/SLMDI-14.pdf
work_keys_str_mv AT peraltabilly mezcladeexpertossuperpuestosconpenalizacionentropica
AT saavedraariel mezcladeexpertossuperpuestosconpenalizacionentropica
AT caroluis mezcladeexpertossuperpuestosconpenalizacionentropica
bdutipo_str Repositorios
_version_ 1764820480660865026