ARTICLE

On the Use of Kullback–Leibler Divergence for Kernel Selection and Interpretation in Variational Autoencoders for Feature Creation

Fábio Mendonça, Sheikh Shanawaz Mostafa, Fernando Morgado-Dias and Antonio G. Ravelo-García

Abstract

This study presents a novel approach for kernel selection based on the Kullback–Leibler divergence in variational autoencoders, using features generated by the convolutional encoder. The proposed methodology focuses on identifying the most relevant subset of latent variables to reduce the model's parameters. Each latent variable is sampled from the distribution associated with a single kernel of the encoder's last convolutional layer, resulting in an individual distribution for each kernel. Relevant features are selected from the sampled latent variables to perform kernel selection, which filters out uninformative features and, consequently, unnecessary kernels. Both the proposed filter method and sequential feature selection (a standard wrapper method) were examined for feature selection. In particular, the filter method evaluates the Kullback–Leibler divergence between all kernels' distributions, on the hypothesis that similar kernels can be discarded because they convey no additional information. This hypothesis was confirmed by experiments on four standard datasets, which showed that the number of kernels can be reduced without meaningfully affecting performance. The analysis was based on the accuracy of the model when the selected kernels fed a probabilistic classifier, and on the feature-based similarity index, which appraised the quality of the reconstructed images when the variational autoencoder used only the selected kernels. The proposed methodology therefore guides the reduction of the model's parameter count, making it suitable for developing applications for resource-constrained devices.
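
To make the filter idea concrete, the following is a minimal Python sketch, assuming each kernel of the encoder's last convolutional layer parameterises a diagonal Gaussian latent distribution N(mu_k, sigma_k^2). The closed-form Gaussian divergence is standard; the symmetrised comparison, the greedy threshold rule, and all names (kl_diag_gaussians, select_kernels, threshold) are illustrative assumptions, not the paper's exact procedure.

    import numpy as np

    def kl_diag_gaussians(mu1, sigma1, mu2, sigma2):
        # Closed-form KL(N(mu1, diag(sigma1^2)) || N(mu2, diag(sigma2^2))).
        var1, var2 = sigma1 ** 2, sigma2 ** 2
        return np.sum(np.log(sigma2 / sigma1)
                      + (var1 + (mu1 - mu2) ** 2) / (2.0 * var2) - 0.5)

    def select_kernels(mu, sigma, threshold):
        # Greedily keep a kernel only if its symmetrised KL divergence to every
        # already-kept kernel exceeds `threshold`; near-duplicates are discarded.
        # mu, sigma: (n_kernels, latent_dim) per-kernel Gaussian parameters.
        kept = []
        for k in range(mu.shape[0]):
            redundant = any(
                0.5 * (kl_diag_gaussians(mu[k], sigma[k], mu[j], sigma[j])
                       + kl_diag_gaussians(mu[j], sigma[j], mu[k], sigma[k])) < threshold
                for j in kept
            )
            if not redundant:
                kept.append(k)
        return kept

    rng = np.random.default_rng(0)
    mu = rng.normal(size=(8, 4))                     # per-kernel latent means
    mu[1] = mu[0] + 0.01                             # kernel 1 nearly duplicates kernel 0
    sigma = np.full((8, 4), 0.5)                     # per-kernel standard deviations
    print(select_kernels(mu, sigma, threshold=0.1))  # kernel 1 is filtered out

Under these assumptions, kernels whose latent distributions are nearly indistinguishable are treated as redundant and pruned, which is how such a filter could shrink the model's parameter count before the selected latent variables feed the downstream classifier.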