Journal: AI
Vol. 4, Issue 3 (2023) · Article

Training Artificial Neural Networks Using a Global Optimization Method That Utilizes Neural Networks

Ioannis G. Tsoulos and Alexandros Tzallas    

Abstract

The artificial neural network is perhaps the best-known machine learning model; its parameters must be adjusted so that it can learn a wide range of practical problems from areas such as physics, chemistry, and medicine. Such problems can be reduced to pattern recognition problems, whether classification or regression, and then modeled with artificial neural networks. To achieve their goal, neural networks must be trained by appropriately adjusting their parameters with a global optimization method. In this work, a recent global minimization technique is applied to the adjustment of neural network parameters. In this technique, an approximation of the objective function to be minimized is built with an artificial neural network, and sampling is then performed from the approximation rather than from the original function. The parameters of an artificial neural network are therefore learned with the help of other neural networks. The new training method was tested on a series of well-known problems, a comparative study was conducted against other neural network parameter tuning techniques, and the results were more than promising: in the experiments comparing the proposed technique with others on both classification and regression datasets, there was a significant difference in performance, starting at 30% for classification datasets and reaching 50% for regression problems. However, because it relies on global optimization techniques that themselves involve artificial neural networks, the proposed technique may require significantly more execution time than other techniques.
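The abstract does not give the method's details (network architecture, sampling scheme, or stopping rule), but the core idea it describes, fit a neural network surrogate of the objective and sample from the cheap surrogate instead of the expensive original, can be sketched as follows. This is a minimal illustration, assuming scikit-learn's `MLPRegressor` as the surrogate; the function name `surrogate_minimize` and all parameter choices are hypothetical, not the authors' implementation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def surrogate_minimize(f, bounds, n_init=50, n_candidates=2000, n_refine=10, seed=0):
    """Sketch of surrogate-assisted global minimization.

    1. Evaluate the (expensive) objective f on a few random points.
    2. Fit a small MLP that approximates f from those samples.
    3. Draw many candidate points, rank them by the *surrogate* (cheap),
       and spend true evaluations of f only on the most promising ones.
    """
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])

    # Step 1: initial design, evaluated with the true objective.
    X = rng.uniform(lo, hi, size=(n_init, dim))
    y = np.array([f(x) for x in X])

    # Step 2: neural network surrogate of the objective.
    surrogate = MLPRegressor(hidden_layer_sizes=(32, 32),
                             max_iter=3000, random_state=seed)
    surrogate.fit(X, y)

    # Step 3: sample from the surrogate, not the original function.
    cand = rng.uniform(lo, hi, size=(n_candidates, dim))
    scores = surrogate.predict(cand)              # cheap approximate values
    best_cand = cand[np.argsort(scores)[:n_refine]]
    y_refined = np.array([f(x) for x in best_cand])  # few true evaluations

    X_all = np.vstack([X, best_cand])
    y_all = np.concatenate([y, y_refined])
    i = int(np.argmin(y_all))
    return X_all[i], y_all[i]
```

The trade-off the abstract notes shows up directly here: fitting and querying the surrogate network adds training cost on top of the objective evaluations, which is why such a scheme can be slower in wall-clock time even when it needs fewer true function calls.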

Similar articles

Shubin Wang, Yuanyuan Chen and Zhang Yi    
Diabetic retinopathy is a prevalent eye disease that poses a potential risk of blindness. Nevertheless, due to the small size of diabetic retinopathy lesions and the high interclass similarity in terms of location, color, and shape among different lesion...
Journal: Applied Sciences

 
Íñigo Manuel Iglesias-Sanfeliz Cubero, Andrés Meana-Fernández, Juan Carlos Ríos-Fernández, Thomas Ackermann and Antonio José Gutiérrez-Trashorras    
Journal: Applied Sciences

 
Binghui Zhao, Liguo Han, Pan Zhang, Qiang Feng and Liyun Ma    
In passive seismic exploration, the number and location of underground sources are very random, and there may be few passive sources or an uneven spatial distribution. The random distribution of seismic sources can cause the virtual shot recordings to pr...
Journal: Applied Sciences

 
Lei Yang, Mengxue Xu and Yunan He    
Convolutional Neural Networks (CNNs) have become essential in deep learning applications, especially in computer vision, yet their complex internal mechanisms pose significant challenges to interpretability, crucial for ethical applications. Addressing t...
Journal: Applied Sciences

 
Antonello Pasini and Stefano Amendola    
Neural network models are often used to analyse non-linear systems; here, in cases of small datasets, we review our complementary approach to deep learning with the purpose of highlighting the importance and roles (linear, non-linear or threshold) of cer...
Journal: Applied Sciences