Pakistan Science Abstracts
Article details & metrics
Metaheuristic evolution: advancing feedforward neural network optimization
Author(s):
1. Muhammad Nouman Atta: Institute of Computer Sciences and Information Technology (ICS/IT), The University of Agriculture, Peshawar, Pakistan
2. Abdullah Khan: Institute of Computer Sciences and Information Technology (ICS/IT), The University of Agriculture, Peshawar, Pakistan
3. Arshad Khan: Institute of Computer Sciences and Information Technology (ICS/IT), The University of Agriculture, Peshawar, Pakistan
4. M. Imran: Institute of Computer Sciences and Information Technology (ICS/IT), The University of Agriculture, Peshawar, Pakistan
Abstract:
This paper summarizes two decades of research into the optimization of feedforward neural networks (FNNs) using metaheuristic approaches, highlighting the shift from traditional gradient-based methods to metaheuristic algorithms. Traditional optimization techniques such as backpropagation and its variants have been foundational in training FNNs, primarily focusing on optimizing network weights. While effective in certain scenarios, these methods often converge to local minima and lack the ability to explore the solution space comprehensively. Metaheuristic algorithms, including genetic algorithms, particle swarm optimization, and ant colony optimization, offer a robust alternative that enhances both exploitation and exploration in the optimization process. These strategies can handle multiple components of FNNs simultaneously, which is crucial for achieving optimal network architecture and parameter settings. The versatility of metaheuristics supports a wide range of adaptations, from evolving network topologies to optimizing learning parameters, thus addressing the generalization challenges inherent in FNN applications. Several innovative FNN designs have emerged from the application of metaheuristics, such as EPNet for dynamic architecture optimization and cooperative coevolution frameworks, which facilitate the development of complex and adaptive neural models. Furthermore, the review discusses the formulation of FNN optimization as a multi-objective problem, where solutions must balance accuracy, network complexity, and other relevant metrics. The integration of metaheuristic methods into FNN design not only broadens the scope of neural network applications but also sets the stage for future research directions, including addressing big data challenges and improving data quality handling, which promise to reshape neural computing and machine learning.
This review underscores the continuous evolution and significance of metaheuristic methodologies in shaping the future of feedforward neural network optimization.
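The core idea summarized in the abstract — replacing gradient-based backpropagation with a population-based metaheuristic that searches the weight space directly — can be illustrated with a minimal sketch. The example below is not from the paper itself; it is a hypothetical demonstration that trains a one-hidden-layer FNN on the XOR problem using a basic particle swarm optimization (PSO) loop, where each particle is a flattened vector of all network weights and biases.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR dataset (illustrative only)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

N_IN, N_HID = 2, 4
DIM = N_IN * N_HID + N_HID + N_HID + 1  # all weights and biases, flattened

def unpack(v):
    """Split a flat parameter vector into layer weights and biases."""
    i = 0
    W1 = v[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = v[i:i + N_HID]; i += N_HID
    W2 = v[i:i + N_HID]; i += N_HID
    b2 = v[i]
    return W1, b1, W2, b2

def forward(v, X):
    """One-hidden-layer FNN: tanh hidden units, sigmoid output."""
    W1, b1, W2, b2 = unpack(v)
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def loss(v):
    """Mean squared error on the toy dataset (the PSO fitness function)."""
    return np.mean((forward(v, X) - y) ** 2)

# PSO: no gradients are used; the swarm explores the weight space directly.
N_PART, ITERS = 30, 300
pos = rng.normal(0, 1, (N_PART, DIM))   # particle positions = weight vectors
vel = np.zeros((N_PART, DIM))
pbest = pos.copy()
pbest_f = np.array([loss(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5  # inertia and cognitive/social coefficients
for _ in range(ITERS):
    r1, r2 = rng.random((2, N_PART, DIM))
    # Velocity update balances exploitation (pbest) and exploration (gbest)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([loss(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("final MSE:", loss(gbest))
```

In a full metaheuristic FNN framework of the kind the review surveys, the same encoding idea extends beyond weights to network topology and learning parameters; this sketch covers only the weight-optimization case.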
Page(s): 1-1
DOI: not available
Published in: Second International Conference on Computing Technologies, Tools and Applications (ICTAPP-24), June 4-6, 2024 (Abstract Book), 2024
Keywords:
Backpropagation, Optimization Techniques, Metaheuristics, Metaheuristic Algorithms
References:
References are not available for this document.
Citations: 0 (citation records are not available for this document)
Downloads: 0
Views: 14