ERIC Number: ED557636
Record Type: Non-Journal
Publication Date: 2014
Pages: 181
Abstractor: As Provided
Reference Count: N/A
ISBN: 978-1-3039-9415-9
ISSN: N/A
Achieving Consistent Near-Optimal Pattern Recognition Accuracy Using Particle Swarm Optimization to Pre-Train Artificial Neural Networks
Nikelshpur, Dmitry O.
ProQuest LLC, D.P.S. Dissertation, Pace University
Similar to mammalian brains, Artificial Neural Networks (ANN) are universal approximators, capable of yielding near-optimal solutions to a wide assortment of problems. ANNs are used in many fields including medicine, internet security, engineering, retail, robotics, warfare, intelligent control, and finance. "ANNs have a tendency to get trapped at sub-optimal solutions (local optimum)," and therefore trial and error is commonly used to select the network topology and train the network, which is prohibitively time consuming and costly. Recent advances in our understanding of the biological brain, in hardware, in algorithms, and in the potential for novel applications have renewed interest in ANNs. Evolutionary Artificial Neural Networks (EANN) are among the more successful paradigms explored to improve ANNs' performance. EANNs employ evolutionary computation techniques such as Genetic Algorithms (GA) or Particle Swarm Optimization (PSO) to train ANNs or to generate ANNs' topologies. Still, these improvements are "not consistent" and usually problem-specific. ANN performance depends in part on the number of neurons in the hidden layer(s). The more neurons in the hidden layer(s), the better the network's ability to recognize the specific samples it saw during the training phase; however, the network then becomes incapable of learning general patterns and recognizing those patterns in novel data. Performance on training data improves with training, while performance on testing data (samples the network has not seen previously) degrades (overfitting). This work rigorously investigated using PSO to pre-train ANNs with varying numbers of neurons in the hidden layer. It was shown that using the PSO algorithm to pre-train ANNs improves classification accuracy for diverse problems, and, most notably, a PSO parameter configuration was developed that "consistently yielded near-optimal solutions for pattern recognition problems from different domains."
It was also shown how an automated algorithm for efficiently evolving optimal ANN size and topology can be designed to take advantage of the study's findings. Furthermore, a novel biologically inspired hybrid was investigated, combining a visual ventral-stream object recognition model (Hierarchical Model and X, HMAX), PSO, and a feed-forward back-propagation ANN. This or similar biologically plausible models may be useful for in vivo studies as well as for in vitro simulations and applications. [The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by telephone: 1-800-521-0600. Web page: http://www.proquest.com/en-US/products/dissertations/individuals.shtml.]
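The pre-training idea the abstract describes can be illustrated with a minimal sketch: a global-best PSO searches the weight space of a tiny feed-forward network before any gradient-based training begins. Everything below (network size, the XOR task, swarm size, inertia and acceleration coefficients) is an illustrative assumption, not the dissertation's actual configuration.

```python
# Minimal sketch (assumed setup): PSO pre-trains a tiny 2-2-1 feed-forward ANN
# on XOR by treating the flattened weight vector as a particle position.
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

def mse(w):
    # Unpack the 9-element weight vector: 2x2 hidden weights + 2 biases,
    # 2 output weights + 1 output bias.
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    h = np.tanh(X @ W1 + b1)                 # hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output
    return np.mean((out - y) ** 2)

# Standard global-best PSO velocity update (hyperparameters are assumptions).
n_particles, dim, iters = 30, 9, 200
w_in, c1, c2 = 0.7, 1.5, 1.5   # inertia, cognitive, social coefficients

pos = rng.uniform(-1, 1, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([mse(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w_in * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([mse(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

# gbest now holds pre-trained weights; back-propagation would refine from here.
print(mse(gbest))
```

In the hybrid the dissertation describes, the swarm-found weights would serve as the starting point for gradient-based fine-tuning, rather than as the final solution.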
ProQuest LLC. 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106. Tel: 800-521-0600; Web site: http://www.proquest.com/en-US/products/dissertations/individuals.shtml
Publication Type: Dissertations/Theses - Doctoral Dissertations
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A