
72 Dorado, Pedreira & Miguélez

Introduction

Some of the most important ANNs are recurrent ANNs (RANNs) (Haykin, 1999), which tackle temporal problems, quite common in the real world and different from the classical non-temporal classification problems handled by static ANNs. However, the difficulty of implementing them led to the use of workarounds such as time delays (TDNN) or the unfolding of recurrent networks into feed-forward ones (BPTT) for solving dynamic problems. Now that they have reached maturity, recurrent ANNs that use RTRL handle dynamic phenomena better than the classical approaches, although they still suffer from problems of design and training convergence.
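The temporal behaviour that distinguishes recurrent units from static ones can be illustrated with a minimal sketch (a single tanh unit in pure Python; the weights `w_in`, `w_rec`, `b` and their values are illustrative, not from the original text). Because the unit carries its previous state forward, identical inputs produce different outputs at different time steps:

```python
import math

def recurrent_step(x, h_prev, w_in, w_rec, b):
    """One step of a single recurrent unit: the new state depends on the
    current input AND the previous state (tanh activation)."""
    return math.tanh(w_in * x + w_rec * h_prev + b)

def run_sequence(xs, w_in=0.5, w_rec=0.9, b=0.0):
    """Unroll the unit over a temporal input sequence, carrying state."""
    h = 0.0
    states = []
    for x in xs:
        h = recurrent_step(x, h, w_in, w_rec, b)
        states.append(h)
    return states

# The same input value yields a different state at each step, because the
# unit remembers its past through h_prev; a static (feed-forward) unit
# would return the same output three times.
print(run_sequence([1.0, 1.0, 1.0]))
```

A static network computes a fixed mapping from input to output; the recurrent connection `w_rec` is what gives the unit a memory of the sequence seen so far.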

With regard to the first of these problems, the architectural design of the network, in both ANN models (feed-forward and recurrent) the vast number of design possibilities allows experimentation, but it also raises the question of which combination of design and training parameters is best. Unfortunately, there is no mathematical basis to support the selection of a specific architecture, and only a few works (Lapedes & Farber, 1988; Cybenko, 1989) have established lower and upper bounds on the number of PEs, for some models and for restricted types of problems. Apart from these works, only empirical studies (Yee, 1992) address the subject. Consequently, it cannot be said for sure that a selected architecture is the most suitable one without performing exhaustive architectural tests.
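The exhaustive architectural testing mentioned above can be sketched as a simple search over candidate hidden-layer sizes. In this hedged sketch, `validation_error` is a hypothetical stand-in for a full train-and-evaluate cycle (here a toy curve with a known best size); a real version would train an ANN of each size and score it on held-out data:

```python
def validation_error(hidden_units):
    # Placeholder for "train an ANN of this size and measure its error on
    # held-out data". This toy curve is minimised at 8 hidden units.
    return (hidden_units - 8) ** 2 + 1.0

def best_architecture(candidates):
    """Exhaustive test: evaluate every candidate size, keep the best."""
    return min(candidates, key=validation_error)

print(best_architecture([2, 4, 8, 16, 32]))  # → 8
```

The point of the sketch is the cost, not the code: with no theory to narrow the search, every candidate architecture must be trained and evaluated, which is exactly why the lack of a mathematical basis matters in practice.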

Regarding the training of ANNs, the classical algorithms, which are based on gradient descent, are quite sensitive to local minima of the search space. Moreover, in order to achieve network convergence, the designer has to configure a further set of parameters involved in training, such as the learning rate and the individual parameters of each algorithm. Another intrinsic problem of all these learning algorithms is that they do not adapt easily to modifications in the working of both PEs and connections; this hinders the development and implementation of new characteristics and improvements in ANN models. One possible improvement would be the incorporation of biological characteristics, similar to those of natural neural cells, for a better understanding and functionality of artificial neurons.
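The sensitivity to local minima can be demonstrated on a one-dimensional example (an illustrative function chosen for this sketch, not from the original text): the polynomial f(x) = x⁴ − 3x² + x has a global minimum near x ≈ −1.30 and a local one near x ≈ 1.13, and plain gradient descent settles into whichever basin it starts in:

```python
def grad(x):
    # Gradient of f(x) = x**4 - 3*x**2 + x, which has two minima:
    # a global one near x = -1.30 and a local one near x = 1.13.
    return 4 * x**3 - 6 * x + 1

def gradient_descent(x0, lr=0.01, steps=2000):
    """Plain gradient descent with a fixed learning rate `lr`."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# The solution found depends entirely on the starting point:
print(gradient_descent(2.0))   # settles in the local minimum, x ≈ 1.13
print(gradient_descent(-2.0))  # settles in the global minimum, x ≈ -1.30
```

The learning rate is the second sensitivity the text mentions: too small and convergence takes many more steps, too large and the iteration overshoots the minimum and diverges, which is why these parameters must be tuned per problem.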

A possible solution to these problems is the use of new optimisation techniques. Genetic Algorithms (GAs) (Fogel, Fogel, & Porto, 1990; Yao, 1992) are straightforward optimisation techniques from EC that achieve good results. Like other EC techniques, they have been applied to architectural adjustment for years (Robbins, Hughes, Plumbley, Fallside, & Prager, 1993), so they represent an open field for investigation.
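A minimal GA in the spirit described can be sketched as follows. Here each individual encodes a hidden-layer size, and `fitness` is a hypothetical stand-in for the (negated) validation error of the resulting network; selection, crossover, and mutation are deliberately simple. Unlike gradient descent, the population-based search does not follow a gradient and is far less prone to getting trapped by a single bad starting point:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def fitness(hidden_units):
    # Placeholder for "train the ANN and measure it"; peaks at 10 units.
    return -((hidden_units - 10) ** 2)

def evolve(pop_size=20, generations=30):
    """Evolve a population of candidate hidden-layer sizes."""
    population = [random.randint(1, 64) for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half as parents (implicit elitism).
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        # Crossover (average of two parents) plus a small random mutation.
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a + b) // 2 + random.randint(-2, 2)
            children.append(max(1, child))
        population = parents + children
    return max(population, key=fitness)

print(evolve())
```

Note what the GA does not need: no gradient of the fitness function and no learning rate, which is precisely why such techniques can also adjust quantities (like the number of PEs) for which no gradient exists.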
