[Figure: sentence gestalt architecture — the input word and context feed a hidden layer that builds the sentence gestalt; a probe queries the gestalt to produce role/filler output patterns.]

The approach presented here takes another direction, consistent with the principle of gradual language evolution and learning: it processes and evolves language items of increasing complexity. A further problem with the solution presented by Gasser is that, because the packing and unpacking processes are split, the training sequences must be available during both learning tasks, which is less plausible and increases the learning time. The sequential autoassociative task in my approach requires only a short-term memory to hold the sequence so that it can be presented a second time at the output layer; the system merely has to perceive the input the environment provides throughout learning (which may continue indefinitely: the learning can keep going non-stop).
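The sequential autoassociative task described above can be sketched in code. The following is a minimal toy illustration, not the author's actual model: a hypothetical Elman-style recurrent network perceives a token sequence once, while a short-term buffer holds the sequence and re-presents it as the target at the output layer on a second pass. All sizes, learning rates, and weight names are assumptions; gradients are truncated to single steps (Elman-style), so the input weights are not trained here — a full implementation would use backpropagation through time.

```python
import numpy as np

rng = np.random.default_rng(0)

V, H = 5, 16          # toy vocabulary size and hidden-layer size (assumed)
LR = 0.05             # assumed learning rate

def one_hot(i, n=V):
    v = np.zeros(n)
    v[i] = 1.0
    return v

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Weights of a minimal Elman-style network (hypothetical names)
Wx = rng.normal(0, 0.1, (H, V))   # input word -> hidden
Wh = rng.normal(0, 0.1, (H, H))   # hidden -> hidden (the short-term memory)
Wo = rng.normal(0, 0.1, (V, H))   # hidden -> output

def train_on_sequence(seq, epochs=300):
    """Pass 1: perceive the input sequence once.
    Pass 2: reproduce it at the output layer, with the buffered
    sequence (the short-term memory) supplying the targets."""
    global Wh, Wo
    losses = []
    for _ in range(epochs):
        buffer = [one_hot(t) for t in seq]   # short-term memory of the input
        h = np.zeros(H)
        for x in buffer:                     # pass 1: encode the sequence
            h = np.tanh(Wx @ x + Wh @ h)
        loss = 0.0
        for target in buffer:                # pass 2: recall, no external input
            h_prev = h
            h = np.tanh(Wh @ h_prev)
            y = softmax(Wo @ h)
            loss += -np.log(y[target.argmax()] + 1e-12)
            # truncated per-step gradients (no backprop through time)
            dz = y - target
            dh = (Wo.T @ dz) * (1 - h**2)
            Wo -= LR * np.outer(dz, h)
            Wh -= LR * np.outer(dh, h_prev)
        losses.append(loss / len(seq))
    return losses

losses = train_on_sequence([1, 3, 2])
```

Because the environment only has to supply the input once per episode, training of this kind can in principle run continuously, matching the open-ended learning regime described above.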

© 2001 by CRC Press LLC
