Fyfe, Colin; Gabrys, Bogdan (1999)
Publisher: Proceedings of International Conference on Neural Networks and Artificial Intelligence ICNNAI'99.
Languages: English
Types: Unknown
Subjects: aintel, csi

Classified by OpenAIRE into
arxiv: Computer Science::Machine Learning, Computer Science::Neural and Evolutionary Computation, Quantitative Biology::Neurons and Cognition
Abstract: One of the major paradigms for unsupervised learning in Artificial Neural Networks is Hebbian learning. The standard implementations of Hebbian learning are optimal under the assumption of Gaussian noise in the data set. We derive ε-insensitive Hebbian learning based on minimising the least absolute error in a compressed data set and show that the learning rule is equivalent to the Principal Component Analysis (PCA) networks' learning rules under a variety of conditions.
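
The rule described in the abstract can be illustrated with a small sketch. The Python snippet below is an illustrative reading only, not the authors' formulation: the residual between an input and its reconstruction is ignored when it falls inside an ε-band and otherwise contributes only through its sign, which corresponds to minimising the least absolute (ε-insensitive) error rather than the squared error that is optimal under Gaussian noise. The function and parameter names (epsilon_insensitive_hebbian, eta, epsilon) are assumptions introduced here for the example.

# Illustrative sketch of an epsilon-insensitive Hebbian update for a single
# linear unit y = w.x; names and constants are assumptions, not from the paper.
import numpy as np

def epsilon_insensitive_hebbian(X, eta=0.01, epsilon=0.1, epochs=50, seed=0):
    """Train one linear unit with an epsilon-insensitive Hebbian rule.

    The residual e = x - y*w is ignored when |e| <= epsilon; otherwise only
    its sign drives the update, i.e. a least-absolute-error criterion.
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])
    for _ in range(epochs):
        for x in X:
            y = w @ x                        # feed-forward activation
            residual = x - y * w             # reconstruction error per input dimension
            # epsilon-insensitive "dead zone": small residuals give no update,
            # larger ones contribute only their sign.
            g = np.where(np.abs(residual) > epsilon, np.sign(residual), 0.0)
            w += eta * y * g                 # Hebbian-style update: activation times error signal
    return w

if __name__ == "__main__":
    # Zero-mean toy data with one dominant variance direction.
    rng = np.random.default_rng(1)
    data = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])
    w = epsilon_insensitive_hebbian(data)
    print("learned direction:", w / np.linalg.norm(w))

On zero-mean data the learned weight vector tends to align with the leading principal direction, which mirrors the equivalence to PCA networks' learning rules that the abstract claims under a variety of conditions.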
