Neto, João Pedro; Costa, José Félix (1999)
Publisher: Department of Informatics, University of Lisbon
Languages: Portuguese
Types: Report
In a recent paper [Neto et al. 97] we showed that programming languages can be translated into recurrent (analog, rational-weighted) neural nets. The goal was not efficiency but simplicity. Indeed, we used a number-theoretic approach to machine programming, where (integer) numbers were coded in unary, introducing an exponential slowdown in the computations with respect to a two-symbol-tape Turing machine. Implementing programming languages in neural nets turns out to be not only theoretically exciting, but also to have practical implications for recent efforts to merge symbolic and subsymbolic computation. To be of practical use, however, it should be carried out in a context of bounded resources. Herein, we show how to use resource boundedness to speed up computations over neural nets, through suitable data-type coding as in the usual programming languages. We introduce data types and show how to code and maintain them inside the information flow of neural nets. Data types and control structures are part of a suitable programming language called netdef. Each netdef program has a specific neural net that computes it. These nets have a strongly modular structure and a synchronisation mechanism that allows sequential or parallel execution of subnets, despite the massively parallel nature of neural nets. Each instruction denotes an independent neural net, and there are constructors for assignment, conditional, and loop instructions. Beyond the language core, many other features are possible using the same method. A netdef compiler is available at www.di.fc.ul.pt/~jpn/netdef/netdef.htm
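The abstract's netdef coding schemes are not reproduced on this page, but the flavour of keeping structured data inside the rational state of a neuron can be sketched with the classical base-4 stack encoding from the Siegelmann–Sontag line of work cited below ([SIEGELMANN and SONTAG 95]). In this illustrative sketch (not the paper's own construction), a binary stack is held as a single rational value in [0, 1), and push, top, and pop are each one affine map followed by the saturated-linear activation:

```python
from fractions import Fraction

def sigma(x):
    """Saturated-linear activation used in rational-weight analog nets."""
    return min(Fraction(1), max(Fraction(0), x))

def push(q, bit):
    """Push a bit onto the stack encoded as the rational q in [0, 1)."""
    return sigma(q / 4 + Fraction(2 * bit + 1, 4))

def top(q):
    """Read the top bit: one affine map followed by saturation."""
    return sigma(4 * q - 2)

def pop(q):
    """Remove the top bit: again one affine map plus saturation."""
    return sigma(4 * q - 2 * top(q) - 1)

# Encode the bit string 1, 0, 1 (last push ends up on top).
q = Fraction(0)
for b in (1, 0, 1):
    q = push(q, b)

# Unwind the stack; an empty stack is encoded as q == 0.
bits = []
while q > 0:
    bits.append(int(top(q)))
    q = pop(q)
print(bits)  # [1, 0, 1]
```

Because every operation is affine-plus-saturation, each one corresponds to a single neuron update with rational weights, which is why such codings sit naturally inside the information flow of the nets the abstract describes.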

    • [GRUAU et al. 95] Gruau, F.; Ratajszczak, J. and Wiber, G., A Neural Compiler, Theoretical Computer Science, 141 (1-2), 1995, 1-52.
    • [LESTER 93] Lester, B. P., The Art of Parallel Programming, Prentice Hall, 1993.
    • [MCCULLOCH and PITTS 43] McCulloch, W. and Pitts, W., A Logical Calculus of the Ideas Immanent in Nervous Activity, Bulletin of Mathematical Biophysics, 5, 1943, 115-133.
    • [MINSKY 67] Minsky, M., Computation: Finite and Infinite Machines, Prentice Hall, 1967.
    • [NETO et al. 97] Neto, J. P.; Siegelmann, H.; Costa, J. F. and Araujo, C. S., Turing Universality of Neural Nets (revisited), Lecture Notes in Computer Science 1333, Springer-Verlag, 1997, 361-366.
    • [SIEGELMANN and SONTAG 95] Siegelmann, H. and Sontag, E., On the Computational Power of Neural Nets, Journal of Computer and System Sciences, 50 (1), Academic Press, 1995, 132-150.
    • [SIEGELMANN 96] Siegelmann, H., On NIL: The Software Constructor of Neural Networks, Parallel Processing Letters, 6 (4), World Scientific Publishing Company, 1996, 575-582.
    • [SIEGELMANN 99] Siegelmann, H., Neural Networks and Analog Computation: Beyond the Turing Limit, Birkhäuser, 1999.
    • [SGS-THOMSON 95] SGS-THOMSON, Occam® 2.1 Reference Manual, 1995.
    • [SONTAG 90] Sontag, E., Mathematical Control Theory: Deterministic Finite Dimensional Systems, Springer-Verlag, New York, 1990.
    • [USDD 83] United States Department of Defense, Reference Manual for the Ada® Programming Language, American National Standards Institute Inc., 1983.