Single-Iteration Learning Algorithm for Feed-Forward Neural Networks

PDF Version Also Available for Download.


Creation Information

Barhen, J.; Cogswell, R. & Protopopescu, V. July 31, 1999.

Context

This article is part of the collection entitled: Office of Scientific & Technical Information Technical Reports and was provided by the UNT Libraries Government Documents Department to the UNT Digital Library, a digital repository hosted by the UNT Libraries. More information about this article can be viewed below.

Who

People and organizations associated with either the creation of this article or its content.

Provided By

UNT Libraries Government Documents Department

Serving as both a federal and a state depository library, the UNT Libraries Government Documents Department maintains millions of items in a variety of formats. The department is a member of the FDLP Content Partnerships Program and an Affiliated Archive of the National Archives.


What

Descriptive information to help identify this article. Follow the links below to find similar items on the Digital Library.

Description

A new methodology for neural learning is presented, whereby only a single iteration is required to train a feed-forward network with near-optimal results. To this aim, a virtual input layer is added to the multi-layer architecture. The virtual input layer is connected to the nominal input layer by a special nonlinear transfer function, and to the first hidden layer by regular (linear) synapses. A sequence of alternating-direction singular value decompositions is then used to determine precisely the inter-layer synaptic weights. This algorithm exploits the known separability of the linear (inter-layer propagation) and nonlinear (neuron activation) aspects of information transfer within a neural network.
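
The description above is only a high-level summary of the method. The sketch below is a minimal, hedged illustration of the same idea of separating the linear and nonlinear parts of the fit; it is not the authors' exact algorithm. A fixed random tanh expansion stands in for the paper's virtual-input-layer transfer function, and a single SVD-based least-squares solve (numpy.linalg.lstsq) stands in for the sequence of alternating-direction singular value decompositions. All function names and parameter choices here are illustrative assumptions.

# Hedged sketch of a one-shot (single-pass) fit for a feed-forward network.
# Assumptions not taken from the paper: the nonlinear "virtual layer" is
# modeled as a fixed random tanh expansion, and the alternating-direction
# SVD sequence is approximated by one SVD-based least-squares solve.
import numpy as np

rng = np.random.default_rng(0)

def fit_single_pass(X, Y, n_hidden=64):
    """Fit hidden-to-output weights in a single linear solve.

    X : (n_samples, n_inputs)  nominal inputs
    Y : (n_samples, n_outputs) training targets
    """
    n_inputs = X.shape[1]
    # Fixed nonlinear map from the nominal inputs to the hidden layer
    # (stand-in for the paper's virtual-input-layer transfer function).
    W_in = rng.standard_normal((n_inputs, n_hidden))
    b_in = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W_in + b_in)          # nonlinear neuron activations
    # Linear inter-layer propagation: solve H @ W_out ~= Y by SVD-based
    # least squares, so no iterative weight updates are needed.
    W_out, *_ = np.linalg.lstsq(H, Y, rcond=None)
    return W_in, b_in, W_out

def predict(X, W_in, b_in, W_out):
    return np.tanh(X @ W_in + b_in) @ W_out

# Toy usage: learn y = sin(x) from 200 samples in one pass.
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
Y = np.sin(X)
params = fit_single_pass(X, Y)
print("max abs error:", np.abs(predict(X, *params) - Y).max())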

Physical Description

10 Pages

Source

  • 5th Conference on Information Systems Analysis and Synthesis (ISAS '99) / 3rd Conference on Systemics, Cybernetics, and Informatics (SCI '99), Orlando, FL, July 31-August 4, 1999

Identifier

Unique identifying numbers for this article in the Digital Library or other systems.

  • Other: DE00006257
  • Report No.: ORNL/CP-102700
  • Grant Number: AC05-96OR22464
  • Office of Scientific & Technical Information Report Number: 6257
  • Archival Resource Key: ark:/67531/metadc694628

Collections

This article is part of the following collection of related materials.

Office of Scientific & Technical Information Technical Reports


When

Dates and time periods associated with this article.

Creation Date

  • July 31, 1999

Added to The UNT Digital Library

  • Aug. 14, 2015, 8:43 a.m.

Description Last Updated

  • June 10, 2016, 4:54 p.m.


Citations, Rights, Re-Use

Barhen, J.; Cogswell, R. & Protopopescu, V. Single-Iteration Learning Algorithm for Feed-Forward Neural Networks, article, July 31, 1999; Oak Ridge, Tennessee. (digital.library.unt.edu/ark:/67531/metadc694628/: accessed September 24, 2017), University of North Texas Libraries, Digital Library, digital.library.unt.edu; crediting UNT Libraries Government Documents Department.