Introduction to NeuNet Pro
Table Of Contents

  • Introduction
  • What is Neural Net?
  • Suggested Uses
  • SFAM Classification
  • Back Propagation
  • NeuNet Overview

Back Propagation Neural Networks

Back Propagation (BackProp) is the most common type of neural network. The algorithm was developed and popularized by Rumelhart, Hinton and Williams in 1986, following earlier work by Parker, LeCun, Werbos and Rosenblatt.

BackProp makes its predictions as numeric values, not as class names. It is well suited for predicting continuous numerical values such as prices, weights and times. BackProp can also be used for classification problems, where each class is assigned a numeric value.
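
As a concrete illustration of that last point, here is a minimal sketch assuming a network whose single output is a number: class names are mapped to numeric target values for training, and the network's numeric output is decoded back to the nearest class. The class names and code values below are invented for the example and are not NeuNet Pro's own encoding.

```python
# Hypothetical sketch: classification with a numeric-output network.
# Each class name is assigned a numeric value; a prediction is decoded
# back to the class whose value is closest to the network's output.
CLASS_CODES = {"low": 0.0, "medium": 0.5, "high": 1.0}

def encode(label: str) -> float:
    """Numeric training target for a class name."""
    return CLASS_CODES[label]

def decode(prediction: float) -> str:
    """Class whose assigned value is closest to the network's output."""
    return min(CLASS_CODES, key=lambda c: abs(CLASS_CODES[c] - prediction))

print(encode("medium"))   # 0.5
print(decode(0.43))       # "medium"
```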

The following points reflect our thoughts after several years' experience using BackProp.

BackProp Strengths:

  • Training is very tolerant of anomalies and noisy data.
  • Interpolation between data points is excellent.

BackProp Weaknesses:

  • Training can be quite slow, requiring thousands of cycles through the training data.
  • The user is required to make some configuration decisions prior to beginning training.
  • A certain amount of user intervention is required during the training process.
  • A large amount of training data may be required to discourage the discovery of spurious correlations, especially in data that is noisy or has many inputs.

A Typical BackProp Neural Network (Used For Predicting Values)


Training begins with all weights set to random numbers. For each data record, the predicted value is compared to the desired (actual) value, and the weights are adjusted to move the prediction closer to the desired value. Many cycles are made through the entire set of training data, with the weights continually adjusted to produce more accurate predictions.
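
The sketch below illustrates this loop with a tiny one-hidden-layer network written in NumPy. It shows the idea only and is not NeuNet Pro's implementation; the layer sizes, learning rate, epoch count and toy training data are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy training records: predict the value y = 0.3*x1 + 0.5*x2.
X = rng.uniform(0.0, 1.0, size=(50, 2))
y = 0.3 * X[:, :1] + 0.5 * X[:, 1:]

# Training begins with all weights set to random numbers.
W1 = rng.uniform(-1.0, 1.0, size=(2, 4))   # input -> hidden weights
b1 = np.zeros((1, 4))
W2 = rng.uniform(-1.0, 1.0, size=(4, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))

lr = 0.1
for epoch in range(5000):                  # many cycles through the data
    hidden = sigmoid(X @ W1 + b1)          # forward pass
    predicted = hidden @ W2 + b2           # linear output for a numeric value
    error = y - predicted                  # desired minus predicted
    # Back-propagate: nudge every weight toward a smaller error.
    d_out = error
    d_hid = (d_out @ W2.T) * hidden * (1.0 - hidden)
    W2 += lr * hidden.T @ d_out / len(X)
    b2 += lr * d_out.mean(axis=0, keepdims=True)
    W1 += lr * X.T @ d_hid / len(X)
    b1 += lr * d_hid.mean(axis=0, keepdims=True)

print(float(np.abs(error).mean()))         # mean error shrinks as training runs
```

Note that the network size and learning rate used here are exactly the kind of configuration decisions, mentioned above, that the user must make before training begins.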



