Example: A historical set of loan applications, together with the known success or failure of each applicant to repay the loan, provides a set of input parameters and presumed known outputs.

In one type, the difference between the desired and actual outputs is used to calculate corrections to the weights of the neural network. A variation of this approach simply tells the network, for each input trial, whether the output is correct; the network then adjusts its weights in an attempt to achieve correct results. Examples of this type of learning are backpropagation and the Hopfield network.
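To make the error-correction idea concrete, the following sketch trains a single processing element on a hypothetical loan-repayment data set. The feature values, outcomes, and learning rate are illustrative assumptions, not data from the text; the point is only that the difference between the desired and actual output drives the weight corrections.

# Minimal sketch of supervised (error-correction) learning.
# The loan data below is hypothetical: each applicant is described by
# two normalized input features (income, debt ratio) and a known
# outcome (1 = repaid, 0 = defaulted).
training_set = [
    ((0.9, 0.1), 1),
    ((0.8, 0.3), 1),
    ((0.3, 0.8), 0),
    ((0.2, 0.9), 0),
]

weights = [0.0, 0.0]
bias = 0.0
learning_rate = 0.1

def predict(inputs):
    """Weighted sum of the inputs passed through a step (threshold) function."""
    total = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= 0 else 0

for epoch in range(20):
    for inputs, desired in training_set:
        actual = predict(inputs)
        error = desired - actual              # difference between desired and actual output
        # Each weight correction is proportional to the error and the corresponding input.
        weights = [w + learning_rate * error * x for w, x in zip(weights, inputs)]
        bias += learning_rate * error

print(weights, bias)
print([predict(x) for x, _ in training_set])  # compare with the desired outputs

After training, the adjusted weights and bias encode the mapping from applicant features to the repayment decision for this toy data set.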
In unsupervised learning, only input stimuli are shown to the network. The network is self-organizing; that is, it organizes itself internally so that each hidden processing element responds strategically to a different set of input stimuli (or groups of stimuli). No knowledge is supplied about which classifications (i.e., outputs) are correct, and those that the network derives may or may not be meaningful to the network developer (this is useful for cluster analysis). However, by setting model parameters, we can control the number of categories into which a network classifies the inputs. Regardless, a human must examine the final categories to assign meaning and determine the usefulness of the results.

Examples of this type of learning are Adaptive Resonance Theory (ART) (i.e., a neural network architecture that is aimed at being brain-like in unsupervised mode) and Kohonen self-organizing feature maps (i.e., neural network models for machine learning).
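A minimal, Kohonen-style competitive-learning sketch is given below. The input points, the number of output units (categories), and the learning rate are illustrative assumptions; the network receives no correct classifications and simply organizes itself so that different units respond to different groups of stimuli.

import math, random

# Minimal sketch of unsupervised, competitive (Kohonen-style) learning.
# The two-dimensional inputs are hypothetical; the network is told nothing
# about correct categories, only how many output units (clusters) to use.
random.seed(0)
inputs = [(0.1, 0.2), (0.15, 0.25), (0.8, 0.9), (0.85, 0.8), (0.5, 0.1)]
n_clusters = 2                      # model parameter controlling the number of categories
weights = [[random.random(), random.random()] for _ in range(n_clusters)]
learning_rate = 0.3

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

for epoch in range(50):
    for x in inputs:
        # The output unit whose weight vector is closest to the input "wins".
        winner = min(range(n_clusters), key=lambda j: distance(weights[j], x))
        # Only the winner's weights move toward the input (self-organization).
        weights[winner] = [w + learning_rate * (xi - w)
                           for w, xi in zip(weights[winner], x)]

# A human still has to inspect the resulting categories to decide what they mean.
for x in inputs:
    winner = min(range(n_clusters), key=lambda j: distance(weights[j], x))
    print(x, "-> category", winner)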
The backpropagation learning algorithm is the standard way of implementing supervised training of feedforward neural networks. It is an iterative gradient-descent technique designed to minimize an error function between the actual output of the network and its desired output, as specified in the training set of data. Adjustment of the interconnection weights, which contain the mapping function per se, starts at the output node, where the error measure is initially calculated, and is then propagated back through the layers of the network toward the input layer.
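The sketch below illustrates this procedure on a small 2-2-1 feedforward network trained on the XOR mapping, a standard illustration rather than an example from the text. The squared error is computed at the output node and the resulting corrections are propagated back toward the input layer by gradient descent; the initial weights, learning rate, and epoch count are illustrative assumptions.

import math, random

# Minimal sketch of backpropagation for a 2-2-1 feedforward network:
# iterative gradient descent on the squared error between the actual
# and desired outputs over the training set.
random.seed(1)
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# The interconnection weights (and biases) hold the mapping function.
w_hidden = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b_hidden = [random.uniform(-1, 1) for _ in range(2)]
w_out = [random.uniform(-1, 1) for _ in range(2)]
b_out = random.uniform(-1, 1)
lr = 0.5

for epoch in range(10000):
    for (x, target) in data:
        # Forward pass: input layer -> hidden layer -> output node.
        hidden = [sigmoid(sum(w * xi for w, xi in zip(w_hidden[j], x)) + b_hidden[j])
                  for j in range(2)]
        out = sigmoid(sum(w * h for w, h in zip(w_out, hidden)) + b_out)

        # The error measure is calculated at the output node first ...
        delta_out = (out - target) * out * (1 - out)
        # ... and then propagated back toward the input layer.
        delta_hidden = [delta_out * w_out[j] * hidden[j] * (1 - hidden[j])
                        for j in range(2)]

        # Gradient-descent adjustment of the interconnection weights.
        for j in range(2):
            w_out[j] -= lr * delta_out * hidden[j]
            for i in range(2):
                w_hidden[j][i] -= lr * delta_hidden[j] * x[i]
            b_hidden[j] -= lr * delta_hidden[j]
        b_out -= lr * delta_out

# Print the trained network's outputs next to the desired outputs.
for (x, target) in data:
    hidden = [sigmoid(sum(w * xi for w, xi in zip(w_hidden[j], x)) + b_hidden[j])
              for j in range(2)]
    out = sigmoid(sum(w * h for w, h in zip(w_out, hidden)) + b_out)
    print(x, round(out, 2), "desired:", target)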

Self Assessment

Fill in the blanks:

1. ................... represent a brain metaphor for information processing.
2. ANNs are composed of interconnected, simple processing elements called ...................
3. Neurons are partitioned into groups called ...................
4. A ................... is able to increase or decrease the strength of the connection from neuron to neuron.
5. The ................... propagation paradigm allows all neurons to link the output in one layer to the input of the next layer, but it does not allow any feedback linkage.
6. A ................... is a layer of neurons that takes input from the previous layer and converts those inputs into outputs for further processing.
7. The ................... of a network contain the solution to a problem.
8. ................... express the relative strength of the input data or the many connections that transfer data from layer to layer.
9. The ................... function computes the weighted sums of all the input elements entering each processing element.




