Page 190 - DCAP208_Management Support Systems

Unit 11: Neural Networks




          nonlinear. The relationship is expressed by one of several types of transformation (transfer)
          functions. The transformation (transfer) function takes the combined (i.e., summed) inputs
          coming into a neuron from other neurons or sources and produces an output based on the
          choice of transfer function. The selection of a specific function therefore affects the network's
          operation. The sigmoid (logistic activation) function, or sigmoid transfer function, is an
          S-shaped transfer function with a range of 0 to 1, and it is a popular as well as useful
          nonlinear transfer function:

          YT = 1/(1 + e^(-Y))

          where YT is the transformed (i.e., normalized) value of Y (see Figure 11.5).
                                 Figure 11.5: Example of ANN Functions

          [Figure omitted]

          Source: http://www70.homepage.villanova.edu/matthew.liberatore/Mgt2206/turban_online_ch06.pdf
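          The sigmoid transfer function above can be sketched in a few lines of code; the sample
          input values below are illustrative only:

          ```python
          import math

          def sigmoid(y):
              """Sigmoid (logistic) transfer function: maps any summed
              neuron input Y into the range (0, 1)."""
              return 1.0 / (1.0 + math.exp(-y))

          # Transformed (normalized) value YT for a few summed inputs Y
          for y in (-4.0, 0.0, 4.0):
              print(f"Y = {y:+.1f} -> YT = {sigmoid(y):.3f}")
          ```

          Note the S-shape: large negative inputs map close to 0, large positive inputs map close
          to 1, and an input of 0 maps to exactly 0.5.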
          The transformation modifies the output levels to reasonable values (typically between 0
          and 1). This transformation is performed before the output reaches the next level. Without
          it, the output value can grow very large, especially when there are several layers of
          neurons. Sometimes a threshold value is used instead of a transformation function.
          A threshold value is a hurdle that a neuron's output must clear in order to trigger the
          next level of neurons: if the output value is smaller than the threshold, it is not passed
          to the next level of neurons.


                 Example: Any value of 0.5 or less becomes 0, and any value above 0.5 becomes 1.
          A transformation can occur at the output of each processing element, or it can be performed only
          at the final output nodes.
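          The threshold rule in the example above can be expressed directly; the cutoff of 0.5 comes
          from the text, and the function name is ours:

          ```python
          def threshold(y, t=0.5):
              """Hurdle (threshold) function: pass 1 to the next level only
              when the neuron's output exceeds the threshold t; otherwise 0."""
              return 1 if y > t else 0

          print(threshold(0.5))   # at or below the threshold -> 0
          print(threshold(0.51))  # above the threshold -> 1
          ```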

          Hidden Layers: Complex practical applications require one or more hidden layers between
          the input and output neurons, and a correspondingly large number of weights. Many
          commercial ANNs include three and sometimes up to five layers, each containing 10 to
          1,000 processing elements; some experimental ANNs use millions of processing elements.
          Because each additional layer increases the training effort exponentially and adds to the
          computation required, more than three hidden layers are rare in most commercial systems.
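          A minimal sketch of a forward pass through such a layered network, assuming sigmoid
          transformation at every layer (the weights below are arbitrary illustrative values):

          ```python
          import math

          def sigmoid(y):
              return 1.0 / (1.0 + math.exp(-y))

          def forward(inputs, layers):
              """Propagate inputs through a list of weight layers; each layer
              is a list of per-neuron weight vectors. The sigmoid transform is
              applied at every layer, so outputs stay in (0, 1)."""
              signal = inputs
              for weights in layers:
                  signal = [sigmoid(sum(w * x for w, x in zip(neuron_w, signal)))
                            for neuron_w in weights]
              return signal

          # Hypothetical 2-input, 2-hidden-neuron, 1-output network
          hidden = [[0.5, -0.5], [0.3, 0.8]]   # weights into the two hidden neurons
          output = [[1.0, -1.0]]               # weights into the single output neuron
          print(forward([1.0, 0.0], [hidden, output]))
          ```

          Each extra hidden layer adds another full set of weight vectors to this list, which is
          why the training effort grows so quickly with depth.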




              Task  Compare and contrast summation function and transformation function.

          11.1.5 Architecture

          There are several effective neural network models and algorithms. Some of the most
          common are backpropagation (applied to feedforward networks), associative memory, and
          the recurrent network.






                                           LOVELY PROFESSIONAL UNIVERSITY                                   183