

Credit Scoring Model Based on Back Propagation Neural Network Using Various Activation and Error Function


Mulhim Al Doori, Bassam Beyrouti


Vol. 14  No. 3  pp. 16-24


The Back Propagation algorithm is a widely used learning technique for training a multi-layered perceptron network. The algorithm propagates the error from the outputs back to the inputs and gradually fine-tunes the network weights to minimize the sum of errors using the gradient descent technique. Activation functions are employed at each neuron to introduce non-linearity into the network. In this paper, an attempt has been made to assess and compare the results obtained using combinations of activation and error functions applied differently to the hidden and output layers of the network. Sigmoid, Hyperbolic Tangent, and Gaussian are the activation functions under study. Furthermore, error functions such as the Mean Squared Error, Huber, and the Complex Sine-Hyperbolic have been considered.
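To make the training procedure described above concrete, the following is a minimal sketch (not the authors' implementation) of one back-propagation step for a network with a single hidden layer. It assumes sigmoid or hyperbolic-tangent activation and the Mean Squared Error function; the paper's Gaussian activation and Huber or Complex Sine-Hyperbolic error functions would follow the same pattern with different derivative terms.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Derivatives written in terms of the neuron's output, which is why
# sigmoid and tanh are convenient choices for this sketch.
def sigmoid_deriv(y):
    return y * (1.0 - y)

def tanh_deriv(y):
    return 1.0 - y * y

ACTIVATIONS = {
    "sigmoid": (sigmoid, sigmoid_deriv),
    "tanh": (math.tanh, tanh_deriv),
}

def train_step(w_hidden, w_out, x, target, act="sigmoid", lr=0.5):
    """One gradient-descent update; returns new weights and the MSE loss."""
    f, df = ACTIVATIONS[act]
    # Forward pass: hidden layer, then a single output neuron.
    h = [f(sum(wi * xi for wi, xi in zip(row, x))) for row in w_hidden]
    o = f(sum(wo * hi for wo, hi in zip(w_out, h)))
    # Backward pass: error E = 0.5 * (target - o)^2 propagated to inputs.
    delta_o = (o - target) * df(o)
    new_w_out = [wo - lr * delta_o * hi for wo, hi in zip(w_out, h)]
    delta_h = [delta_o * wo * df(hi) for wo, hi in zip(w_out, h)]
    new_w_hidden = [[wij - lr * dh * xi for wij, xi in zip(row, x)]
                    for row, dh in zip(w_hidden, delta_h)]
    return new_w_hidden, new_w_out, 0.5 * (target - o) ** 2
```

Repeated calls to `train_step` on a training sample drive the error down; swapping the entry in `ACTIVATIONS` is all that is needed to compare activation functions, which mirrors the comparison carried out in the paper.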


Artificial Neural Network, Back Propagation Algorithm, Activation Function, Error Function