INTELLIGENT WIDGET RECONFIGURATION FOR MOBILE PHONES: MINIMAL INTELLIGENCE ALGORITHM (2)

There are a number of error-correction algorithms that can be used with the MLP, including Back Propagation, the Delta rule and the Perceptron learning rule. Alsmadi et al. examined these three algorithms and found that Back Propagation gave the best results with the MLP, as it is designed to reduce the error between the actual output and the desired output by gradient descent.
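
To illustrate this gradient-descent behaviour, the sketch below trains a one-hidden-layer MLP by Back Propagation on the XOR problem. It is a minimal sketch, not the implementation used in this work: the layer sizes, learning rate, toy task and all names are illustrative assumptions.

```python
# Minimal one-hidden-layer MLP trained by back propagation: gradient descent
# on the squared error between actual and desired output. Illustrative only;
# layer sizes, learning rate and the XOR task are assumptions, not taken
# from the paper.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR as a toy task: 2 inputs, 1 output.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

n_in, n_hidden, n_out = 2, 4, 1
W1 = rng.normal(0, 1, (n_in, n_hidden))   # input -> hidden weights
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 1, (n_hidden, n_out))  # hidden -> output weights
b2 = np.zeros(n_out)
lr = 0.5                                  # learning rate (assumed)

for epoch in range(10000):
    # Forward pass.
    H = sigmoid(X @ W1 + b1)   # hidden-layer activations
    Y = sigmoid(H @ W2 + b2)   # network output

    # Backward pass: propagate the output error toward the input and
    # move each weight down the gradient of the squared error.
    err_out = (Y - T) * Y * (1 - Y)           # delta at the output layer
    err_hid = (err_out @ W2.T) * H * (1 - H)  # delta at the hidden layer
    W2 -= lr * H.T @ err_out
    b2 -= lr * err_out.sum(axis=0)
    W1 -= lr * X.T @ err_hid
    b1 -= lr * err_hid.sum(axis=0)

print(np.round(Y, 3))  # should approach [0, 1, 1, 0]
```

The delta terms here are exactly the "error between the actual output and the desired output" of the text, scaled by the sigmoid derivative and propagated backwards layer by layer.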

Besides the error-correction algorithm, other parameters may affect the performance of the MLP. The number of hidden layers and the number of hidden neurons in each layer influence both the performance of the network and the accuracy of its results. Much research has been done in this area, but no single method for selecting these parameters has been found to suit all problems. Bishop states that an MLP with one hidden layer is sufficient to approximate any mapping to arbitrary accuracy, provided it has a sufficiently large number of hidden neurons. However, the optimal number of hidden neurons differs from network to network, and there is currently no fixed answer for either the number of hidden layers or the number of hidden neurons to be used. When significant processing steps must be applied to the inputs before the outputs are obtained, multiple hidden layers may be beneficial. Zurada stated that the number of hidden neurons depends on the dimension n of the input vector and on the number M of separable disjoint regions in the n-dimensional Euclidean input space, and that M, n and J (the number of hidden neurons) are related by
M(J, n) = C(J, 0) + C(J, 1) + ... + C(J, n)

where C(J, k) is the binomial coefficient J! / (k! (J - k)!), taken to be zero for J < k.
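
To make the relation concrete, the sketch below evaluates M(J, n) and inverts it to find the smallest number of hidden neurons J that yields at least a required number of regions. The function names and the worked example are illustrative assumptions, not drawn from the paper.

```python
# Zurada's relation between the input dimension n, the number M of separable
# disjoint regions, and the number J of hidden neurons:
#     M(J, n) = sum_{k=0}^{n} C(J, k),  with C(J, k) = 0 for J < k.
from math import comb  # math.comb(J, k) returns 0 when k > J

def max_regions(J: int, n: int) -> int:
    """Maximum number of disjoint regions that J hidden-neuron
    hyperplanes can carve out of n-dimensional input space."""
    return sum(comb(J, k) for k in range(n + 1))

def min_hidden_neurons(M: int, n: int) -> int:
    """Smallest J with M(J, n) >= M (assumed helper, for illustration)."""
    J = 0
    while max_regions(J, n) < M:
        J += 1
    return J

# Example: to separate M = 20 regions in n = 3 dimensions,
# M(4, 3) = 1+4+6+4 = 15 is too few, while M(5, 3) = 1+5+10+10 = 26 suffices.
print(min_hidden_neurons(20, 3))  # -> 5
```

This gives a lower bound on J for a given problem; in practice the number of hidden neurons is still tuned empirically, as the preceding paragraph notes.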