ERROR BACK PROPAGATION ALGORITHM


ERROR BACK PROPAGATION ALGORITHM (FREQUENTLY ASKED QUESTIONS)

1. Describe the term “local minima” in the Error Back Propagation Algorithm.

The Error Back Propagation Algorithm trains a network by gradient descent: after each pass it measures the output error and moves every weight a small step against the error gradient, i.e. in the direction that reduces the error. A local minimum (plural “minima”) is a point on the error surface where the error is lower than at all neighbouring weight settings, so the gradient there is zero and the weight updates stop making progress. The lowest point over the whole surface is the global minimum, and a local minimum may sit well above it. Because the algorithm only ever follows the local slope, it can settle into a local minimum and never reach the global one. We cannot always avoid this; in practice, restarting training from different random initial weights is a common remedy.
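To make the idea concrete, here is a minimal sketch, in plain Python with no libraries, of gradient descent on a one-dimensional error curve that has two minima. The curve, the learning rate, and the starting points are illustrative choices for this example, not part of the algorithm itself.

# A toy error curve E(w) = w^4 - 3w^2 + w, which has two minima:
# a shallow one near w = 1.13 and a deeper (global) one near w = -1.30.

def error(w):
    return w**4 - 3 * w**2 + w

def gradient(w):
    # dE/dw, derived by hand for this toy curve.
    return 4 * w**3 - 6 * w + 1

def descend(w, learning_rate=0.01, steps=1000):
    # Repeatedly step against the gradient: w <- w - eta * dE/dw.
    for _ in range(steps):
        w -= learning_rate * gradient(w)
    return w

print(descend(2.0))   # settles near w =  1.13, the shallower LOCAL minimum
print(descend(-2.0))  # settles near w = -1.30, the deeper GLOBAL minimum

Both runs follow exactly the same rule; only the starting point differs. The run that starts at w = 2.0 gets trapped in the shallower valley, which is precisely the local minima problem described above.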

2. Explain briefly the Error Back Propagation algorithm.

It is an error-reducing algorithm used to train artificial neural networks. Artificial neural networks are computing structures loosely modelled on the human nervous system. A network has a well-defined set of inputs and outputs, and training teaches it to capture the complex relationship between them. The name reflects its biological inspiration: the human nervous system is itself an enormously complex network.

Each training step of the algorithm has two phases:

1. Forward propagation, in which the inputs are fed through the network to produce an output.

2. Back propagation, in which the error between that output and the desired output is passed back through the network.

Our concentration here is on the back propagation phase. The traversal runs from the output nodes back towards the input nodes, which is why it is called back propagation. At each layer the algorithm works out, via the chain rule, how much each weight contributed to the output error (the weight’s gradient) and nudges that weight in the direction that reduces the error. Repeated over many training examples, these updates drive the network downhill on the error surface until it settles in a minimum. When the updates stop making progress the network has reached a local minimum; if that happens to be the lowest point on the whole surface it is also the global minimum, but as explained in question 1, gradient descent cannot guarantee this. At that point the propagation process comes to an end. A minimal worked sketch follows.
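The sketch below shows one complete training loop in Python with NumPy: a forward pass, a backward pass that propagates the output error through the layers, and the gradient-descent weight updates. The tiny 2-2-1 architecture, the XOR training set, the learning rate, and the epoch count are illustrative assumptions for this example, not a definitive implementation.

# A minimal back propagation sketch: a 2-2-1 sigmoid network learning XOR.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Training data: the four XOR input patterns and their targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Randomly initialised weights and biases for the hidden and output layers.
W1 = rng.normal(size=(2, 2)); b1 = np.zeros((1, 2))
W2 = rng.normal(size=(2, 1)); b2 = np.zeros((1, 1))

eta = 0.5  # learning rate
for epoch in range(10000):
    # Forward propagation: inputs -> hidden layer -> output.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Back propagation: pass the output error back through the layers,
    # using the chain rule to get each layer's error signal (delta).
    err = out - y                        # derivative of squared error w.r.t. out
    d_out = err * out * (1 - out)        # delta at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)   # delta at the hidden layer

    # Gradient-descent updates: w <- w - eta * dE/dw.
    W2 -= eta * h.T @ d_out; b2 -= eta * d_out.sum(axis=0, keepdims=True)
    W1 -= eta * X.T @ d_h;   b1 -= eta * d_h.sum(axis=0, keepdims=True)

print(out.round(3))  # typically approaches [0, 1, 1, 0]

Note that with so few hidden units a run can occasionally settle in a local minimum, leaving the outputs stuck near 0.5; restarting with a different random seed, exactly as discussed in question 1, usually resolves it.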
