Why backpropagation is not enough

Backpropagation is a widely used algorithm for training artificial neural networks, but it is not a complete solution on its own. There are several reasons why backpropagation is not enough:

  1. Local minima: Because backpropagation follows the gradient of the error, it can get stuck in a local minimum, a point where the error is not zero but cannot be reduced any further by small adjustments to the weights. As a result, the global minimum, the lowest possible error, may never be reached (a small numerical sketch follows below).
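
The following is a minimal sketch, not from the original article, of how plain gradient descent can settle in a local minimum: a toy one-dimensional loss with two minima, where the starting point alone decides which one the algorithm ends up in. The function and all constants are made up purely for illustration.

```python
import numpy as np

# Toy 1-D "loss" with a shallow local minimum near x ~ -1.4
# and a deeper global minimum near x ~ 1.7 (illustrative only).
def loss(x):
    return 0.1 * x**4 - 0.5 * x**2 - 0.3 * x

def grad(x):
    return 0.4 * x**3 - 1.0 * x - 0.3

def gradient_descent(x0, lr=0.05, steps=500):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Starting on the left, gradient descent settles in the shallow local
# minimum; starting on the right, it finds the deeper global one.
for start in (-2.0, 2.0):
    x_final = gradient_descent(start)
    print(f"start={start:+.1f}  ends at x={x_final:+.3f}  loss={loss(x_final):.3f}")
```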

To address these issues, researchers have developed various techniques such as weight initialization, batch normalization, and optimizers like Adam, which can improve the performance of backpropagation and make it more effective for training neural networks.
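
As a rough illustration of how those remedies fit together, here is a minimal PyTorch sketch; the layer sizes, learning rate, and random batch are illustrative assumptions rather than anything from the article. It combines He-style weight initialization, a batch-normalization layer, and the Adam optimizer around an ordinary backpropagation step.

```python
import torch
import torch.nn as nn

# Small feed-forward classifier (sizes are made up for illustration).
model = nn.Sequential(
    nn.Linear(20, 64),
    nn.BatchNorm1d(64),   # batch normalization stabilises activations
    nn.ReLU(),
    nn.Linear(64, 2),
)

# Careful weight initialization (Kaiming/He init suits ReLU layers).
for layer in model:
    if isinstance(layer, nn.Linear):
        nn.init.kaiming_uniform_(layer.weight, nonlinearity="relu")
        nn.init.zeros_(layer.bias)

# Adam: an adaptive optimizer that adjusts per-parameter step sizes.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One training step on a random batch, just to show the loop structure.
x, y = torch.randn(32, 20), torch.randint(0, 2, (32,))
optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()       # backpropagation computes the gradients...
optimizer.step()      # ...and Adam turns them into a weight update
```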

There are several alternative algorithms for training artificial neural networks, including:

  1. Nelder-Mead: an optimization algorithm that searches the parameter space with a simplex of candidate points rather than following a gradient. Because it never computes gradients it sidesteps backpropagation entirely, but it can be slower to converge than gradient-based algorithms (a minimal sketch follows below).
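
As a deliberately tiny example of the idea, here is a sketch that fits a single sigmoid neuron to synthetic data with SciPy's Nelder-Mead implementation; the data, model, and loss are assumptions made purely for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Fit y = sigmoid(w*x + b) to toy data without ever computing a gradient.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = (x > 0.5).astype(float)          # synthetic labels

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mse(params):
    w, b = params
    pred = sigmoid(w * x + b)
    return np.mean((pred - y) ** 2)

# Nelder-Mead only evaluates the loss; no derivatives are needed.
result = minimize(mse, x0=[0.0, 0.0], method="Nelder-Mead")
print(result.x, result.fun)          # fitted (w, b) and the final error
```

Because the simplex search only ever evaluates the loss, it tends to scale poorly once a model has more than a handful of weights, which is one reason gradient-based training remains the default for full-size networks.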
