
Backward Propagation in Artificial Neural Networks

Backpropagation is the workhorse of neural network training. It is a method for adjusting the network’s weights based on the error measured in the previous iteration. Properly adjusting the weights reduces the error and improves the model’s ability to generalize. “Backpropagation” is short for “backward propagation of errors”, and it is the standard method for training artificial neural networks. It computes the gradient of the loss function with respect to every weight in the network. The neural network backpropagation algorithms use…
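To make the idea concrete, here is a minimal sketch of backpropagation for a single sigmoid neuron in plain Python. The input, target, initial weights, learning rate, and step count are all made-up values for illustration; a real network would apply the same chain-rule logic layer by layer.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy example: one input, one target (hypothetical values)
x, target = 2.0, 1.0
w, b = 0.5, 0.0   # initial weight and bias (assumed)
lr = 0.5          # learning rate (assumed)

for step in range(100):
    # Forward pass: compute the prediction and the error
    z = w * x + b
    y = sigmoid(z)
    loss = 0.5 * (y - target) ** 2

    # Backward pass: chain rule, propagating the error back to the weights
    dloss_dy = y - target          # derivative of the loss w.r.t. the output
    dy_dz = y * (1.0 - y)          # derivative of the sigmoid
    grad_w = dloss_dy * dy_dz * x  # gradient w.r.t. the weight
    grad_b = dloss_dy * dy_dz      # gradient w.r.t. the bias

    # Gradient-descent update: step against the gradient
    w -= lr * grad_w
    b -= lr * grad_b
```

Each iteration runs a forward pass to measure the error, then “propagates” that error backward through the chain rule to obtain a gradient for every parameter, and finally nudges the parameters in the direction that reduces the loss.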

Gradient Descent (now with a little bit of scary maths)

Buckle up, Buckaroo, because Gradient Descent is going to be a long one (and a tricky one too). This article is a lot more “mathy” than most, as it covers the concepts behind a Machine Learning algorithm called Linear Regression. If you don’t know what Linear Regression is, go through this article once first. It would help…
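As a preview of where this is headed, here is a small sketch of gradient descent fitting a straight line y = m·x + c by minimizing mean squared error. The data points, learning rate, and iteration count are invented for illustration only.

```python
# Toy data generated from y = 2x + 1 (made-up for this sketch)
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]

m, c = 0.0, 0.0   # start with a flat line
lr = 0.05         # learning rate (assumed)
n = len(xs)

for step in range(2000):
    # Gradients of MSE = (1/n) * sum((m*x + c - y)^2) w.r.t. m and c
    grad_m = (2.0 / n) * sum((m * x + c - y) * x for x, y in zip(xs, ys))
    grad_c = (2.0 / n) * sum((m * x + c - y) for x, y in zip(xs, ys))

    # Step downhill along the loss surface
    m -= lr * grad_m
    c -= lr * grad_c
```

After enough iterations, m and c settle close to the slope and intercept that generated the data. The rest of the article unpacks the math behind why these gradient formulas point “downhill”.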