Gradient Descent Explained

Gradient descent is an optimization algorithm commonly used to train machine learning models and neural networks. The model learns from training data over time, and the cost function within gradient descent acts as a barometer, gauging the model's accuracy with each iteration of parameter updates.
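
To make the idea concrete, here is a minimal sketch of gradient descent fitting a one-variable linear model. The toy data, learning rate, and step count are illustrative choices, not part of the original explanation; the cost function here is mean squared error.

```python
import numpy as np

# Toy data: y = 2x + 1 plus a little noise (values chosen for illustration).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.05, size=x.shape)

# Parameters of the model y_hat = w * x + b, initialized arbitrarily.
w, b = 0.0, 0.0
learning_rate = 0.1

for step in range(1000):
    y_hat = w * x + b
    error = y_hat - y

    # Cost function: mean squared error, the "barometer" gauging accuracy.
    cost = np.mean(error ** 2)

    # Gradients of the cost with respect to each parameter.
    grad_w = 2.0 * np.mean(error * x)
    grad_b = 2.0 * np.mean(error)

    # Update each parameter by a small step against its gradient.
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"w = {w:.3f}, b = {b:.3f}, final cost = {cost:.5f}")
```

Run as-is, the loop drives the cost toward zero and the parameters toward the values used to generate the data, which is the behavior the paragraph above describes: each iteration measures accuracy via the cost function and nudges the parameters in the direction that reduces it.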