Gradient descent is an optimization algorithm commonly used to train machine learning models and neural networks. These models learn over time from training data, and the cost function within gradient descent acts as a barometer, gauging the model's accuracy with each iteration of parameter updates. IBM Master Inventor Martin Keen explains.
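The idea above can be sketched in a few lines of Python. This is a minimal illustration, not IBM's implementation: the quadratic cost function, learning rate, and iteration count are all assumptions chosen for clarity. The cost shrinks with each parameter update, playing the "barometer" role described above.

```python
# Minimal sketch of gradient descent minimizing a simple quadratic
# cost f(w) = (w - 3)^2. The cost, learning rate, and starting point
# are illustrative choices, not from the original text.

def cost(w):
    return (w - 3.0) ** 2

def gradient(w):
    # Analytic derivative of the cost: d/dw (w - 3)^2 = 2(w - 3)
    return 2.0 * (w - 3.0)

def gradient_descent(w0, learning_rate=0.1, iterations=100):
    w = w0
    for _ in range(iterations):
        # Each iteration steps the parameter against the gradient;
        # the falling cost gauges progress, as described above.
        w -= learning_rate * gradient(w)
    return w

w_final = gradient_descent(w0=0.0)
print(round(w_final, 4))  # converges toward the minimum at w = 3
```

Real models repeat this same update over many parameters at once, with the gradient computed from the training data rather than a closed-form derivative.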