Gradient descent is an optimization algorithm that minimizes a loss function by iteratively adjusting model parameters in the direction opposite the gradient. It is widely used to train machine learning and deep learning models.
Example:
A neural network uses gradient descent to reduce its prediction error over successive training epochs.
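The update rule can be illustrated with a minimal sketch; here the loss function, learning rate, and step count are arbitrary choices for demonstration, not taken from any particular model:

```python
def gradient_descent(grad, w0, lr=0.1, steps=100):
    """Repeatedly step against the gradient to reduce the loss."""
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)  # core update: w <- w - lr * dL/dw
    return w

# Toy loss f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
# The minimizer is w = 3, and the iterates converge toward it.
w_min = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
```

In a real neural network, `w` is a vector of weights and the gradient is computed by backpropagation over the training data, but the update rule is the same.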