Which of the following best describes "Batch" Gradient Descent (BGD or GD)?

a) Updates the model parameters after each training example
b) Processes the entire training dataset in each iteration
c) Only works for online learning scenarios
d) Performs gradient descent on mini-batches of data
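Batch gradient descent is defined by computing the gradient over the entire training set before making a single parameter update, as option (b) states. A minimal illustrative sketch (the dataset, learning rate, and variable names are made up for this example) fitting y = w·x with mean-squared-error loss:

```python
# Illustrative sketch: batch gradient descent for simple linear
# regression y = w * x. Each iteration computes the gradient over
# the ENTIRE training set before one parameter update.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # generated with true w = 2

w = 0.0    # initial parameter
lr = 0.02  # learning rate

for _ in range(200):
    # Gradient of mean squared error, averaged over the full dataset
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad  # exactly one update per full pass

print(round(w, 3))  # converges toward 2.0
```

Contrast this with stochastic gradient descent (option a), which would update `w` inside the `zip` loop after every single example, and mini-batch gradient descent (option d), which would update after small slices of the data.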