Mini-Batch and Gradient Descent
1. Creating batches
def make_batches(batch_size):
    # Yield successive (X_batch, y_batch) pairs of size batch_size
    iterator = 0
    while iterator < training_size:
        # Clamp the last batch so the indices stay inside the dataset
        temp_batch = range(iterator, min(iterator + batch_size, training_size))
        X_batch = X_train[temp_batch]
        y_batch = y_train[temp_batch]
        yield X_batch, y_batch
        iterator += batch_size

batches = list(make_batches(batch_size=...))  # fill in 50 or 100 (see the note below)
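For a quick sanity check, the generator can be exercised on a small synthetic dataset (the shapes and random data below are illustrative assumptions, not part of the original material):

import numpy as np

# Hypothetical data purely to exercise the generator
X_train = np.random.rand(200, 3)
y_train = np.random.rand(200)
training_size = X_train.shape[0]

for X_batch, y_batch in make_batches(batch_size=50):
    print(X_batch.shape, y_batch.shape)  # (50, 3) (50,)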
2. Training
from sklearn import linear_model

model = linear_model.SGDRegressor(max_iter=..., tol=..., eta0=..., average=batch_size)

A sketch of the full training loop over the batches follows the note below.
NOTE
1. Use a batch_size of 50 or 100, according to convenience.
2. max_iter, tol and eta0 are hyperparameters that need to be tuned.
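As referenced above, here is a minimal sketch of one way to run the mini-batch updates, feeding each batch to the model with SGDRegressor.partial_fit; the parameter values are placeholder assumptions, not tuned choices:

from sklearn import linear_model

batch_size = 50  # 50 or 100, per the note above

# Illustrative values only; tune max_iter, tol and eta0 for your data
model = linear_model.SGDRegressor(max_iter=1000, tol=1e-3, eta0=0.01,
                                  average=batch_size)

# Each partial_fit call runs one SGD pass over a single mini-batch,
# continuing from the weights left by the previous call.
# (max_iter and tol govern fit(), not partial_fit(), which always
# makes a single pass over the data it is given.)
for X_batch, y_batch in make_batches(batch_size=batch_size):
    model.partial_fit(X_batch, y_batch)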
Gradient Descent
As already discussed, there is no direct API/class in scikit-learn that implements full-batch gradient descent. So we reuse the mini-batch approach with batch_size equal to the entire training set, i.e., X_train.shape[0].
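Concretely, that amounts to a single batch covering all training rows (a sketch reusing the names defined above):

batch_size = X_train.shape[0]  # the whole training set as one batch

# make_batches now yields exactly one (X_batch, y_batch) pair,
# so every update is computed over the full dataset
for X_batch, y_batch in make_batches(batch_size=batch_size):
    model.partial_fit(X_batch, y_batch)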