
Commit 81f047c

Merge branch 'main' into main

2 parents d23389a + 225d5b1

File tree: 2 files changed, +75 -0

contrib/machine-learning/grid-search.md (+71)
# Grid Search

Grid Search is a hyperparameter tuning technique in Machine Learning that finds the best combination of hyperparameters for a given model. It works by defining a grid of hyperparameter values and then training the model with every possible combination to find the best-performing set.

Grid Search evaluates every hyperparameter combination in the grid and selects the one that returns the lowest error score. It is especially useful when there are only a few hyperparameters to optimize; as the model grows in complexity, however, it tends to be outperformed by weighted random search methods.
## Implementation

Before applying Grid Search to any algorithm, the data is divided into a training set and a validation set; the validation set is used to evaluate the candidate models. A model is trained for every possible combination of hyperparameters, and each is tested on the validation set to choose the best combination.

Grid Search can be applied to any algorithm whose performance can be improved by tuning its hyperparameters. For example, we can apply it to K-Nearest Neighbors by validating its performance over a set of values of K, or to Logistic Regression by trying a set of learning-rate values to find the one at which it achieves the best accuracy.
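As an illustration, here is a minimal sketch of tuning K for K-Nearest Neighbors with scikit-learn's `GridSearchCV` (the candidate values for `n_neighbors` are an illustrative assumption, not taken from the text above):

```python
from sklearn import datasets
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

# Load a small example dataset
X, y = datasets.load_iris(return_X_y=True)

# Candidate values of K to try (illustrative assumption)
param_grid = {"n_neighbors": [1, 3, 5, 7, 9]}

# GridSearchCV trains and validates one model per value in the grid,
# using 5-fold cross-validation as the validation scheme
search = GridSearchCV(KNeighborsClassifier(), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)  # the value of K with the best validation accuracy
print(search.best_score_)   # its mean cross-validated accuracy
```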
Let us consider a model that accepts the following three parameters as input:

1. Number of hidden layers `[2, 4]`
2. Number of neurons in every layer `[5, 10]`
3. Number of epochs `[10, 50]`

If we want to try two options for every parameter (as specified in the square brackets above), there are 2 × 2 × 2 = 8 different combinations. For instance, one possible combination is `[2, 5, 10]`. Finding all such combinations manually would be a headache.
Now suppose that we had ten different parameters as input and wanted to try five possible values for each one: that is 5^10 = 9,765,625 combinations. Doing this by hand would require the programmer to alter a parameter value, re-execute the code, and record the output for every single combination.

Grid Search automates that process: it accepts the possible values for every parameter, executes the code for each and every combination, records the result of each run, and outputs the combination with the best accuracy.
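The enumeration step itself is easy to reproduce. Here is a minimal sketch that lists the eight combinations from the three-parameter example above using Python's built-in `itertools.product` (it only enumerates; no model is trained):

```python
from itertools import product

# Candidate values for each hyperparameter, as in the example above
hidden_layers = [2, 4]
neurons_per_layer = [5, 10]
epochs = [10, 50]

# product() yields every possible combination: 2 * 2 * 2 = 8 in total
combinations = list(product(hidden_layers, neurons_per_layer, epochs))

for combo in combinations:
    print(combo)  # e.g. (2, 5, 10)

print(len(combinations))  # 8
```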
The example below tunes the regularization hyperparameter `C` of Logistic Regression. Higher values of `C` tell the model that the training data resembles real-world information and place a greater weight on the training data, while lower values of `C` do the opposite.

## Explanation of the Code
The code provided performs hyperparameter tuning for a Logistic Regression model using a manual grid search approach. It evaluates the model's performance for different values of the regularization strength hyperparameter `C` on the Iris dataset.

1. `datasets` from `sklearn` is imported to load the Iris dataset.
2. `LogisticRegression` from `sklearn.linear_model` is imported to create and fit the logistic regression model.
3. The Iris dataset is loaded, with `X` containing the features and `y` containing the target labels.
4. A `LogisticRegression` model is instantiated with `max_iter=10000` to ensure convergence during fitting, as the default maximum of 100 iterations might not be sufficient.
5. A list of different values for the regularization hyperparameter `C` is defined. `C` controls the inverse of the regularization strength, with smaller values specifying stronger regularization.
6. An empty list `scores` is initialized to store the model's performance score for each value of `C`.
7. A `for` loop iterates over each value in the `C` list:
8. `logit.set_params(C=choice)` sets the `C` parameter of the logistic regression model to the current value in the loop.
9. `logit.fit(X, y)` fits the logistic regression model to the entire Iris dataset (in a real scenario this is typically done on training data only, not the entire dataset).
10. `logit.score(X, y)` calculates the accuracy of the fitted model on the same data and appends the score to the `scores` list.
11. After the loop, the `scores` list is printed, showing the accuracy for each value of `C`.
### Python Code

```python
from sklearn import datasets
from sklearn.linear_model import LogisticRegression

# Load the Iris dataset
iris = datasets.load_iris()
X = iris['data']
y = iris['target']

# Raise max_iter so the solver converges for every value of C
logit = LogisticRegression(max_iter=10000)

# Candidate values for the regularization strength C
C = [0.25, 0.5, 0.75, 1, 1.25, 1.5, 1.75, 2]

# Fit and score the model once per value of C
scores = []
for choice in C:
    logit.set_params(C=choice)
    logit.fit(X, y)
    scores.append(logit.score(X, y))

print(scores)
```

#### Results

```
[0.9666666666666667, 0.9666666666666667, 0.9733333333333334, 0.9733333333333334, 0.98, 0.98, 0.9866666666666667, 0.9866666666666667]
```
We can see that the lower values of `C` performed worse than the default value of `1`. However, as we increased `C` to `1.75`, the model's accuracy increased.

Increasing `C` beyond this amount does not appear to improve model accuracy any further.
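As noted in the walkthrough above, fitting and scoring on the same data can overstate accuracy. Here is a minimal sketch of the same search done with scikit-learn's built-in `GridSearchCV`, which uses cross-validation instead (the `cv=5` split is an illustrative assumption):

```python
from sklearn import datasets
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = datasets.load_iris(return_X_y=True)

# Same candidate values for C as in the manual search above
param_grid = {"C": [0.25, 0.5, 0.75, 1, 1.25, 1.5, 1.75, 2]}

# 5-fold cross-validation scores each C on held-out folds, giving a
# less optimistic estimate than scoring on the training data itself
search = GridSearchCV(LogisticRegression(max_iter=10000), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)  # the C with the best mean cross-validated accuracy
print(search.best_score_)
```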

contrib/machine-learning/index.md (+4)

```diff
@@ -10,4 +10,8 @@
 - [PyTorch.md](pytorch.md)
 - [Types of optimizers](Types_of_optimizers.md)
 - [Logistic Regression](logistic-regression.md)
+
 - [Types_of_Cost_Functions](Types_of_Cost_Functions.md)
+
+- [Grid Search](grid-search.md)
+
```
