@@ -286,12 +286,17 @@ the regressor that will be used for prediction, and the transformer that will
 be applied to the target variable::
 
   >>> import numpy as np
-  >>> from sklearn.datasets import fetch_california_housing
+  >>> from sklearn.datasets import make_regression
   >>> from sklearn.compose import TransformedTargetRegressor
   >>> from sklearn.preprocessing import QuantileTransformer
   >>> from sklearn.linear_model import LinearRegression
   >>> from sklearn.model_selection import train_test_split
-  >>> X, y = fetch_california_housing(return_X_y=True)
+  >>> # create a synthetic dataset
+  >>> X, y = make_regression(n_samples=20640,
+  ...                        n_features=8,
+  ...                        noise=100.0,
+  ...                        random_state=0)
+  >>> y = np.exp(1 + (y - y.min()) * (4 / (y.max() - y.min())))
   >>> X, y = X[:2000, :], y[:2000]  # select a subset of data
   >>> transformer = QuantileTransformer(output_distribution='normal')
   >>> regressor = LinearRegression()
@@ -300,11 +305,11 @@ be applied to the target variable::
   >>> X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
   >>> regr.fit(X_train, y_train)
   TransformedTargetRegressor(...)
-  >>> print('R2 score: {0:.2f}'.format(regr.score(X_test, y_test)))
-  R2 score: 0.61
+  >>> print(f"R2 score: {regr.score(X_test, y_test):.2f}")
+  R2 score: 0.67
   >>> raw_target_regr = LinearRegression().fit(X_train, y_train)
-  >>> print('R2 score: {0:.2f}'.format(raw_target_regr.score(X_test, y_test)))
-  R2 score: 0.59
+  >>> print(f"R2 score: {raw_target_regr.score(X_test, y_test):.2f}")
+  R2 score: 0.64
 
 For simple transformations, instead of a Transformer object, a pair of
 functions can be passed, defining the transformation and its inverse mapping::
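For reference, the patched example above can be assembled into a standalone script; this is a sketch built from the hunks in this diff (the synthetic `make_regression` dataset replaces the `fetch_california_housing` download, and the printed R2 values may differ slightly across scikit-learn versions):

```python
import numpy as np
from sklearn.compose import TransformedTargetRegressor
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import QuantileTransformer

# Synthetic regression data with an exponentially skewed target,
# as constructed in the patch above.
X, y = make_regression(n_samples=20640, n_features=8,
                       noise=100.0, random_state=0)
y = np.exp(1 + (y - y.min()) * (4 / (y.max() - y.min())))
X, y = X[:2000, :], y[:2000]  # select a subset of data

# Map the skewed target to a normal distribution before fitting.
regr = TransformedTargetRegressor(
    regressor=LinearRegression(),
    transformer=QuantileTransformer(output_distribution='normal'))
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
regr.fit(X_train, y_train)
print(f"R2 score: {regr.score(X_test, y_test):.2f}")

# Baseline: the same regressor on the raw, untransformed target.
raw_target_regr = LinearRegression().fit(X_train, y_train)
print(f"R2 score: {raw_target_regr.score(X_test, y_test):.2f}")
```

The patch reports 0.67 for the transformed target versus 0.64 for the raw target on this data.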
@@ -321,8 +326,8 @@ Subsequently, the object is created as::
   ...                                inverse_func=inverse_func)
   >>> regr.fit(X_train, y_train)
   TransformedTargetRegressor(...)
-  >>> print('R2 score: {0:.2f}'.format(regr.score(X_test, y_test)))
-  R2 score: 0.51
+  >>> print(f"R2 score: {regr.score(X_test, y_test):.2f}")
+  R2 score: 0.67
 
 By default, the provided functions are checked at each fit to be the inverse of
 each other. However, it is possible to bypass this checking by setting
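As a self-contained sketch of the function-pair variant shown in the hunk above: here `func`/`inverse_func` are assumed to be `np.log` and `np.exp` (a natural inverse pair for this exponentially skewed target; the exact functions used elsewhere in the doc are not visible in this diff):

```python
import numpy as np
from sklearn.compose import TransformedTargetRegressor
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Assumed transformation pair: log and its inverse, exp.
def func(x):
    return np.log(x)

def inverse_func(x):
    return np.exp(x)

# Same synthetic, strictly positive target as in the patch above.
X, y = make_regression(n_samples=20640, n_features=8,
                       noise=100.0, random_state=0)
y = np.exp(1 + (y - y.min()) * (4 / (y.max() - y.min())))
X, y = X[:2000, :], y[:2000]

regr = TransformedTargetRegressor(regressor=LinearRegression(),
                                  func=func,
                                  inverse_func=inverse_func)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
regr.fit(X_train, y_train)
print(f"R2 score: {regr.score(X_test, y_test):.2f}")
```

Internally the pair is wrapped in a `FunctionTransformer`, so scoring and prediction happen on the original target scale after applying `inverse_func`.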
@@ -336,8 +341,8 @@ each other. However, it is possible to bypass this checking by setting
   ...                                check_inverse=False)
   >>> regr.fit(X_train, y_train)
   TransformedTargetRegressor(...)
-  >>> print('R2 score: {0:.2f}'.format(regr.score(X_test, y_test)))
-  R2 score: -1.57
+  >>> print(f"R2 score: {regr.score(X_test, y_test):.2f}")
+  R2 score: -3.02
 
 
 .. note::
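To see what `check_inverse=False` bypasses, here is a sketch with a deliberately mismatched pair; pairing `np.log` with `np.sqrt` is a hypothetical choice for this sketch, not the pair used in the documentation. With the check disabled, fit proceeds without the usual inconsistency warning and the model simply scores poorly:

```python
import numpy as np
from sklearn.compose import TransformedTargetRegressor
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Same synthetic, strictly positive target as in the patch above.
X, y = make_regression(n_samples=20640, n_features=8,
                       noise=100.0, random_state=0)
y = np.exp(1 + (y - y.min()) * (4 / (y.max() - y.min())))
X, y = X[:2000, :], y[:2000]

# np.sqrt is NOT the inverse of np.log (hypothetical mismatch).
# check_inverse=False skips the round-trip check that would
# otherwise warn about the inconsistency at fit time.
regr = TransformedTargetRegressor(regressor=LinearRegression(),
                                  func=np.log,
                                  inverse_func=np.sqrt,
                                  check_inverse=False)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
regr.fit(X_train, y_train)
print(f"R2 score: {regr.score(X_test, y_test):.2f}")  # negative: wrong inverse
```

The negative R2 in the hunk above reflects the same effect: predictions are mapped back through a function that does not undo the forward transformation.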