***Note:*** Always remember that a higher number of levels or models does NOT guarantee a better result. The key to success in stacking (blending) is diversity: low correlation between models.
For some example configurations, see [Q16](https://github.com/vecxoz/vecstack#16-how-many-models-should-i-use-on-a-given-stacking-level).
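The diversity criterion above can be checked numerically, e.g. by correlating out-of-fold predictions of your base models. The sketch below uses synthetic predictions (it is an illustration of the idea, not part of the vecstack API):

```python
import numpy as np

# Hypothetical out-of-fold predictions from three 1st-level models
# (in practice these would come from stacking() or StackingTransformer).
rng = np.random.default_rng(0)
y = rng.normal(size=200)
preds = np.stack([
    y + rng.normal(scale=0.3, size=200),  # model 1: close to target
    y + rng.normal(scale=0.3, size=200),  # model 2: similar errors -> highly correlated with model 1
    rng.normal(size=200),                 # model 3: unrelated noise -> low correlation
])

# Pairwise correlation between model predictions:
# low off-diagonal values indicate the diversity that helps stacking.
corr = np.corrcoef(preds)
print(np.round(corr, 2))
```

Two strong models that make the same mistakes (high off-diagonal correlation) add little on the 2nd level; a weaker but uncorrelated model can still improve the ensemble.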
### 18. How do I choose models for stacking?
### 25. How to choose number of folds?
***Note:*** Remember that a higher number of folds substantially increases training time (and RAM consumption for `StackingTransformer`). See [Q23](https://github.com/vecxoz/vecstack#23-how-to-estimate-stacking-training-time-and-number-of-models-which-will-be-built).
* Standard approach: 4 or 5 folds.
* If data is big: 3 folds.
* If data is small: you can try more folds like 10 or so.
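The training-time impact of the fold count can be estimated with simple arithmetic: each base model is fitted once per fold. The helper below is a back-of-the-envelope sketch (the function name and the optional full-train refit are illustrative assumptions, not vecstack's API):

```python
# Rough count of base-model fits during one level of stacking.
def n_fits(n_models: int, n_folds: int, refit_on_full_train: bool = False) -> int:
    """Each model is fitted once per fold; optionally once more on the full train set."""
    fits = n_models * n_folds
    if refit_on_full_train:
        fits += n_models
    return fits

# 3 models with 5 folds -> 15 fits; with 10 folds -> 30 fits,
# i.e. doubling the number of folds roughly doubles training time.
print(n_fits(3, 5), n_fits(3, 10))
```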
### 26. When I transform train set I see 'Train set was detected'. What does it mean?
Due to its nature, the stacking procedure treats the train set and any other set differently: the transformation applied to the train set is not the same as the transformation applied to any other set. So if you are transforming `X_train` and see 'Train set was detected', everything is OK. If you meant to transform the train set but don't see this message, something went wrong: possibly your train set was changed (which is not allowed). In this case you have to retrain `StackingTransformer`. For more details see the [stacking tutorial](https://github.com/vecxoz/vecstack/blob/master/examples/00_stacking_concept_pictures_code.ipynb) or [Q8](https://github.com/vecxoz/vecstack#8-why-do-i-need-complicated-inner-procedure-for-stacking).
***Note 1:*** It is NOT allowed to (substantially) change the train set after training on it.
***Note 2:*** To be correctly detected, the train set does not have to be identical (exactly the same). It must have the same shape, and all values must be *close* (`np.isclose` is used for checking). So if you somehow regenerate your train set, you should not worry.
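The detection logic described above can be illustrated with a few lines of NumPy. This is a simplified sketch in the spirit of the check (the function name is illustrative, not vecstack's actual implementation):

```python
import numpy as np

def looks_like_train_set(X, X_train) -> bool:
    # Same shape and all values numerically close -> treated as the train set.
    return X.shape == X_train.shape and bool(np.all(np.isclose(X, X_train)))

X_train = np.array([[1.0, 2.0], [3.0, 4.0]])
regenerated = X_train + 1e-12   # tiny float noise from regenerating the data
changed = X_train * 2.0         # substantially different data

print(looks_like_train_set(regenerated, X_train))  # regenerated set is still detected
print(looks_like_train_set(changed, X_train))      # changed set is not
```

This is why a regenerated train set with tiny floating-point differences is still detected, while a substantially modified one is not.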