Parameter | Description | Default |
---|---|---|
estimators | Invoking the ``fit`` method on the ``VotingClassifier`` will fit clones of those original estimators that will be stored in the class attribute ``self.estimators_``. An estimator can be set to ``'drop'`` using :meth:`set_params`. .. versionchanged:: 0.21 ``'drop'`` is accepted. Using None was deprecated in 0.22 and support was removed in 0.24. | default: [{"oml-python:serialized_object": "component_reference", "value": {"key": "dt", "step_name": "dt"}}] |
flatten_transform | Affects the shape of the transform output only when voting='soft'. If voting='soft' and flatten_transform=True, the transform method returns a matrix with shape (n_samples, n_classifiers * n_classes). If flatten_transform=False, it returns (n_classifiers, n_samples, n_classes); see the shape example below the table. | default: true |
n_jobs | The number of jobs to run in parallel for ``fit``. ``None`` means 1 unless in a :obj:`joblib.parallel_backend` context. ``-1`` means using all processors. See :term:`Glossary <n_jobs>` for more details. | default: null |
verbose | If True, the time elapsed while fitting will be printed as it is completed. .. versionadded:: 0.23 | default: false |
voting | {'hard', 'soft'}: if 'hard', uses predicted class labels for majority rule voting; else if 'soft', predicts the class label based on the argmax of the sums of the predicted probabilities, which is recommended for an ensemble of well-calibrated classifiers (see the usage sketch below the table). | default: "hard" |
weights | Sequence of weights (`float` or `int`) to weight the occurrences of predicted class labels (`hard` voting) or class probabilities before averaging (`soft` voting). Uses uniform weights if `None`. | default: null |
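
The minimal sketch below (not part of the flow definition itself) shows how `estimators`, `voting`, `weights`, `n_jobs`, and the `'drop'` value interact; the choice of base estimators, their names (`lr`, `dt`, `knn`), and the toy dataset are illustrative assumptions, not taken from this flow.

```python
# Illustrative sketch of the parameters documented above (assumed estimators and data).
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

clf = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("dt", DecisionTreeClassifier(random_state=0)),
        ("knn", KNeighborsClassifier()),
    ],
    voting="soft",      # average predicted probabilities instead of majority vote
    weights=[2, 1, 1],  # give the logistic regression twice the influence
    n_jobs=-1,          # fit the base estimators in parallel
)
clf.fit(X, y)           # fitted clones are stored in clf.estimators_

# Disable one base estimator without rebuilding the list ('drop' accepted since 0.21).
clf.set_params(knn="drop").fit(X, y)
```

A second sketch, under the same assumptions, illustrates the two transform output shapes controlled by `flatten_transform` when voting='soft'.
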
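```python
# Illustrative sketch of flatten_transform with voting='soft' (assumed toy data).
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=50, n_classes=3, n_informative=4, random_state=0)
estimators = [("lr", LogisticRegression(max_iter=1000)),
              ("dt", DecisionTreeClassifier(random_state=0))]

flat = VotingClassifier(estimators, voting="soft", flatten_transform=True).fit(X, y)
print(flat.transform(X).shape)     # (50, 6)  -> (n_samples, n_classifiers * n_classes)

stacked = VotingClassifier(estimators, voting="soft", flatten_transform=False).fit(X, y)
print(stacked.transform(X).shape)  # (2, 50, 3) -> (n_classifiers, n_samples, n_classes)
```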