Flow
Imputation transformer for completing missing values.
Encode categorical features as a one-hot numeric array. The input to this transformer should be an array-like of integers or strings, denoting the values taken on by categorical (discrete) features.…
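The entry above is the docstring of scikit-learn's OneHotEncoder, so the flow presumably wraps sklearn.preprocessing.OneHotEncoder. A minimal usage sketch under that assumption (the toy data is illustrative):

import numpy as np
from sklearn.preprocessing import OneHotEncoder

# Two categorical columns: color and size (made-up example data).
X = np.array([["red", "S"], ["green", "M"], ["blue", "S"]])
enc = OneHotEncoder(handle_unknown="ignore")
X_onehot = enc.fit_transform(X).toarray()   # 3 color columns + 2 size columns
print(enc.categories_)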
Imputation transformer for completing missing values.
Encode categorical features as a one-hot numeric array. The input to this transformer should be an array-like of integers or strings, denoting the values taken on by categorical (discrete) features.…
Encode categorical features as a one-hot numeric array. The input to this transformer should be an array-like of integers or strings, denoting the values taken on by categorical (discrete) features.…
Standardize features by removing the mean and scaling to unit variance. The standard score of a sample `x` is calculated as: z = (x - u) / s where `u` is the mean of the training samples or zero if…
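This entry matches scikit-learn's StandardScaler, which applies z = (x - u) / s per feature. A small sketch, assuming the flow wraps sklearn.preprocessing.StandardScaler, that checks the formula against the fitted statistics:

import numpy as np
from sklearn.preprocessing import StandardScaler

X = np.array([[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]])
scaler = StandardScaler()
Z = scaler.fit_transform(X)

# z = (x - u) / s with u = per-column mean and s = per-column standard deviation
Z_manual = (X - scaler.mean_) / scaler.scale_
assert np.allclose(Z, Z_manual)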
Concatenates results of multiple transformer objects. This estimator applies a list of transformer objects in parallel to the input data, then concatenates the results. This is useful to combine…
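This entry matches sklearn.pipeline.FeatureUnion. A sketch of running two transformers in parallel and concatenating their outputs (the PCA/SelectKBest combination and the iris data are only illustrative):

from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest
from sklearn.pipeline import FeatureUnion

X, y = load_iris(return_X_y=True)

# PCA and univariate selection run side by side; their outputs are stacked column-wise.
union = FeatureUnion([("pca", PCA(n_components=2)), ("kbest", SelectKBest(k=1))])
X_combined = union.fit_transform(X, y)
print(X_combined.shape)   # (150, 3): 2 PCA components + 1 selected feature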
Dimensionality reduction using truncated SVD (aka LSA). This transformer performs linear dimensionality reduction by means of truncated singular value decomposition (SVD). Contrary to PCA, this…
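This entry matches sklearn.decomposition.TruncatedSVD. Unlike PCA it does not center the data, so it can be applied directly to sparse matrices such as tf-idf counts; a sketch under that assumption (the tiny corpus is made up):

from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer

docs = ["the cat sat on the mat",
        "the dog sat on the log",
        "cats and dogs are pets"]

tfidf = TfidfVectorizer().fit_transform(docs)      # sparse term-document matrix
svd = TruncatedSVD(n_components=2, random_state=0)
X_lsa = svd.fit_transform(tfidf)                   # dense (3, 2) LSA representation
print(svd.explained_variance_ratio_)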
Select features according to a percentile of the highest scores.
An AdaBoost classifier. An AdaBoost [1] classifier is a meta-estimator that begins by fitting a classifier on the original dataset and then fits additional copies of the classifier on the same dataset…
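This entry matches sklearn.ensemble.AdaBoostClassifier. A minimal sketch (dataset and hyperparameters chosen only for illustration):

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Each boosting round reweights the samples the previous round got wrong.
clf = AdaBoostClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))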
A decision tree classifier.
Univariate imputer for completing missing values with simple strategies. Replace missing values using a descriptive statistic (e.g. mean, median, or most frequent) along each column, or using a…
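This longer imputation entry matches sklearn.impute.SimpleImputer (the one-line "Imputation transformer" entries elsewhere in the list look like the older sklearn.preprocessing.Imputer). A sketch of mean imputation:

import numpy as np
from sklearn.impute import SimpleImputer

X = np.array([[1.0, 2.0], [np.nan, 3.0], [7.0, 6.0]])

# Replace each NaN with the mean of its column.
imputer = SimpleImputer(strategy="mean")
print(imputer.fit_transform(X))   # [[1. 2.], [4. 3.], [7. 6.]]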
DummyClassifier makes predictions that ignore the input features. This classifier serves as a simple baseline to compare against other more complex classifiers. The specific behavior of the baseline…
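This entry is sklearn.dummy.DummyClassifier, a baseline that ignores the features entirely. A sketch comparing it with a real model (the iris data and the tree are just for illustration):

from sklearn.datasets import load_iris
from sklearn.dummy import DummyClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# The baseline always predicts the most frequent training class, ignoring X.
baseline = DummyClassifier(strategy="most_frequent").fit(X_tr, y_tr)
model = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
print(baseline.score(X_te, y_te), model.score(X_te, y_te))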
Univariate imputer for completing missing values with simple strategies. Replace missing values using a descriptive statistic (e.g. mean, median, or most frequent) along each column, or using a…
A Bagging classifier. A Bagging classifier is an ensemble meta-estimator that fits base classifiers each on random subsets of the original dataset and then aggregate their individual predictions…
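This entry matches sklearn.ensemble.BaggingClassifier. A sketch that bags decision trees on bootstrap samples (base estimator and settings are illustrative):

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# 50 trees, each fit on a bootstrap sample; their votes are aggregated at predict time.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0)
print(cross_val_score(bagging, X, y, cv=5).mean())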
Ordinary least squares Linear Regression. LinearRegression fits a linear model with coefficients w = (w1, ..., wp) to minimize the residual sum of squares between the observed targets in the dataset,…
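This entry is sklearn.linear_model.LinearRegression (ordinary least squares). A sketch that recovers known coefficients from a noiseless linear target:

import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 5.0     # exact linear target, no noise

reg = LinearRegression().fit(X, y)
print(reg.coef_, reg.intercept_)            # approximately [3. -2.] and 5.0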
A decision tree classifier.
Univariate imputer for completing missing values with simple strategies. Replace missing values using a descriptive statistic (e.g. mean, median, or most frequent) along each column, or using a…
Encode categorical features as a one-hot numeric array. The input to this transformer should be an array-like of integers or strings, denoting the values taken on by categorical (discrete) features.…
DummyClassifier makes predictions that ignore the input features. This classifier serves as a simple baseline to compare against other more complex classifiers. The specific behavior of the baseline…
Standardize features by removing the mean and scaling to unit variance. The standard score of a sample `x` is calculated as: z = (x - u) / s where `u` is the mean of the training samples or zero if…
Concatenates results of multiple transformer objects. This estimator applies a list of transformer objects in parallel to the input data, then concatenates the results. This is useful to combine…
Dimensionality reduction using truncated SVD (aka LSA). This transformer performs linear dimensionality reduction by means of truncated singular value decomposition (SVD). Contrary to PCA, this…
Select features according to a percentile of the highest scores.
An AdaBoost classifier. An AdaBoost [1] classifier is a meta-estimator that begins by fitting a classifier on the original dataset and then fits additional copies of the classifier on the same dataset…
A decision tree classifier.
Imputation transformer for completing missing values.
Ordinary least squares Linear Regression. LinearRegression fits a linear model with coefficients w = (w1, ..., wp) to minimize the residual sum of squares between the observed targets in the dataset,…
Soft Voting/Majority Rule classifier for unfitted estimators.
Soft Voting/Majority Rule classifier for unfitted estimators.
A decision tree classifier.
test description
Univariate imputer for completing missing values with simple strategies. Replace missing values using a descriptive statistic (e.g. mean, median, or most frequent) along each column, or using a…
Ordinary least squares Linear Regression. LinearRegression fits a linear model with coefficients w = (w1, ..., wp) to minimize the residual sum of squares between the observed targets in the dataset,…
An AdaBoost classifier. An AdaBoost [1] classifier is a meta-estimator that begins by fitting a classifier on the original dataset and then fits additional copies of the classifier on the same dataset…
Univariate imputer for completing missing values with simple strategies. Replace missing values using a descriptive statistic (e.g. mean, median, or most frequent) along each column, or using a…
An AdaBoost classifier. An AdaBoost [1] classifier is a meta-estimator that begins by fitting a classifier on the original dataset and then fits additional copies of the classifier on the same dataset…
Randomized search on hyper parameters. RandomizedSearchCV implements a "fit" and a "score" method. It also implements "score_samples", "predict", "predict_proba", "decision_function", "transform" and…
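This entry is sklearn.model_selection.RandomizedSearchCV. A sketch that samples a fixed budget of hyperparameter settings for an SVC (the search space and dataset are only examples):

from scipy.stats import loguniform
from sklearn.datasets import load_iris
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Sample 20 (C, gamma) settings from log-uniform distributions; cross-validate each one.
param_distributions = {"C": loguniform(1e-2, 1e2), "gamma": loguniform(1e-4, 1e0)}
search = RandomizedSearchCV(SVC(), param_distributions, n_iter=20, cv=5, random_state=0)
search.fit(X, y)
print(search.best_params_, search.best_score_)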
A decision tree classifier.
Univariate imputer for completing missing values with simple strategies. Replace missing values using a descriptive statistic (e.g. mean, median, or most frequent) along each column, or using a…
Ordinary least squares Linear Regression. LinearRegression fits a linear model with coefficients w = (w1, ..., wp) to minimize the residual sum of squares between the observed targets in the dataset,…
Ordinary least squares Linear Regression. LinearRegression fits a linear model with coefficients w = (w1, ..., wp) to minimize the residual sum of squares between the observed targets in the dataset,…
Univariate imputer for completing missing values with simple strategies. Replace missing values using a descriptive statistic (e.g. mean, median, or most frequent) along each column, or using a…
Feature selector that removes all low-variance features. This feature selection algorithm looks only at the features (X), not the desired outputs (y), and can thus be used for unsupervised learning.
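This entry matches sklearn.feature_selection.VarianceThreshold, which looks only at X and drops low-variance columns. A minimal sketch:

import numpy as np
from sklearn.feature_selection import VarianceThreshold

# The first column is constant, so it carries no information and is removed.
X = np.array([[1, 2, 0],
              [1, 4, 1],
              [1, 6, 0]])
selector = VarianceThreshold()       # default threshold removes zero-variance features
print(selector.fit_transform(X))     # only the last two columns remain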
A Bagging classifier. A Bagging classifier is an ensemble meta-estimator that fits base classifiers each on random subsets of the original dataset and then aggregate their individual predictions…
Encode categorical features as an integer array. The input to this transformer should be an array-like of integers or strings, denoting the values taken on by categorical (discrete) features. The…
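This entry matches sklearn.preprocessing.OrdinalEncoder, which maps each category to an integer code rather than a one-hot vector. A small sketch (the data is made up):

import numpy as np
from sklearn.preprocessing import OrdinalEncoder

X = np.array([["low", "red"], ["high", "blue"], ["medium", "red"]])

# Columns are encoded independently; codes follow the sorted category order by default.
enc = OrdinalEncoder()
print(enc.fit_transform(X))
print(enc.categories_)   # e.g. [array(['high', 'low', 'medium'], ...), array(['blue', 'red'], ...)]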
Concatenates results of multiple transformer objects. This estimator applies a list of transformer objects in parallel to the input data, then concatenates the results. This is useful to combine…
Dimensionality reduction using truncated SVD (aka LSA). This transformer performs linear dimensionality reduction by means of truncated singular value decomposition (SVD). Contrary to PCA, this…
Select features according to a percentile of the highest scores.
An AdaBoost classifier. An AdaBoost [1] classifier is a meta-estimator that begins by fitting a classifier on the original dataset and then fits additional copies of the classifier on the same dataset…
A decision tree classifier.
A Bagging classifier. A Bagging classifier is an ensemble meta-estimator that fits base classifiers each on random subsets of the original dataset and then aggregate their individual predictions…
A decision tree classifier.
A Bagging classifier. A Bagging classifier is an ensemble meta-estimator that fits base classifiers each on random subsets of the original dataset and then aggregate their individual predictions…
Imputation transformer for completing missing values.
Feature selector that removes all low-variance features. This feature selection algorithm looks only at the features (X), not the desired outputs (y), and can thus be used for unsupervised learning.
A Bagging classifier. A Bagging classifier is an ensemble meta-estimator that fits base classifiers each on random subsets of the original dataset and then aggregate their individual predictions…
C-Support Vector Classification. The implementation is based on libsvm. The fit time scales at least quadratically with the number of samples and may be impractical beyond tens of thousands of…
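This entry is sklearn.svm.SVC (libsvm-based, with roughly quadratic fit time in the number of samples). A minimal sketch that scales features first, which kernel SVMs usually need (the pipeline and dataset are illustrative):

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# RBF-kernel SVM; standardizing the inputs matters a lot for the default gamma.
model = make_pipeline(StandardScaler(), SVC(C=1.0, kernel="rbf"))
print(cross_val_score(model, X, y, cv=5).mean())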
An AdaBoost classifier. An AdaBoost [1] classifier is a meta-estimator that begins by fitting a classifier on the original dataset and then fits additional copies of the classifier on the same dataset…
Univariate imputer for completing missing values with simple strategies. Replace missing values using a descriptive statistic (e.g. mean, median, or most frequent) along each column, or using a…
A decision tree classifier.
Ordinary least squares Linear Regression. LinearRegression fits a linear model with coefficients w = (w1, ..., wp) to minimize the residual sum of squares between the observed targets in the dataset,…
Randomized search on hyper parameters. RandomizedSearchCV implements a "fit" and a "score" method. It also implements "score_samples", "predict", "predict_proba", "decision_function", "transform" and…
A Bagging classifier. A Bagging classifier is an ensemble meta-estimator that fits base classifiers each on random subsets of the original dataset and then aggregate their individual predictions…
A decision tree classifier.
A decision tree classifier.
A Bagging classifier. A Bagging classifier is an ensemble meta-estimator that fits base classifiers each on random subsets of the original dataset and then aggregate their individual predictions…
Encode categorical features as a one-hot numeric array. The input to this transformer should be an array-like of integers or strings, denoting the values taken on by categorical (discrete) features.…
Standardize features by removing the mean and scaling to unit variance. The standard score of a sample `x` is calculated as: z = (x - u) / s where `u` is the mean of the training samples or zero if…
Imputation transformer for completing missing values.
Feature selector that removes all low-variance features. This feature selection algorithm looks only at the features (X), not the desired outputs (y), and can thus be used for unsupervised learning.
Randomized search on hyper parameters. RandomizedSearchCV implements a "fit" and a "score" method. It also implements "predict", "predict_proba", "decision_function", "transform" and…
A decision tree classifier.
Univariate imputer for completing missing values with simple strategies. Replace missing values using a descriptive statistic (e.g. mean, median, or most frequent) along each column, or using a…
Ordinary least squares Linear Regression. LinearRegression fits a linear model with coefficients w = (w1, ..., wp) to minimize the residual sum of squares between the observed targets in the dataset,…
Univariate imputer for completing missing values with simple strategies. Replace missing values using a descriptive statistic (e.g. mean, median, or most frequent) along each column, or using a…
DummyClassifier makes predictions that ignore the input features. This classifier serves as a simple baseline to compare against other more complex classifiers. The specific behavior of the baseline…
Encode categorical features as a one-hot numeric array. The input to this transformer should be an array-like of integers or strings, denoting the values taken on by categorical (discrete) features.…
Standardize features by removing the mean and scaling to unit variance. The standard score of a sample `x` is calculated as: z = (x - u) / s where `u` is the mean of the training samples or zero if…
Concatenates results of multiple transformer objects. This estimator applies a list of transformer objects in parallel to the input data, then concatenates the results. This is useful to combine…
Dimensionality reduction using truncated SVD (aka LSA). This transformer performs linear dimensionality reduction by means of truncated singular value decomposition (SVD). Contrary to PCA, this…
Select features according to a percentile of the highest scores.
Imputation transformer for completing missing values.
DummyClassifier is a classifier that makes predictions using simple rules. This classifier is useful as a simple baseline to compare with other (real) classifiers. Do not use it for real problems.
Standardize features by removing the mean and scaling to unit variance. The standard score of a sample `x` is calculated as: z = (x - u) / s where `u` is the mean of the training samples or zero if…
DummyClassifier makes predictions that ignore the input features. This classifier serves as a simple baseline to compare against other more complex classifiers. The specific behavior of the baseline…
Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit…
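This entry is sklearn.pipeline.Pipeline, which chains transformers and a final estimator so they can be fit, tuned, and scored as one object. A sketch using steps that mirror other entries in this list (the particular steps are chosen only for illustration):

from sklearn.datasets import load_breast_cancer
from sklearn.impute import SimpleImputer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Every intermediate step must implement fit/transform; only the last step is the estimator.
pipe = Pipeline([("impute", SimpleImputer(strategy="median")),
                 ("scale", StandardScaler()),
                 ("tree", DecisionTreeClassifier(random_state=0))])
print(cross_val_score(pipe, X, y, cv=5).mean())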
Encode categorical features as a one-hot numeric array. The input to this transformer should be an array-like of integers or strings, denoting the values taken on by categorical (discrete) features.…
Standardize features by removing the mean and scaling to unit variance. The standard score of a sample `x` is calculated as: z = (x - u) / s where `u` is the mean of the training samples or zero if…
Concatenates results of multiple transformer objects. This estimator applies a list of transformer objects in parallel to the input data, then concatenates the results. This is useful to combine…
Dimensionality reduction using truncated SVD (aka LSA). This transformer performs linear dimensionality reduction by means of truncated singular value decomposition (SVD). Contrary to PCA, this…