Sling Academy

Implementing Robust Regressors in Scikit-Learn

Last updated: December 17, 2024

When tackling regression problems, the choice of a robust regressor can heavily influence the performance of your model, especially when dealing with datasets prone to outliers. Scikit-learn, one of the most popular machine learning libraries in Python, offers an excellent array of tools for implementing sophisticated regression models, including robust regressors. In this article, we’ll explore how to implement robust regression using Scikit-learn, focusing on algorithms like Huber Regression and Theil-Sen Regression among others.

What is a Robust Regressor?

Robust regression methods are designed to overcome a key limitation of standard linear regression: its high sensitivity to outliers. By down-weighting or ignoring the influence of outliers, these techniques produce more reliable and stable estimates.
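To see why this matters, here is a minimal illustration (not part of the original tutorial) of how a single corrupted observation can drag an ordinary least-squares fit away from the true trend:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = np.arange(20, dtype=float).reshape(-1, 1)
y = 2.0 * X.ravel() + rng.normal(scale=0.5, size=20)  # true slope is 2

clean_slope = LinearRegression().fit(X, y).coef_[0]

y_corrupted = y.copy()
y_corrupted[-1] += 200.0  # corrupt a single observation

skewed_slope = LinearRegression().fit(X, y_corrupted).coef_[0]

print(f'slope on clean data:    {clean_slope:.2f}')
print(f'slope with one outlier: {skewed_slope:.2f}')
```

One bad point out of twenty is enough to visibly bias the least-squares slope; the robust estimators below are designed to resist exactly this effect.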

Why Use Scikit-Learn?

Scikit-learn is well-known for its simplicity and efficiency in implementing machine learning algorithms. It offers pre-built classes, a simple API, and comprehensive documentation, making it easier for both beginners and seasoned data scientists to implement state-of-the-art algorithms without having to code them from scratch.

Huber Regression

Huber Regression combines the advantages of the squared-error and absolute-error criteria: it behaves like least squares for small residuals and like least absolute deviations for large ones, which makes it less sensitive to outliers than ordinary linear regression. In Scikit-learn, it is available through the HuberRegressor class.

from sklearn.linear_model import HuberRegressor
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Generate a dataset
X, y = make_regression(n_samples=100, n_features=1, noise=35.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Initialize the Huber Regressor
huber = HuberRegressor()

# Fit the model
huber.fit(X_train, y_train)

# Make predictions
y_pred = huber.predict(X_test)

# Evaluate the model
mse = mean_squared_error(y_test, y_pred)
print(f'Mean Squared Error: {mse}')
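One parameter worth knowing: HuberRegressor's epsilon (default 1.35) sets the residual threshold at which the loss switches from squared to linear, i.e. how strongly large residuals are down-weighted. As a hedged sketch on synthetic contaminated data (constructed here for illustration, not from the article), compare the slope recovered by Huber regression against plain least squares:

```python
import numpy as np
from sklearn.linear_model import HuberRegressor, LinearRegression

rng = np.random.default_rng(42)
X = np.linspace(-3, 3, 100).reshape(-1, 1)
y = 1.5 * X.ravel() + rng.normal(scale=0.3, size=100)  # true slope is 1.5
y[-5:] += 30.0  # corrupt five points at the right-hand edge

ols_slope = LinearRegression().fit(X, y).coef_[0]
huber_slope = HuberRegressor(epsilon=1.35).fit(X, y).coef_[0]

print(f'OLS slope:   {ols_slope:.2f}')
print(f'Huber slope: {huber_slope:.2f}')
```

Smaller epsilon values make the estimator more robust (more residuals treated as outliers) at some cost in statistical efficiency on clean data.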

Theil-Sen Regressor

The Theil-Sen Regressor is another robust method. It estimates the model as the spatial median of least-squares solutions computed on subsets of the data; in the one-dimensional case, this amounts to taking the median of the slopes between pairs of points. Because a median is hard to perturb, the estimator is highly resilient to outliers.

from sklearn.linear_model import TheilSenRegressor

# Initialize the Theil-Sen Regressor
theil_sen = TheilSenRegressor()

# Fit the model
theil_sen.fit(X_train, y_train)

# Make predictions
y_pred_theil = theil_sen.predict(X_test)

# Evaluate the model
mse_theil = mean_squared_error(y_test, y_pred_theil)
print(f'Mean Squared Error (Theil-Sen): {mse_theil}')
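To make the median-of-slopes idea concrete, here is an illustrative sketch (not from the article) of the one-dimensional Theil-Sen principle computed by hand with NumPy; a single gross outlier barely moves the median of pairwise slopes:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
x = np.arange(30, dtype=float)
y = 0.8 * x + rng.normal(scale=0.2, size=30)  # true slope is 0.8
y[3] += 50.0  # one gross outlier

# Slope between every pair of points; the outlier affects only 29 of 435 pairs
slopes = [(y[j] - y[i]) / (x[j] - x[i])
          for i, j in combinations(range(len(x)), 2)]
print(f'median of pairwise slopes: {np.median(slopes):.2f}')
```

Scikit-learn's TheilSenRegressor generalizes this idea to multiple features (and subsamples the pairs when the dataset is large), but the robustness mechanism is the same.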

RANSAC Regression

RANSAC (RANdom SAmple Consensus) is an iterative algorithm for fitting a model to a dataset with a large proportion of outliers. On each iteration it fits the model to a random subset of the samples and keeps the fit whose consensus set of inliers is largest.

from sklearn.linear_model import RANSACRegressor

# Use HuberRegressor as the base estimator
# (the parameter is 'estimator'; it was named 'base_estimator' before scikit-learn 1.1)
ransac = RANSACRegressor(estimator=HuberRegressor())

# Fit the model
ransac.fit(X_train, y_train)

# Make predictions
ransac_pred = ransac.predict(X_test)

# Evaluate RANSAC model
ransac_mse = mean_squared_error(y_test, ransac_pred)
print(f'Mean Squared Error (RANSAC): {ransac_mse}')
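After fitting, RANSACRegressor also reports which samples it treated as inliers via its inlier_mask_ attribute, and exposes the final fitted model as estimator_. A short self-contained sketch (with its own synthetic data, for illustration):

```python
import numpy as np
from sklearn.linear_model import RANSACRegressor

rng = np.random.default_rng(1)
X = np.linspace(0, 10, 60).reshape(-1, 1)
y = 3.0 * X.ravel() + rng.normal(scale=0.5, size=60)  # true slope is 3
y[:10] += 40.0  # ten gross outliers

ransac = RANSACRegressor(random_state=0).fit(X, y)

# inlier_mask_ is a boolean array marking the consensus set
print(f'inliers found: {int(ransac.inlier_mask_.sum())} of {len(y)}')
print(f'recovered slope: {ransac.estimator_.coef_[0]:.2f}')
```

Inspecting inlier_mask_ is a convenient way to see which observations RANSAC rejected, which can itself be useful for outlier diagnosis.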

Conclusion

Scikit-learn provides several robust regression techniques for modeling data that is noisy or contains outliers. HuberRegressor, TheilSenRegressor, and RANSACRegressor are efficient options when atypical data points would otherwise skew an ordinary linear regression. Each method has its strengths and tradeoffs, so selecting the appropriate regressor depends heavily on the nature of your dataset and the underlying problem structure.

By incorporating robust regressors in your data analysis tasks, you can develop more adaptable and less error-prone models, leading to more accurate and reliable predictions.
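As a rough side-by-side sketch (synthetic data constructed here for illustration, not a benchmark), the slopes each estimator recovers under heavy one-sided contamination hint at those tradeoffs: Huber's influence is bounded but not zero, while Theil-Sen and RANSAC can recover the trend almost exactly.

```python
import numpy as np
from sklearn.linear_model import (HuberRegressor, LinearRegression,
                                  RANSACRegressor, TheilSenRegressor)

rng = np.random.default_rng(7)
X = np.linspace(-5, 5, 120).reshape(-1, 1)
y = 2.5 * X.ravel() + rng.normal(scale=1.0, size=120)  # true slope is 2.5
y[-15:] -= 60.0  # contaminate the right-hand tail

results = {}
for name, model in [('OLS', LinearRegression()),
                    ('Huber', HuberRegressor()),
                    ('Theil-Sen', TheilSenRegressor(random_state=0)),
                    ('RANSAC', RANSACRegressor(random_state=0))]:
    model.fit(X, y)
    # RANSAC wraps its fitted base model in estimator_
    coef = model.estimator_.coef_[0] if name == 'RANSAC' else model.coef_[0]
    results[name] = coef
    print(f'{name:>9}: slope = {coef:.2f}  (true slope is 2.5)')
```

Results on your own data will differ; trying two or three robust estimators and inspecting residuals is usually cheap and informative.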
