## Parallel computation of loops for cross-validation analysis

Using parallel computation to speed up cross-validation analysis for large data sets.

Regression, Partial Least Squares Regression
03/13/2021

Backward variable selection for PLS regression is a method for discarding variables that contribute little to the regression model. Here's a Python implementation of the method.

The bias-variance trade-off is the balance struck when choosing model parameters so as to avoid both overfitting and underfitting. Let's look at a worked example using PLS regression.

Regression, Partial Least Squares Regression
08/15/2020

Improve the performance of a PLS method by wavelength band selection using Simulated Annealing optimisation.

Regression, Partial Least Squares Regression
12/07/2019

Not all wavelengths are created equal. A moving window PLS algorithm optimises the regression by discarding bands that are not useful for prediction.

Cross-validation is a standard procedure to quantify the robustness of a regression model. Compare K-Fold, Monte Carlo and Bootstrap methods and learn some neat tricks in the process.

Not every data point is created equal. In this post we'll show how to perform outlier detection with PLS regression for NIR spectroscopy in Python.

Partial Least Squares Regression, Regression
07/04/2018

Improve the quality of your PLS regression using variable selection. This tutorial will work through a variable selection method for PLS in Python.

Partial Least Squares Regression, Regression
06/14/2018

Step by step tutorial on how to build a NIR calibration model using Partial Least Squares Regression in Python.

NIRPY Research © 2022
