A Python script to generate random synthetic mixtures from pure spectra drawn from a database. This post can be used as a tutorial to generate datasets for Multivariate Curve Resolution.
A short survey of diagnostic plots as tools to dig deeper into the assumptions behind a regression model.
An implementation of a genetic algorithm for wavelength selection using basic NumPy functions.
An implementation of robust PCA, useful when the data contains outliers.
Worked examples of two methods for baseline correction of spectra applied to Raman and XRF data.
We bring in the concept of Fourier ring correlation for another riff on the optimal settings of smoothing filters.
Locally weighted regression is an approach for smoothing a set of data points by performing a polynomial regression in the vicinity of each point. It does sound like a …
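Here is a minimal sketch of what locally weighted smoothing looks like in practice, using the lowess function from statsmodels; the arrays x and y below are placeholder data, not taken from the post.

```python
# A minimal sketch of locally weighted (LOWESS) smoothing on placeholder data.
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
y = np.sin(x) + rng.normal(scale=0.2, size=x.size)

# frac is the fraction of points used in each local fit:
# larger values give smoother curves.
y_smooth = lowess(y, x, frac=0.1, return_sorted=False)
```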
Wavelength selection methods aim at choosing the spectral bands that produce the best regression or classification model. Here we introduce a genetic algorithm for wavelength selection.
Multi-class classification aims at subdividing samples into one of multiple predefined categories. In this post we explore a basic classifier and discuss important metrics such as accuracy and AUC …
Updated code and additional utility scripts for PLS regression. We'll keep it updated as we go.
Multivariate Curve Resolution deals with spectra, or other signals, from samples containing multiple components, and aims at recovering the pure components.
On the concept of the wavelet transform and decomposition, and its application to the denoising of spectra.
An introductory tutorial on optimisers for deep learning, including Python code for training a regression model on NIR spectroscopy data.
The Kennard-Stone algorithm is a criterion to divide a dataset into a calibration and a validation subset. Here's an introduction to the method and a basic Python implementation.
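As a taster of what a basic implementation can look like, here is a NumPy sketch of Kennard-Stone selection (not necessarily the post's exact code); X is the data matrix and n_cal the number of calibration samples to select.

```python
# Kennard-Stone sketch: greedily pick the samples that best cover the data space.
import numpy as np

def kennard_stone(X, n_cal):
    # Pairwise Euclidean distances between all samples
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Start from the two most distant samples
    selected = list(np.unravel_index(np.argmax(dist), dist.shape))
    while len(selected) < n_cal:
        # Distance of every sample to its nearest already-selected sample
        min_dist = dist[:, selected].min(axis=1)
        min_dist[selected] = -1.0          # exclude already-selected samples
        selected.append(int(np.argmax(min_dist)))
    return np.array(selected)              # indices of the calibration subset
```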
NIR spectroscopy can detect allergens or sources of intolerance in food. Here's an example of detecting lactose in lactose-free milk.
The PCA correlation circle is a useful tool to visually display the correlation between spectral bands and principal components. The correlation can be quantified through the Euclidean distance and …
The process of developing and optimising a regression model almost invariably requires a sequence of steps. These steps can be combined into a single predictor using the Pipeline function …
Can we use NIR spectroscopy to distinguish between coffee types? Let's take a look at how basic PCA can enable a qualitative estimation of coffee strength.
Aquagrams are diagrams charting the relative strength of the NIR reflectance or absorbance of specific bands in water-based systems. Here's an example code to plot an aquagram using Matplotlib.
Using parallel computation to speed up cross-validation analysis for large data sets.
Where we discuss the meaning of an activation function in neural networks, walk through a few examples, and compare neural network training with different activation functions.
This post introduces basic Python code to build fully-connected deep neural networks with TensorFlow for regression analysis of spectral data.
The Akaike Information Criterion (AIC) is another tool to compare prediction models. AIC combines model accuracy and parsimony in a single metric and can be used to evaluate data …
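For reference, one commonly used form of the criterion for least-squares models, assuming Gaussian residuals, is AIC = n ln(RSS/n) + 2k, where k counts the fitted parameters. The snippet below is a small illustration of that formula, not code from the post.

```python
# AIC for a least-squares model: n * ln(RSS / n) + 2k (illustrative helper).
import numpy as np

def aic(y_true, y_pred, n_params):
    resid = np.asarray(y_true) - np.asarray(y_pred)
    n = resid.size
    rss = np.sum(resid ** 2)               # residual sum of squares
    return n * np.log(rss / n) + 2 * n_params
```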
What is the minimum amount of information required to export and re-use a linear regression model? The answer is surprisingly simple. Here's a step-by-step example using PLS …
Backward Variable Selection for PLS regression is a method to discard variables that contribute poorly to the regression model. Here's a Python implementation of the method.
The Concordance Correlation Coefficient (CCC) can be useful to quantify the quality of a linear regression model. In this tutorial we explain the CCC and describe its relation with …
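Lin's CCC combines precision and accuracy in a single number: CCC = 2 s_xy / (s_x² + s_y² + (x̄ − ȳ)²). Here is a small NumPy illustration of that formula (an assumed helper, not the post's code).

```python
# Concordance Correlation Coefficient between measured and predicted values
# (population variances and covariance, i.e. ddof=0).
import numpy as np

def ccc(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    cov = np.mean((y_true - y_true.mean()) * (y_pred - y_pred.mean()))
    return 2 * cov / (y_true.var() + y_pred.var()
                      + (y_true.mean() - y_pred.mean()) ** 2)
```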
Bias-Variance trade-off refers to the optimal choice of parameters in a model in order to avoid both overfitting and underfitting. Let's look at a worked example using PLS regression.
Improve the performance of a PLS method by wavelength band selection using Simulated Annealing optimisation.
Gain a practical understanding of PCA and kernel PCA by learning to code the algorithms and test them on real spectroscopic data.
The perceptron is a basic building block of feed-forward neural networks. Learn how to use a single perceptron for binary classification of NIR spectra using gradient descent.
PLS Discriminant Analysis is a variation of PLS able to deal with classification problems. Here's a tutorial on binary classification with PLS-DA in Python.
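The basic idea can be sketched in a few lines with scikit-learn: encode the two classes as 0/1, fit an ordinary PLS regression, and threshold the continuous prediction at 0.5. X_train, y_train and X_test below are placeholder names, and n_components is arbitrary.

```python
# Minimal binary PLS-DA sketch: PLS regression on 0/1 labels, then thresholding.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def plsda_fit_predict(X_train, y_train, X_test, n_components=5):
    pls = PLSRegression(n_components=n_components)
    pls.fit(X_train, y_train.astype(float))
    y_cont = pls.predict(X_test).ravel()   # continuous PLS response
    return (y_cont > 0.5).astype(int)      # hard class assignment
```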
Simulated annealing helps overcome some of the shortcomings of greedy algorithms. Here's a tutorial on simulated annealing for principal components selection in regression.
Greedy algorithms are commonly used to optimise a function over a parameter space. Here's an implementation of a greedy algorithm for principal components selection in regression.
Setting the parameters of a Savitzky-Golay filter seems more a craft than a science. Here's my method to find an optimal filter, complete with code.
Not all wavelengths are created equal. A moving window PLS algorithm optimises the regression by discarding bands that are not useful for prediction.
Cross-validation is a standard procedure to quantify the robustness of a regression model. Compare K-Fold, Monte Carlo and Bootstrap methods and learn some neat tricks in the process.
Meet a lesser-known member of the spectral smoothing family: the Fourier spectral smoothing method. Learn some theory and a Python code implementation.
The secret behind perfect smoothing is a wise choice of parameters. In this tutorial you will learn about the Savitzky–Golay method and how to optimise its performance.
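For context, applying the filter itself is a one-liner with SciPy; window_length and polyorder are the two parameters to be optimised (the values below are arbitrary, and the spectrum is placeholder data).

```python
# Savitzky-Golay smoothing of a placeholder spectrum with SciPy.
import numpy as np
from scipy.signal import savgol_filter

spectrum = np.random.default_rng(0).normal(size=512)   # stand-in for a real spectrum
smoothed = savgol_filter(spectrum, window_length=17, polyorder=3)
# The same call with deriv=1 or deriv=2 returns smoothed derivatives.
```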
Want to get more out of your principal components regression? Here's a simple hack that will give you a stunning improvement on the performance of PCR.
How do we make sure we are detecting only true outliers and not cherry-picking from the data? Here's a method based on the Mahalanobis distance with PCA.
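A possible sketch of the idea, assuming spectra in a matrix X and an arbitrary number of principal components: project the data into PCA space, where the covariance is diagonal, and compute the Mahalanobis distance there.

```python
# Outlier scoring via the Mahalanobis distance computed in PCA space.
import numpy as np
from sklearn.decomposition import PCA

def mahalanobis_pca(X, n_components=5):
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(X)
    # In the PCA basis the covariance is diagonal, with the explained
    # variances on the diagonal, so the distance has a simple closed form.
    return np.sqrt(np.sum(scores ** 2 / pca.explained_variance_, axis=1))

# Samples with a distance above a chosen threshold (e.g. a chi-squared
# quantile) can be flagged as potential outliers.
```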
Exploratory analysis is an essential part of data analysis. Learn a handy way to explore your dataset using NIR data correlograms built with Seaborn in Python.
What is Linear Discriminant Analysis and how does it differ from PCA? Let's walk through LDA and build a NIR spectra classifier using LDA in Python.
Principal components regression is a staple of NIR analysis. Ridge regression is widely used in machine learning. How do they relate? Find out in this post.
Not every data point is created equal. In this post we'll show how to perform outliers detection with PLS regression for NIR spectroscopy in Python.
Three methods to export a Python NIR regression model and how to load it back for future use. Worked Python code to discuss the pros and cons of each method.
Worked example of two scatter correction techniques for NIR spectroscopy in Python: Multiplicative Scatter Correction and Standard Normal Variate.
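In compact form the two corrections can be sketched as follows (a hedged NumPy version, with the mean spectrum as the MSC reference; X holds one spectrum per row).

```python
# SNV standardises each spectrum; MSC regresses each spectrum against a reference.
import numpy as np

def snv(X):
    return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

def msc(X, reference=None):
    ref = X.mean(axis=0) if reference is None else reference
    X_corr = np.empty_like(X, dtype=float)
    for i, spectrum in enumerate(X):
        # Fit spectrum = a * ref + b, then invert the fit
        a, b = np.polyfit(ref, spectrum, deg=1)
        X_corr[i] = (spectrum - b) / a
    return X_corr
```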
Improve the quality of your PLS regression using variable selection. This tutorial will work through a variable selection method for PLS in Python.
Step by step tutorial on how to build a NIR calibration model using Partial Least Squares Regression in Python.
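As a flavour of the workflow, here is a minimal cross-validated PLS calibration with scikit-learn; X (spectra) and y (reference values) are placeholders, and n_components would normally be optimised rather than fixed.

```python
# Minimal cross-validated PLS calibration sketch (placeholder X, y).
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score, mean_squared_error

def pls_calibration_metrics(X, y, n_components=8, cv=10):
    pls = PLSRegression(n_components=n_components)
    y_cv = cross_val_predict(pls, X, y, cv=cv)   # cross-validated predictions
    return r2_score(y, y_cv), mean_squared_error(y, y_cv)
```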
An in-depth introduction to Principal Component Regression in Python using NIR data. PCR is the combination of PCA with linear regression. Check it out.
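Since PCR is just PCA chained with linear regression, a minimal scikit-learn version can be sketched as a pipeline (the number of components here is only an example).

```python
# Principal Component Regression as a PCA + linear regression pipeline.
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

pcr = make_pipeline(PCA(n_components=10), LinearRegression())
# pcr.fit(X, y) and pcr.predict(X_new) with your own spectra and reference values.
```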
An in-depth tutorial on how to run a classification of NIR spectra using Principal Component Analysis in Python. Step by step example with code.
Can we use NIR analysis to grade macadamias? Check out our preliminary results of NIR classification of macadamia kernels using Principal Component Analysis.
A worked example for an introduction to Principal Component Analysis in Python.