Quantile Loss & Quantile Regression | by Vyacheslav Efimov

Learn how to modify regression algorithms to predict any quantile of data

Regression is a machine learning task where the goal is to predict a real value based on a set of feature vectors. There exists a large variety of regression algorithms: linear regression, logistic regression, gradient boosting, or neural networks. During training, each of these algorithms adjusts the weights of a model based on the loss function used for optimization.

The choice of a loss function depends on the particular task and the metric values one needs to achieve. Many loss functions (like MSE, MAE, RMSLE, etc.) focus on predicting the expected value of a variable given a feature vector.

In this article, we will take a look at a special loss function called quantile loss, which is used to predict particular quantiles of a variable. Before diving into the details of quantile loss, let us briefly revise the term quantile.

A quantile qₐ is a value that divides a given set of numbers in such a way that α * 100% of the numbers are less than the value and (1 — α) * 100% of the numbers are greater than the value.

Quantiles qₐ for α = 0.25, α = 0.5 and α = 0.75 are often used in statistics and are called quartiles. These quartiles are denoted as Q₁, Q₂ and Q₃ respectively. The three quartiles split the data into 4 equal parts.

Similarly, there are percentiles p, which divide a given set of numbers into 100 equal parts. A percentile is denoted as pₐ, where α is the percentage of numbers less than the corresponding value.

Quartiles Q₁, Q₂ and Q₃ correspond to percentiles p₂₅, p₅₀ and p₇₅ respectively.

In the example below, all three quartiles are found for a given set of numbers.

An example showing all three quartiles for a given set of numbers. The first quartile Q₁ equals 10 because 25% of the values are less than 10 and 75% of the values are greater than 10. The analogy extends to the other two quartiles.
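As a quick illustration, quantiles can be computed with NumPy; the set of numbers below is hypothetical, chosen only for this sketch:

```python
import numpy as np

# A hypothetical set of numbers used only for illustration
data = np.array([2, 4, 7, 10, 15, 20, 25, 30])

# Quartiles Q1, Q2 and Q3 are the 0.25, 0.5 and 0.75 quantiles
q1, q2, q3 = np.quantile(data, [0.25, 0.5, 0.75])
print(q1, q2, q3)  # 6.25 12.5 21.25
```

By default, `np.quantile` linearly interpolates between the two nearest data points, which is why the quartiles here do not coincide with elements of the set.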

Machine learning algorithms that aim to predict a particular quantile of a variable use quantile loss as the loss function. Before moving to the formula, let us consider a simple example.

Imagine a problem where the goal is to predict the 75-th percentile of a variable. In fact, this statement is equivalent to requiring that prediction errors be negative in 75% of cases and positive in the other 25%. That is the intuition behind the quantile loss.


The quantile loss formula is shown below. The parameter α refers to the quantile that needs to be predicted.

Quantile loss formula: L(y, ŷ) = α * (y — ŷ) if ŷ ≤ y, otherwise (1 — α) * (ŷ — y)

The value of the quantile loss depends on whether a prediction is less or greater than the true value. To better understand the logic behind it, suppose our goal is to predict the 0.8 quantile, so the value α = 0.8 is plugged into the equation. As a result, the formula looks like this:

Basically, in such a case, the quantile loss penalizes under-estimated predictions 4 times more than over-estimated ones. This way, the model becomes more sensitive to under-estimation errors and predicts higher values more often. As a result, the fitted model will on average over-estimate in roughly 80% of cases and under-estimate in the other 20%.

Now assume that two predictions were obtained for the same target. The target has a value of 40, while the predictions are 30 and 50. Let us calculate the quantile loss in both cases. Despite the fact that the absolute error of 10 is the same in both cases, the loss value is different:

  • for 30, the loss worth is l = 0.8 * 10 = 8
  • for 50, the loss worth is l = 0.2 * 10 = 2.
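As a sanity check, the formula can be transcribed directly into a few lines of Python (a minimal sketch, not a library implementation):

```python
def quantile_loss(y_true, y_pred, alpha):
    """Pinball (quantile) loss for a single prediction."""
    error = y_true - y_pred
    # Under-estimation (positive error) is weighted by alpha,
    # over-estimation (negative error) by (1 - alpha)
    return alpha * error if error >= 0 else (1 - alpha) * (-error)

# Reproducing the example above: target = 40, alpha = 0.8
print(quantile_loss(40, 30, 0.8))  # 0.8 * 10 = 8
print(quantile_loss(40, 50, 0.8))  # 0.2 * 10 ≈ 2
```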

This loss function is illustrated in the diagram below, which shows loss values for different values of α when the true value is 40.

Conversely, if the value of α were 0.2, over-estimated predictions would be penalized 4 times more than under-estimated ones.

The problem of predicting a certain quantile of a variable is called quantile regression.

Let us create a synthetic dataset with 10,000 samples, where the ratings of players in a video game will be estimated based on the number of hours played.

Dataset generation
Scatter plot of the predictor (hours) versus the target (rating)
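The original code listing is shown only as an image; a minimal sketch of such a dataset could look like this (the functional form and noise scale are assumptions):

```python
import numpy as np
import pandas as pd

np.random.seed(0)
n = 10_000

# Assumed relationship: rating grows with hours played, plus Gaussian noise
hours = np.random.uniform(0, 100, n)
rating = 0.5 * hours + np.random.normal(0, 5, n)

df = pd.DataFrame({'hours': hours, 'rating': rating})
print(df.shape)  # (10000, 2)
```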

Let us split the data into train and test sets in an 80:20 proportion:

Splitting the dataset in an 80:20 proportion
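With scikit-learn, the split might look like this (a sketch; the stand-in arrays replace the generated dataset described above):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Stand-in data; in the article this is the generated (hours, rating) dataset
np.random.seed(0)
X = np.random.uniform(0, 100, size=(10_000, 1))     # hours
y = 0.5 * X[:, 0] + np.random.normal(0, 5, 10_000)  # rating

# 80:20 train / test split
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)
print(len(X_train), len(X_test))  # 8000 2000
```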

For comparison, let us build 3 regression models with different α values: 0.2, 0.5 and 0.8. Each of the regression models will be created with LightGBM — a library with an efficient implementation of gradient boosting.

According to the official documentation, LightGBM can solve quantile regression problems when the objective parameter is set to ‘quantile’ and a corresponding value of alpha is passed.

After training the 3 models, they can be used to obtain predictions.

Training LightGBM models with objective = ‘quantile’

Let us visualize the predictions with the code snippet below:
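Since the original snippet is shown only as an image, here is a hypothetical matplotlib equivalent; the predictions are simulated by shifting the mean line by the corresponding Gaussian quantile (±4.21 ≈ 5 · Φ⁻¹(0.8) for the assumed noise scale of 5):

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # non-interactive backend for saving to a file
import matplotlib.pyplot as plt

# Stand-in data; in the article these are the test set and model predictions
rng = np.random.default_rng(0)
hours = rng.uniform(0, 100, 2_000)
rating = 0.5 * hours + rng.normal(0, 5, 2_000)

plt.scatter(hours, rating, s=3, color='grey', label='true rating')
xs = np.linspace(0, 100, 100)
for alpha, shift in [(0.2, -4.21), (0.5, 0.0), (0.8, 4.21)]:
    plt.plot(xs, 0.5 * xs + shift, label=f'alpha = {alpha}')
plt.xlabel('hours')
plt.ylabel('rating')
plt.legend()
plt.savefig('quantile_predictions.png')
```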

Scatter plot of the predictor (hours) versus the true / predicted target values

From the scatter plot above, it is clear that with greater values of α, models tend to generate more over-estimated results. Additionally, let us compare the predictions of each model with all target values.

Comparison of predictions made by different models
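The comparison itself reduces to measuring, for each model, the fraction of targets that its predictions exceed; a sketch with a hypothetical, perfectly calibrated stand-in predictor:

```python
import numpy as np

def over_estimation_rate(y_true, y_pred):
    """Fraction of targets for which the prediction is greater."""
    return np.mean(np.asarray(y_pred) > np.asarray(y_true))

# Stand-in check: for N(0, 1) noise, the ideal 0.8-quantile prediction is
# the constant 0.8416 (the 0.8 quantile of the standard normal distribution)
rng = np.random.default_rng(0)
y_true = rng.normal(0, 1, 10_000)
rate = over_estimation_rate(y_true, np.full(10_000, 0.8416))
print(rate)  # close to 0.8
```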

This results in the following output:

The pattern in the output is clearly visible: for any α, the predicted values are greater than the true values in roughly α * 100% of cases. Therefore, we can experimentally conclude that our prediction models work correctly.

Prediction errors of quantile regression models are negative in roughly α * 100% of cases and positive in (1 — α) * 100% of cases.

We have discovered quantile loss — a flexible loss function that can be incorporated into any regression model to predict a certain quantile of a variable. Using the example of LightGBM, we saw how to adjust a model so that it solves a quantile regression problem. In fact, many other popular machine learning libraries also allow setting quantile loss as the loss function.

The code used in this article is available here:

All images unless otherwise noted are by the author.
