Quantile regression through linear programming

Least squares linear regression can be seen as estimating the conditional mean of the response distribution; quantile regression analogously estimates its conditional quantiles. With quantile regression we obtain curves — “regression quantiles” — that, together with the least squares regression curve, give a more complete picture of the distributions of the Y’s corresponding to a set of X’s.

The Quantile Regression Problem (QRP) is formulated as the minimization of a sum of asymmetrically weighted absolute residuals. Further, QRP can be re-formulated as a linear programming problem, which allows for efficient computation. To compute quantiles other than the median the so-called “tilted” absolute value function (also known as the “check” function) is used. (See this Wikipedia entry.)
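To make the tilted function concrete, here is a small sketch in Python (not the Mathematica package) of the check function ρ_τ(u), which weights positive residuals by τ and negative residuals by τ − 1. Minimizing its sum over a data set with a constant model recovers the sample quantile; the data values below are purely illustrative.

```python
import numpy as np

def tilted_loss(tau, u):
    # Koenker-Bassett "check" function rho_tau(u): weights positive
    # residuals by tau and negative residuals by (tau - 1).
    return np.where(u >= 0, tau * u, (tau - 1) * u)

# Minimizing the summed tilted loss over a constant recovers the
# sample quantile: try each data point as a candidate minimizer.
data = np.array([1.0, 2.0, 3.0, 4.0, 100.0])
tau = 0.5
losses = [tilted_loss(tau, data - c).sum() for c in data]
best = data[int(np.argmin(losses))]
```

Note that for τ = 0.5 the tilted loss is just half the absolute value, so the minimizer is the sample median even in the presence of the outlier 100.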

For a complete, interesting, and colorful introduction to and justification of quantile regression, I refer the reader to this article:

Roger Koenker and Gilbert Bassett Jr., “Regression Quantiles”, Econometrica, Vol. 46, No. 1 (Jan. 1978), pp. 33–50. JSTOR URL: http://links.jstor.org/sici?sici=0012-9682%28197801%2946%3A1%3C33%3ARQ%3E2.0.CO%3B2-J

I recently implemented and uploaded a Mathematica package for computing regression quantiles using Mathematica’s function LinearProgramming — see the package QuantileRegression.m provided by the MathematicaForPrediction project at GitHub. Also see this guide for using the function QuantileRegressionFit provided by the package. The guide has some theoretical explanations and shows how the quantile regression problem can be formulated as a linear programming problem.

Below are presented several examples of regression quantiles computed over different data sets with the function QuantileRegressionFit provided by the package. The signature of QuantileRegressionFit is very similar to that of Mathematica’s function Fit. Here is the definition statement:

QuantileRegressionFit::usage = "QuantileRegressionFit[data,funs,var,qs] finds the regression quantiles corresponding to the quantiles qs for a list of data as linear combinations of the functions funs of the variable var."

QuantileRegressionFit has a Method option and can use both Minimize and LinearProgramming, but the computations with Minimize are quite slow for larger data sets. (The implementation with Minimize is included for didactic purposes.) All QuantileRegressionFit results presented below are done with the linear programming implementation.
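To illustrate the linear programming re-formulation outside of Mathematica, here is a hedged sketch in Python using scipy.optimize.linprog — my own assumption for illustration, not the package’s implementation. Each residual y_i − f(x_i) is split into nonnegative parts u_i − v_i, so minimizing the summed tilted loss becomes the linear objective τ·Σu + (1−τ)·Σv under equality constraints:

```python
import numpy as np
from scipy.optimize import linprog

def quantile_fit(X, y, tau):
    # Solve min sum_i rho_tau(y_i - X b) as a linear program.
    # Split each residual into u_i - v_i with u, v >= 0; then the
    # objective tau*sum(u) + (1 - tau)*sum(v) is linear in (b, u, v).
    n, p = X.shape
    c = np.concatenate([np.zeros(p), np.full(n, tau), np.full(n, 1 - tau)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])   # X b + u - v = y
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]                               # fitted coefficients b

# Illustrative data: skewed (exponential) noise over a logarithmic curve,
# fitted with the basis {1, x, x^(1/2), log(x)}.
rng = np.random.default_rng(0)
x = rng.uniform(1, 10, 200)
y = np.log(x) + rng.exponential(0.5, 200)
X = np.column_stack([np.ones_like(x), x, np.sqrt(x), np.log(x)])
b_median = quantile_fit(X, y, 0.5)
```

As a quick sanity check of the LP, for the median fit roughly half of the Y’s should lie on or above the fitted curve.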

First let us consider data generated by adding skewed noise over a logarithmic curve:

Logarithmic curve with skewed noise

Pretending that we don’t know how the data was generated, just by looking at the plot we might conjecture that the model for the data is

Y = b0 + b1 X + b2 X^(1/2) + b3 log(X).

Using this model we find regression quantiles corresponding to 0.05, 0.25, 0.5, 0.75, 0.95. For comparison we also find the least squares fit of the model. Here is a plot that shows the regression quantiles for that data together with the least squares fit of the model functions:

Regression quantiles for a logarithmic curve with skewed noise

We can check whether the regression quantiles partition the data in the expected way. Here is a table that shows the fraction of the data points with Y’s greater than or equal to the corresponding regression quantile:

Logarithmic curve with skewed noise q-table

Here is a plot with the regression quantiles of a more interesting data set:

Regression quantiles for sin noise over a parabola

Here is the table for the data partition tests:

Sin noise over a parabola q-table

I further made some performance profiling tests. First we need to choose a family (or several families) of test data. Also, since Mathematica’s function LinearProgramming has several methods, it is a good idea to test with all of them. Here I am going to show results only with one family of data and two LinearProgramming methods. The data family is the skewed noise over a logarithmic curve used as an example above. The first LinearProgramming method is Mathematica’s default “InteriorPoint”; the second is “CLP”, which uses the built-in COIN-OR CLP optimizer. I ran the profiling tests using one quantile, {0.5}, and five quantiles, {0.05, 0.25, 0.5, 0.75, 0.95}, shown in blue and red respectively. I also ran tests with different sets of model functions, {1, x, x^(1/2), log(x)} and {1, x, log(x)}, but there was no significant difference in the timings (less than 2%).

Performance profile InteriorPoint method

Performance profile CLP method

It is interesting to note that the average ratio of the timings with 1 vs. 5 quantiles is 0.38 for “InteriorPoint” and 0.5 for “CLP”.

In another post I am going to show examples of the robustness of regression quantiles when the data has outliers.

Update: Because I implemented calculation of regression quantiles with B-splines I had to rename the function QuantileRegression into QuantileRegressionFit. I made the corresponding changes in the blog post. (Except the profile graphics captions.)
