lrtutorial(3) - Linear/ridge regression tutorial (linear_regression)

Introduction

Linear regression and ridge regression are simple machine learning techniques that aim to estimate the parameters of a linear model. Assuming we have $n$ predictor points $\mathbf{x_i}$, $0 \le i < n$, of dimensionality $d$, and $n$ responses $y_i$, $0 \le i < n$, we are trying to estimate the best fit for $\beta_i$, $0 \le i \le d$, in the linear model

$y_i = \beta_0 + \displaystyle\sum_{j = 1}^{d} \beta_j x_{ij}$

for each predictor $\mathbf{x_i}$ and response $y_i$. If we take each predictor $\mathbf{x_i}$ as a row in the matrix $\mathbf{X}$ and each response $y_i$ as an entry of the vector $\mathbf{y}$, we can represent the model in vector form:

$\mathbf{y} = \mathbf{X} \mathbf{\beta} + \beta_0$

The result of this method is the vector $\mathbf{\beta}$, including the offset term (or intercept term) $\beta_0$. mlpack provides:

  • a simple command-line executable to perform linear regression or ridge regression
  • a simple C++ interface to perform linear regression or ridge regression
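
To make the estimation described above concrete, below is a minimal standalone sketch (not mlpack's implementation) that solves the normal equations $\hat{\mathbf{\beta}} = (\mathbf{X}' \mathbf{X})^{-1} \mathbf{X}' \mathbf{y}$ with Armadillo, appending a column of ones so that the first coefficient is the intercept $\beta_0$:

#include <armadillo>

int main()
{
  // Each row is a predictor point; the response vector has one entry per row.
  arma::mat X = {{0.0}, {1.0}, {2.0}, {3.0}, {4.0}};
  arma::vec y = {0.0, 1.0, 2.0, 3.0, 4.0};

  // Prepend a column of ones so that beta(0) is the offset term beta_0.
  arma::mat Xi = arma::join_rows(arma::ones<arma::vec>(X.n_rows), X);

  // Solve the normal equations (Xi' Xi) beta = Xi' y.
  arma::vec beta = arma::solve(Xi.t() * Xi, Xi.t() * y);

  beta.print("beta (intercept first):");
  return 0;
}

For this toy dataset the solution is $\beta_0 = 0$, $\beta_1 = 1$; the linear_regression executable produces the same coefficients for this data, as shown later in this tutorial.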

Table of Contents

A list of all the sections this tutorial contains.

  • Introduction
  • Table of Contents
  • Command-Line 'linear_regression'
      • One file, generating the function coefficients
      • Compute model and predict at the same time
      • Prediction using a precomputed model
      • Using ridge regression
  • The 'LinearRegression' class
      • Generating a model
      • Setting a model
      • Load a model from a file
      • Prediction
      • Setting lambda for ridge regression
  • Further documentation

Command-Line 'linear_regression'

The simplest way to perform linear regression or ridge regression in mlpack is to use the linear_regression executable. This program will perform linear regression and place the resultant coefficients into one file.

The output file holds a vector of coefficients in increasing order of dimension; that is, the offset term ($\beta_0$), the coefficient for dimension 1 ($\beta_1$), then dimension 2 ($\beta_2$), and so forth, as well as the intercept. This executable can also predict the $y$ values of a second dataset based on the computed coefficients.
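
For illustration, here is a hypothetical snippet (not part of mlpack) that applies a saved coefficient vector to a single point by hand, assuming the layout just described (intercept first, then one coefficient per dimension):

#include <armadillo>

int main()
{
  // Load the saved coefficients, e.g. [-0.0, 1.0] from the examples below.
  arma::rowvec params;
  params.load("parameters.csv", arma::csv_ascii);

  arma::rowvec x = {2.0}; // A d-dimensional test point (here d = 1).

  // yhat = beta_0 + sum_j beta_j x_j.
  double yhat = params(0) +
      arma::dot(params.subvec(1, params.n_elem - 1), x);
  // For the example model below, yhat is 2.0.
  return 0;
}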

Below are several examples of simple usage (and the resultant output). The '-v' option is used so that verbose output is given. Further documentation on each individual option can be found by typing

$ linear_regression --help

One file, generating the function coefficients

$ linear_regression --input_file dataset.csv -v
[INFO ] Loading 'dataset.csv' as CSV data.
[INFO ] Saving CSV data to 'parameters.csv'.
[INFO ]
[INFO ] Execution parameters:
[INFO ]   help: false
[INFO ]   info: ""
[INFO ]   input_file: dataset.csv
[INFO ]   input_responses: ""
[INFO ]   lambda: 0
[INFO ]   output_file: parameters.csv
[INFO ]   output_predictions: predictions.csv
[INFO ]   test_file: ""
[INFO ]   verbose: true
[INFO ]
[INFO ] Program timers:
[INFO ]   load_regressors: 0.006461s
[INFO ]   regression: 0.000347s
[INFO ]   total_time: 0.026589s

Convenient program timers are given for different parts of the calculation at the bottom of the output, along with the parameters the program was run with. Now, let us look at the input file and the output file, which, unless otherwise specified, is parameters.csv:

$ cat dataset.csv
0,0
1,1
2,2
3,3
4,4
$ cat parameters.csv
-0.0000000000e+00,1.0000000000e+00

As you can see, the fitted function for this input is $y = 0 + 1 x_1$. Keep in mind that in this example, the responses for the dataset are in the second column. That is, the dataset is one dimensional, and the last column holds the $y$ values, or responses, for each row. You can specify these responses in a separate file if you want, using the --input_responses, or -r, option.
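
If your responses are in the last column of the dataset file, the split looks like the sketch below (an illustration only, not mlpack code; it treats each row as a point, matching the CSV layout above):

#include <armadillo>

int main()
{
  arma::mat dataset;
  dataset.load("dataset.csv", arma::csv_ascii);

  // All columns except the last are predictors; the last column is y.
  arma::mat X = dataset.cols(0, dataset.n_cols - 2);
  arma::vec y = dataset.col(dataset.n_cols - 1);
  return 0;
}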

Compute model and predict at the same time

$ linear_regression --input_file dataset.csv --test_file predict.csv -v
[INFO ] Loading 'dataset.csv' as CSV data.
[INFO ] Saving CSV data to 'parameters.csv'.
[INFO ] Loading 'predict.csv' as CSV data.
[INFO ] Saving CSV data to 'predictions.csv'.
[INFO ]
[INFO ] Execution parameters:
[INFO ]   help: false
[INFO ]   info: ""
[INFO ]   input_file: dataset.csv
[INFO ]   input_responses: ""
[INFO ]   lambda: 0
[INFO ]   model_file: ""
[INFO ]   output_file: parameters.csv
[INFO ]   output_predictions: predictions.csv
[INFO ]   test_file: predict.csv
[INFO ]   verbose: true
[INFO ]
[INFO ] Program timers:
[INFO ]   load_regressors: 0.000360s
[INFO ]   load_test_points: 0.000090s
[INFO ]   prediction: 0.000006s
[INFO ]   regression: 0.000335s
[INFO ]   total_time: 0.001522s
$ cat dataset.csv
0,0
1,1
2,2
3,3
4,4
$ cat parameters.csv
-0.0000000000e+00,1.0000000000e+00
$ cat predict.csv
2
3
4
$ cat predictions.csv
2.0000000000e+00
3.0000000000e+00
4.0000000000e+00

We used the same dataset, so we got the same parameters. The key thing to note about the predict.csv dataset is that it has the same dimensionality as the dataset used to create the model: one. Generally, if the dataset used to generate the model has $d$ dimensions, so must the dataset we want to predict for.
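
A hypothetical guard expressing this requirement (not an mlpack API; it assumes each row is a point, so the number of columns is the dimensionality $d$):

#include <armadillo>
#include <stdexcept>

void CheckDimensionality(const arma::mat& trainX, const arma::mat& testX)
{
  // trainX holds only predictors here (responses already stripped).
  if (testX.n_cols != trainX.n_cols)
    throw std::invalid_argument("test points must have the same "
        "dimensionality as the training predictors");
}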

Prediction using a precomputed model

$ linear_regression --model_file parameters.csv --test_file predict.csv -v
[INFO ] Loading 'parameters.csv' as CSV data.
[INFO ] Loading 'predict.csv' as CSV data.
[INFO ] Saving CSV data to 'predictions.csv'.
[INFO ]
[INFO ] Execution parameters:
[INFO ]   help: false
[INFO ]   info: ""
[INFO ]   input_file: ""
[INFO ]   input_responses: ""
[INFO ]   lambda: 0
[INFO ]   model_file: parameters.csv
[INFO ]   output_file: parameters.csv
[INFO ]   output_predictions: predictions.csv
[INFO ]   test_file: predict.csv
[INFO ]   verbose: true
[INFO ]
[INFO ] Program timers:
[INFO ]   load_model: 0.009519s
[INFO ]   load_test_points: 0.000067s
[INFO ]   prediction: 0.000007s
[INFO ]   total_time: 0.010081s
$ cat parameters.csv
-0.0000000000e+00,1.0000000000e+00
$ cat predict.csv
2
3
4
$ cat predictions.csv
2.0000000000e+00
3.0000000000e+00
4.0000000000e+00

Using ridge regression

Sometimes, the input matrix of predictors has a covariance matrix that is not invertible, or the system is overdetermined. In this case, ridge regression is useful: it adds a regularization term to the covariance matrix to make it invertible. Ridge regression is a standard technique and documentation for the mathematics behind it can be found anywhere on the Internet. In short, the covariance matrix

$\mathbf{X}' \mathbf{X}$

is replaced with

$\mathbf{X}' \mathbf{X} + \lambda \mathbf{I}$

where $\mathbf{I}$ is the identity matrix. So, a $\lambda$ parameter greater than zero should be specified to perform ridge regression, using the --lambda (or -l) option. An example is given below.

$ linear_regression --input_file test_data_3_1000.csv -v --lambda 0.5
[INFO ] Loading 'test_data_3_1000.csv' as CSV data.  Size is 3 x 1000.
[INFO ] Saving CSV data to 'parameters.csv'.
[INFO ]
[INFO ] Execution parameters:
[INFO ]   help: false
[INFO ]   info: ""
[INFO ]   input_file: test_data_3_1000.csv
[INFO ]   input_responses: ""
[INFO ]   lambda: 0.5
[INFO ]   model_file: ""
[INFO ]   output_file: parameters.csv
[INFO ]   output_predictions: predictions.csv
[INFO ]   test_file: ""
[INFO ]   verbose: true
[INFO ]
[INFO ] Program timers:
[INFO ]   load_regressors: 0.005236s
[INFO ]   loading_data: 0.005208s
[INFO ]   regression: 0.013206s
[INFO ]   saving_data: 0.000276s
[INFO ]   total_time: 0.020019s
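
For intuition, here is a minimal sketch of the closed-form ridge solution in Armadillo (not mlpack's implementation; intercept handling is omitted for brevity):

#include <armadillo>

int main()
{
  arma::mat X(1000, 3, arma::fill::randu); // Each row is a predictor point.
  arma::vec y(1000, arma::fill::randu);    // One response per point.
  const double lambda = 0.5;

  // beta = (X'X + lambda I)^-1 X'y.
  arma::mat A = X.t() * X +
      lambda * arma::eye<arma::mat>(X.n_cols, X.n_cols);
  arma::vec beta = arma::solve(A, X.t() * y);

  beta.print("ridge coefficients:");
  return 0;
}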

Further documentation on options can be found by using the --help option.

The 'LinearRegression' class

The 'LinearRegression' class is a simple implementation of linear regression.

Using the LinearRegression class is very simple. It has two available constructors: one that generates a model from a matrix of predictors and a vector of responses, and one that loads an already computed model from a given file.

The class provides one method that performs computation:

void Predict(const arma::mat& points, arma::vec& predictions);

Once you have generated or loaded a model, you can call this method and pass it a matrix of data points to predict values for using the model. The second parameter, predictions, will be modified to contain the predicted values corresponding to each row of the points matrix.
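
For orientation, a short end-to-end sketch combining these steps is below; the sections that follow walk through each piece individually.

#include <mlpack/methods/linear_regression/linear_regression.hpp>

using namespace mlpack::regression;

int main()
{
  arma::mat data;        // Training predictors.
  arma::vec responses;   // Training responses, one per point.
  // ... fill data and responses here ...

  LinearRegression lr(data, responses); // Fit the model.

  arma::mat points;      // Points to predict for (same dimensionality).
  arma::vec predictions; // Will be overwritten by Predict().
  lr.Predict(points, predictions);
  return 0;
}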

Generating a model

#include <mlpack/methods/linear_regression/linear_regression.hpp>
using namespace mlpack::regression;
arma::mat data; // The dataset itself.
arma::vec responses; // The responses, one row for each row in data.
// Regress.
LinearRegression lr(data, responses);
// Get the parameters, or coefficients.
arma::vec parameters = lr.Parameters();

Setting a model

Assuming you already have a model and do not need to create one, this is how you would set the parameters for a LinearRegression instance.

arma::vec parameters; // Your model.
LinearRegression lr; // Create a new LinearRegression instance or reuse one.
lr.Parameters() = parameters; // Set the model.

Load a model from a file

If you have a generated model in a file somewhere you would like to load and use, you can simply pass it to the LinearRegression constructor like so.

std::string filename; // The path and name of your file.
LinearRegression lr(filename); // Will load the model internally.

Prediction

Once you have generated or loaded a model using one of the aforementioned methods, you can predict values for a dataset.

LinearRegression lr;
// Load or generate your model.
// The dataset we want to predict on; each row is a data point.
arma::mat points;
// This will store the predictions; one row for each point.
arma::vec predictions;
lr.Predict(points, predictions); // Predict.
// Now, the vector 'predictions' will contain the predicted values.

Setting lambda for ridge regression

As discussed in Using ridge regression, ridge regression is useful when the covariance matrix of the predictors is not invertible. The standard constructor can be used to set a value of lambda:

#include <mlpack/methods/linear_regression/linear_regression.hpp>
using namespace mlpack::regression;
arma::mat data; // The dataset itself.
arma::vec responses; // The responses, one row for each row in data.
// Regress, with a lambda of 0.5.
LinearRegression lr(data, responses, 0.5);
// Get the parameters, or coefficients.
arma::vec parameters = lr.Parameters();

In addition, the Lambda() function can be used to get or modify the lambda value:

LinearRegression lr;
lr.Lambda() = 0.5;
Log::Info << "Lambda is " << lr.Lambda() << "." << std::endl;

Further documentation

For further documentation on the LinearRegression class, consult the complete API documentation.