
Task:

As an example, we have 3 input vectors:

foo = [1, 2, 3, 4, 5, 6]
bar = [50, 60, 70, 80, 90, 100]
spam = [-10, -20, -30, -40, -50, -60]

We also have 4 output vectors that depend linearly on the input vectors:

foofoo = [1, 1, 2, 2, 3, 3]
barbar = [4, 4, 5, 5, 6, 6]
spamspam = [7, 7, 8, 8, 9, 9]
hamham = [10, 10, 11, 11, 12, 12]

How can I use linear regression on this data in Python?

A. Innokentiev

1 Answer


You can use an OLS (Ordinary Least Squares) model from statsmodels, as done here:

# imports
import numpy as np
import statsmodels.api as sm

# input and output vectors from the question
foo = [1, 2, 3, 4, 5, 6]
bar = [50, 60, 70, 80, 90, 100]
spam = [-10, -20, -30, -40, -50, -60]
foofoo = [1, 1, 2, 2, 3, 3]
barbar = [4, 4, 5, 5, 6, 6]
spamspam = [7, 7, 8, 8, 9, 9]
hamham = [10, 10, 11, 11, 12, 12]

# stack the inputs as rows, then transpose so each row is one observation
X = np.array([foo, bar, spam]).T
# add a constant column for the intercept
X = sm.add_constant(X)

This gives the input matrix X:

array([[   1.,    1.,   50.,  -10.],
       [   1.,    2.,   60.,  -20.],
       [   1.,    3.,   70.,  -30.],
       [   1.,    4.,   80.,  -40.],
       [   1.,    5.,   90.,  -50.],
       [   1.,    6.,  100.,  -60.]])
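
Note that bar and spam are exact linear functions of foo (bar = 10*foo + 40 and spam = -10*foo), so this design matrix is rank-deficient. statsmodels will still fit it (by default via a pseudoinverse), but the individual coefficients are then not uniquely determined. A quick rank check with plain NumPy confirms this:

# the 4-column matrix only spans a 2-dimensional space
np.linalg.matrix_rank(X)   # returns 2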

And now you can fit each desired output vector:

resFoo = sm.OLS(endog=foofoo, exog=X).fit()
resBar = sm.OLS(endog=barbar, exog=X).fit()
resSpam = sm.OLS(endog=spamspam, exog=X).fit()
resHam = sm.OLS(endog=hamham, exog=X).fit()

The result gives you the coefficients (for the constant and the three columns foo, bar, and spam):

>>> resFoo.params
array([-0.00063323,  0.0035345 ,  0.01001583, -0.035345  ])
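
If you want more than the raw coefficients, the fitted results object also exposes the usual statsmodels diagnostics, for example R² and a full report (standard statsmodels API, shown here only for the foofoo model):

# goodness of fit and a full regression report
resFoo.rsquared
print(resFoo.summary())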

You can now check it with the input:

>>> np.matrix(X)*np.matrix(resFoo.params).T
matrix([[ 0.85714286],
        [ 1.31428571],
        [ 1.77142857],
        [ 2.22857143],
        [ 2.68571429],
        [ 3.14285714]])

This is close to the desired output foofoo (an exact match is not possible, because foofoo is not an exact linear function of the inputs).
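
Equivalently, and without np.matrix (which is deprecated in newer NumPy versions), you can let the results object compute the predictions:

# same fitted values as the matrix product above
resFoo.predict(X)        # or: X @ resFoo.params, or resFoo.fittedvalues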


See this question for different ways to do the regression: Multiple linear regression in Python
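
If you prefer to fit all four outputs in a single call, scikit-learn's LinearRegression accepts a 2-D target array. Here is a sketch under the assumption that scikit-learn is available; the names X_in, Y, and model are mine:

from sklearn.linear_model import LinearRegression
import numpy as np

# inputs as columns, one row per observation (same layout as X above, without the constant)
X_in = np.array([foo, bar, spam]).T
# stack the four outputs as columns of one target matrix
Y = np.array([foofoo, barbar, spamspam, hamham]).T

model = LinearRegression()   # fits the intercept itself, so no constant column is needed
model.fit(X_in, Y)
model.coef_        # shape (4, 3): one row of coefficients per output vector
model.intercept_   # one intercept per output vector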

agold