
I have some data that doesn't fit a linear regression:

[plot of the measured data]

In fact, it should fit a quadratic function exactly:

P = R*I**2

Currently I'm doing this:

from sklearn.linear_model import LinearRegression

model = LinearRegression()
X = alambres[alambre]['mediciones'][x].reshape(-1, 1)
Y = alambres[alambre]['mediciones'][y].reshape(-1, 1)
model.fit(X, Y)

Is there any chance to solve it by doing something like:

model.fit([X,X**2],Y)
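For what it's worth, a minimal sketch of that stacking idea with synthetic data standing in for the `alambres` measurements: the two columns just need to be combined into a single 2-D design matrix (e.g. with `np.column_stack`) before calling `fit`.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.arange(1, 11, dtype=float)      # stand-in for the real measurements
Y = 0.5 * X**2                         # P = R*I**2 with R = 0.5

features = np.column_stack([X, X**2])  # design matrix, shape (n_samples, 2)

model = LinearRegression()
model.fit(features, Y)
print(model.coef_)  # coefficient on the X**2 column recovers R (about 0.5)
```

`model.coef_` holds one coefficient per column, so the second entry is the estimate of R.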
iacob
Luis Ramon Ramirez Rodriguez
    An example of how to use `scikit-learn` with polynomial features can be found here http://scikit-learn.org/stable/modules/linear_model.html#polynomial-regression-extending-linear-models-with-basis-functions – piman314 May 09 '16 at 12:53
  • Does this answer your question? [polynomial regression using python](https://stackoverflow.com/questions/31406975/polynomial-regression-using-python) – iacob Mar 28 '21 at 16:14

2 Answers


You can use NumPy's `np.polyfit`.

import numpy as np
from matplotlib import pyplot as plt

X = np.linspace(0, 100, 50)
Y = 23.24 + 2.2*X + 0.24*(X**2) + 10*np.random.randn(50)  # add some noise
coefs = np.polyfit(X, Y, 2)  # fit a degree-2 polynomial; highest power first
print(coefs)
p = np.poly1d(coefs)
plt.plot(X, Y, "bo", markersize=2)
plt.plot(X, p(X), "r-")  # p(X) evaluates the fitted polynomial at X
plt.show()

Out:

[  0.24052058   2.1426103   25.59437789]

[plot of the noisy data with the fitted quadratic curve]

ayhan
    Sorry I didn't notice that the title asks for a sklearn solution. I'll leave this as an alternative. – ayhan May 09 '16 at 08:58

Use `PolynomialFeatures` from `sklearn.preprocessing`.

import numpy as np
from sklearn.preprocessing import PolynomialFeatures

x = np.array([[1], [2], [3]])
X = PolynomialFeatures(degree=2).fit_transform(x)  # columns: 1, x, x**2
X

Output:

array([[1., 1., 1.],
       [1., 2., 4.],
       [1., 3., 9.]])
Brad B
  • You need to combine the polynomial feature generation with a linear regression to perform polynomial regression in SKLearn. – iacob Mar 28 '21 at 16:19
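Expanding on that comment, a minimal sketch (with made-up data in place of the real measurements) that chains `PolynomialFeatures` into `LinearRegression` using a `Pipeline`:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Synthetic P = R*I**2 data with R = 2.0
I = np.linspace(0, 5, 20).reshape(-1, 1)
P = 2.0 * I**2

# PolynomialFeatures generates the 1, I, I**2 columns;
# LinearRegression then fits a coefficient for each.
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(I, P.ravel())

# The coefficient on the I**2 column recovers R (about 2.0)
print(model.named_steps["linearregression"].coef_)
```

The pipeline lets you call `model.predict(new_I)` directly on raw inputs, with the polynomial expansion applied automatically.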