
I am plotting two curves on the same graph. The code goes something like this:

import numpy as np
import matplotlib.pyplot as plt

x = np.array([86,14,19,24,30,55,41,46])
y1 = np.array([96,86,82,78,80,101,161,32])
y2 = np.array([54,54,48,54,57,76,81,12])
y4 = np.concatenate([y2.reshape(8,1), y1.reshape(8,1)], 1)
plt.plot(x, y4)
plt.show()

I need the y-values at some points that are not in the x array, for example at x = 30, 50 and 78.

In Origin there is an option to read data off a graph using the "Screen Reader" tool (i.e. when I point at the graph, I get the value).

Is there an equivalent in python?

.plot is matplotlib.pyplot.plot

  • You can try curve fitting by using SciPy recipes, getting a function to fit the data and then returning the value there. For example, here's a recipe for fitting to a gaussian: http://stackoverflow.com/questions/10143905/python-two-curve-gaussian-fitting-with-non-linear-least-squares – Alex Huszagh Jun 10 '15 at 17:51
  • I just want to read the data. Fitting is an option. Isn't there anything simpler than that? – Abhishek Tripathi Jun 11 '15 at 06:19
  • Is it within the scope or extrapolating? Your example says solely within the scope, in which case fitting is optional and you just choose to average neighboring x values based on a linear model. The error would depend on how the data is aligned. You need some guess within the data, but the guess can be much less accurate (even linear) to approximate since the distance between each point is small. If you're extrapolating, you need to approximate a model for your data. – Alex Huszagh Jun 11 '15 at 06:28
  • within the scope only. Okay. fitting seems the only option. But I am still thinking how does Origin does that. – Abhishek Tripathi Jun 11 '15 at 06:33
  • Ok well if it's solely within the scope I can provide a simple answer. It will work with well-behaved data only. – Alex Huszagh Jun 11 '15 at 06:43
  • Umm.. I don't know if the data is well behaved or not. Have you had a look at "Screen Reader"in Origin? That is the exact thing I want. – Abhishek Tripathi Jun 11 '15 at 06:48
  • Let me know if you have any other questions: that should be good enough for most applications. If you are willing to actually curve fit and then optimize a function (costly, but much more accurate), the error is substantially lower. The benefits of doing so are minimal for good datasets with estimating data within the range (as I show), but substantial for extrapolation. – Alex Huszagh Jun 11 '15 at 07:11
  • Thanks @AlexanderHuszagh ... Function is not linear. So, I guess curve fitting should be the option. Thanks a lot! – Abhishek Tripathi Jun 12 '15 at 09:43
  • I'm trying to get at that if your data is non-linear but not sparse, a linear approximation close between data points is a good approximation. If you've found a useful response, can you mark the question as answered? – Alex Huszagh Jun 12 '15 at 15:33
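As the comments suggest, one route is to fit a model and evaluate it at the points you need. A minimal sketch using `numpy.polyfit` on the question's data (the degree-2 polynomial is a hypothetical model choice; the right model depends on what the data actually represents):

```python
import numpy as np

# Data from the question (using the second y-series as an example)
x = np.array([86, 14, 19, 24, 30, 55, 41, 46])
y2 = np.array([54, 54, 48, 54, 57, 76, 81, 12])

# Hypothetical model: a degree-2 polynomial fit by least squares
coeffs = np.polyfit(x, y2, 2)

# Evaluate the fitted polynomial at points not in the x array
for xi in (30, 50, 78):
    print(xi, np.polyval(coeffs, xi))
```

For non-polynomial models you would use `scipy.optimize.curve_fit` with your own model function instead.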

1 Answer


Here's a simple method, but I'm going to highlight the limitations. It's great since it requires no complex analysis, but if the data is sparse, or deviates significantly from linear behavior, it's a poor approximation.

>>> import matplotlib.pyplot as plt
>>> plt.figure()
<matplotlib.figure.Figure object at 0x7f643a794610>
>>> import numpy as np
>>> bins = np.arange(10)
>>> quad = bins**2
>>> lin = 7*bins-12
>>> plt.plot(bins, quad, color="blue")
[<matplotlib.lines.Line2D object at 0x7f643298e710>]
>>> plt.plot(bins, lin, color="red")
[<matplotlib.lines.Line2D object at 0x7f64421f2310>]
>>> plt.show()

Plot of the linear fit compared to quadratic.

Here I show a plot with a quadratic function [f(x) = x^2] and then the linear fit between points at (3,9) and (4,16).

I can approximate this easily by getting the slope:

m = (y2-y1)/(x2-x1) = (16-9)/(4-3) = 7

I can then find the linear function by plugging in the value at a point; I'll use (3,9):

f(x) = 7x + b
9 = 7(3) + b
b = -12
f(x) = 7x - 12

And now we can approximate any value between 3 and 4.
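This hand calculation is exactly what `numpy.interp` automates: it does piecewise-linear interpolation between your data points (note that it requires the x-values to be increasing, so unsorted data must be sorted first):

```python
import numpy as np

bins = np.arange(10)   # same x-grid as the example above
quad = bins ** 2       # the "true" quadratic data

# Same piecewise-linear interpolation as the hand calculation
print(np.interp(3.5, bins, quad))  # 12.5, matching f(3.5) = 7*3.5 - 12

# For unsorted data like the question's, sort by x first:
x = np.array([86, 14, 19, 24, 30, 55, 41, 46])
y1 = np.array([96, 86, 82, 78, 80, 101, 161, 32])
order = np.argsort(x)
print(np.interp([30, 50, 78], x[order], y1[order]))
```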

The error grows quickly as the data points get farther apart; between closely spaced points, however, it's quite accurate. For example:

# quadratic
f(3.5) = 3.5**2 = 12.25
# linear
f(3.5) = 7*3.5 - 12 = 12.5

Since relative error is just (approximation - true)/(true), we get (12.5 - 12.25)/12.25, or an error of about +2% (NOT BAD).

However, if we try this for f(50), we get:

# quadratic
f(50) = 2500
# linear
f(50) = 7*50 - 12 = 338

Or an error of about -86%: that far outside the range the line was built from, the approximation is unusable.
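The contrast above can be checked numerically, measuring the error of the linear value relative to the true quadratic value:

```python
def rel_err(approx, true):
    # relative error of the approximation against the true value
    return (approx - true) / true

# Inside the data range: linear fit between (3,9) and (4,16)
inside = 7 * 3.5 - 12
print(rel_err(inside, 3.5 ** 2))  # ~ +0.02, i.e. about +2%

# Extrapolating far outside the range with the same line
outside = 7 * 50 - 12
print(rel_err(outside, 50 ** 2))  # ~ -0.86, i.e. about -86%
```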

Alex Huszagh