
I have a function of the form y = a/(a-(x+b)) and several pairs of data points {x, y}. I need to approximate a and b from those data, so perhaps a nonlinear regression.

How can I do that, and possibly implement it in C#?

Thanks for your answers. Bastien

  • In case you cannot solve this algebraically or geometrically: either do an `O(n^2)` brute-force search (a rough sketch follows these comments), or, if `a,b` depend monotonically on `y` and `x`, use a binary search in `O(log^2(n))`; for arbitrary approximation you can use the answers in the duplicate QA I linked (see the examples in there). – Spektre Nov 10 '17 at 15:33
  • Why can't you do a least squares fit to approximate the function? How accurate do you need to be? A conjugate gradient solution will get you there. Your range has to exclude the point x = (a-b); it's singular there. – duffymo Nov 10 '17 at 15:41
  • [Math.Net](https://numerics.mathdotnet.com/regression.html) has some linear regression stuff that might help you. – Matthew Watson Nov 10 '17 at 15:44
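
A minimal sketch of the `O(n^2)` brute-force idea from Spektre's comment: scan a grid of candidate `(a, b)` pairs and keep the pair with the smallest sum of squared residuals. The `BruteForceFit.Fit` name, the grid bounds, and the `1e-12` singularity guard are all illustrative choices, not anything from the thread:

```csharp
using System;

static class BruteForceFit
{
    // Scan a grid of candidate (a, b) pairs and keep the pair with the
    // smallest sum of squared residuals against the model y = a/(a-(x+b)).
    public static (double a, double b) Fit(double[] xs, double[] ys,
        double aMin, double aMax, double bMin, double bMax, int steps)
    {
        double bestA = aMin, bestB = bMin, bestErr = double.MaxValue;
        for (int i = 0; i <= steps; i++)
        {
            double a = aMin + (aMax - aMin) * i / steps;
            for (int j = 0; j <= steps; j++)
            {
                double b = bMin + (bMax - bMin) * j / steps;
                double err = 0;
                // Accumulate squared residuals; bail out early once this
                // candidate is already worse than the best pair found so far.
                for (int k = 0; k < xs.Length && err < bestErr; k++)
                {
                    double denom = a - (xs[k] + b);
                    if (Math.Abs(denom) < 1e-12) { err = double.MaxValue; break; }
                    double r = a / denom - ys[k];
                    err += r * r;
                }
                if (err < bestErr) { bestErr = err; bestA = a; bestB = b; }
            }
        }
        return (bestA, bestB);
    }
}
```

Calling e.g. `BruteForceFit.Fit(xs, ys, 0.1, 10, -5, 5, 1000)` scans a 1001×1001 grid; narrowing the bounds around the returned pair and re-running refines the estimate.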

1 Answer


Nonlinear regression can be hard. If you know how to do linear regression, you can transform your function and data to turn this into a linear regression problem. In your case, if you define a new variable z = 1/y, then z = (a - (x + b))/a and your problem becomes:

find the line with equation a - x - b = a z, i.e. z = (-1/a) x + (a - b)/a, that best fits the data set {(x0, z0=1/y0), (x1, z1=1/y1), ...}. From the fitted slope m = -1/a and intercept c = (a - b)/a you recover a = -1/m and b = a (1 - c).
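
A minimal C# sketch of this transformation, using a plain ordinary-least-squares line fit with no external libraries; the `LinearizedFit`/`Fit` names and the sample values a = 2, b = 0.5 in `Main` are made up for illustration:

```csharp
using System;

class LinearizedFit
{
    // Fit y = a/(a-(x+b)) by fitting the line z = m*x + c to z = 1/y,
    // where m = -1/a and c = (a-b)/a, then recovering a and b.
    static (double a, double b) Fit(double[] xs, double[] ys)
    {
        int n = xs.Length;
        double sumX = 0, sumZ = 0, sumXX = 0, sumXZ = 0;
        for (int i = 0; i < n; i++)
        {
            double z = 1.0 / ys[i];               // transformed variable z = 1/y
            sumX += xs[i]; sumZ += z;
            sumXX += xs[i] * xs[i]; sumXZ += xs[i] * z;
        }
        // Ordinary least-squares slope and intercept for z = m*x + c
        double m = (n * sumXZ - sumX * sumZ) / (n * sumXX - sumX * sumX);
        double c = (sumZ - m * sumX) / n;
        double a = -1.0 / m;                       // from m = -1/a
        double b = a * (1.0 - c);                  // from c = (a-b)/a
        return (a, b);
    }

    static void Main()
    {
        // Synthetic data generated from a = 2, b = 0.5 (illustrative values only)
        double[] xs = { -1.0, -0.5, 0.0, 0.5, 1.0 };
        double[] ys = new double[xs.Length];
        for (int i = 0; i < xs.Length; i++)
            ys[i] = 2.0 / (2.0 - (xs[i] + 0.5));

        var (a, b) = Fit(xs, ys);
        Console.WriteLine($"a = {a}, b = {b}");   // expect a ~ 2, b ~ 0.5
    }
}
```

Note that minimizing squared error in z is not the same as minimizing it in y, so this linearized fit is usually a good starting point rather than the exact nonlinear optimum; Math.NET's `Fit.Line` (mentioned in the comments) could replace the manual sums here.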

– triple_r