
Introduction

Let’s assume that I need the Jacobian matrix for the following set of ODEs:

dxdt[ 0 ] = -90.0 * x[0] - 50.0 * x[1];
dxdt[ 1 ] = x[0] + 3*x[1];
dxdt[ 2 ] = x[1] + 50*x[2];

In Matlab/Octave this would be pretty easy:

syms x0 x1 x2;
f = [-90.0*x0-50.0*x1, x0+3*x1, x1+50*x2]
v=[x0, x1, x2]
fp = jacobian(f,v)

This results in the following output matrix:

[-90  -50  0 ]
[ 1    3   0 ]
[ 0    1   50]

What I need

Now I want to reproduce the same result in C++. I can’t compute the Jacobian beforehand and hard-code it, because it will depend, for example, on user inputs and on time. So my question is: how do I do this? For mathematical operations I usually use the Boost libraries, but in this case I can’t find any solution. There is only a short note about this for implicit systems, and the following code doesn’t work:

sys.second( x , jacobi , t )

It also requires the time (t), so it probably doesn’t generate an analytic form of the solution. Do I misunderstand the documentation? Or should I use another function? I would prefer to stay within Boost, as I need the Jacobian as a ublas::matrix and I want to avoid conversions.

EDIT:

More specifically, I will use the Jacobian inside the rosenbrock4 ODE solver. Example here - lines 47-52. I need automatic generation of this structure (written out by hand below for illustration), as the set of ODEs may change later and I want to avoid rewriting the Jacobian manually every time. Also, some variables inside the ODE definitions are not constant in time.
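
For the three-equation system above, that hand-written structure would look roughly like this (a sketch only; the typedefs follow the odeint stiff_system example, and the functor name my_system_jacobi is just illustrative). This is exactly the kind of code I want to generate automatically instead of maintaining by hand:

#include <boost/numeric/ublas/vector.hpp>
#include <boost/numeric/ublas/matrix.hpp>

typedef boost::numeric::ublas::vector< double > vector_type;
typedef boost::numeric::ublas::matrix< double > matrix_type;

// Hand-written Jacobian for the system above, in the form rosenbrock4
// expects: fill J = dF/dx and dfdt = dF/dt.
struct my_system_jacobi
{
    void operator()( const vector_type &x , matrix_type &J ,
                     const double &t , vector_type &dfdt )
    {
        J( 0 , 0 ) = -90.0; J( 0 , 1 ) = -50.0; J( 0 , 2 ) = 0.0;
        J( 1 , 0 ) =   1.0; J( 1 , 1 ) =   3.0; J( 1 , 2 ) = 0.0;
        J( 2 , 0 ) =   0.0; J( 2 , 1 ) =   1.0; J( 2 , 2 ) = 50.0;
        // this particular system does not depend explicitly on t
        dfdt[ 0 ] = 0.0; dfdt[ 1 ] = 0.0; dfdt[ 2 ] = 0.0;
    }
};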

Karls

2 Answers


I know this is long after the fact, but I have recently been wanting to do the same thing and have come across many automatic differentiation (AD) libraries that do this pretty well. I have mostly been using Eigen’s AD because I am already using Eigen everywhere. Here’s an example of how you can use Eigen’s AD to get the Jacobian like you asked.
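
Something along these lines; a minimal sketch using the unsupported AutoDiff module that ships with Eigen (the rhs function name, the fixed size 3, and the evaluation point are placeholders for your own system):

#include <iostream>
#include <Eigen/Core>
#include <unsupported/Eigen/AutoDiff>

// Right-hand side of the ODE set from the question, templated on the
// scalar type so it can be evaluated with AutoDiffScalar as well as double.
template< typename Scalar >
Eigen::Matrix< Scalar, 3, 1 > rhs( const Eigen::Matrix< Scalar, 3, 1 > &x )
{
    Eigen::Matrix< Scalar, 3, 1 > dxdt;
    dxdt( 0 ) = -90.0 * x( 0 ) - 50.0 * x( 1 );
    dxdt( 1 ) = x( 0 ) + 3.0 * x( 1 );
    dxdt( 2 ) = x( 1 ) + 50.0 * x( 2 );
    return dxdt;
}

int main()
{
    typedef Eigen::AutoDiffScalar< Eigen::Vector3d > ADScalar;

    // Seed the input: each component carries its value plus a unit
    // derivative in its own direction.
    Eigen::Matrix< ADScalar, 3, 1 > x;
    for( int i = 0; i < 3; ++i )
    {
        x( i ).value() = 1.0;                      // arbitrary evaluation point
        x( i ).derivatives() = Eigen::Vector3d::Unit( i );
    }

    // One evaluation of the function propagates all the derivatives.
    Eigen::Matrix< ADScalar, 3, 1 > f = rhs( x );

    // Row i of the Jacobian is df_i/dx.
    Eigen::Matrix3d J;
    for( int i = 0; i < 3; ++i )
        J.row( i ) = f( i ).derivatives().transpose();

    std::cout << J << std::endl;   // same 3x3 matrix as the Octave output above
    return 0;
}

Because AD differentiates whatever the function computes at the current arguments, re-evaluating it inside your solver loop gives you the Jacobian at the current state (and time, if you pass t in as an extra argument).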

There's also a long list of c++ AD libraries on autodiff.org.

Hope this helps someone!

mmmfarrell
  • Isn't AD more complicated than a numerical solution? I've been trying to use the Jacobian to speed up a numerical method and I worry that it could conflict with that; I'm not familiar with this tool. And how about the change of the Jacobian in time? If I had to recompute it manually every time, it might be slower than my previous method without the Jacobian. Or is it maybe possible to integrate both, the Jacobian and the ODE solving, in one library to automate this? – Karls Feb 05 '19 at 20:32
  • Speed shouldn't be a factor; the benefit of AD is that you get the Jacobian essentially for free after computing the function. You can read more about how this happens with dual numbers, etc., but I don't think speed should be an issue. From the example you posted, it seems like AD would be very easy to implement for you, because most AD libraries I have seen use functors like you already have in your [example](https://github.com/headmyshoulder/odeint-v2/blob/5dd9519b7b603b7dd3d25d06900012405667426a/examples/stiff_system.cpp#L43). Your functor would just return only the ODE output, and AD gives you the Jacobian. – mmmfarrell Feb 06 '19 at 15:53
  • Can you refer me to an example of how to do this? Based on the information you have provided so far, I'm not able to figure out how to connect these two parts of my code. As you may see from my Matlab example, I'm closer to a scientist than to a real programmer. – Karls Feb 06 '19 at 17:36
  • Here's an example using an AD library that I have helped develop a little. It is built around Eigen's AD. The [test here](https://github.com/superjax/nano_autodiff/blob/e698746076698c4db6060be61f25753a8f81548f/src/test_autodiff_dynamics.cpp#L73) uses AD to compute the Jacobians and compares them against analytical Jacobians. If this isn't what you are trying to do, maybe I am misunderstanding. – mmmfarrell Feb 06 '19 at 21:03
  • I need to integrate it with the ODE solver, so a static Jacobian for t=0 will not help. – Karls Feb 07 '19 at 13:25

The Jacobian is based on the derivatives of the function. If the function f is only known at run time (and there are no constraints such as linearity), you have to automate the differentiation. If you want this to happen exactly (as opposed to a numerical estimation), you need to use symbolic computation. Look, for example, here and here for libraries supporting this.

Note that the Jacobian usually depends on the state and the time, so it’s impossible to represent it as a constant matrix (such as in your example), unless your problem is so boring that you can solve it analytically anyway.
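
If a numerical estimation is good enough for your purposes, it can be generated directly from the system functor and evaluated at the current state and time; here is a rough sketch in the ublas types that rosenbrock4 uses (the numerical_jacobi wrapper and the step-size choice are my own, not part of odeint):

#include <algorithm>
#include <cmath>
#include <cstddef>
#include <boost/numeric/ublas/vector.hpp>
#include <boost/numeric/ublas/matrix.hpp>

typedef boost::numeric::ublas::vector< double > vector_type;
typedef boost::numeric::ublas::matrix< double > matrix_type;

// Wraps any odeint-style system functor and estimates the Jacobian by
// forward differences, in the call form rosenbrock4 expects.
template< class System >
struct numerical_jacobi
{
    System sys;
    double eps;

    numerical_jacobi( System s , double e = 1.0e-8 ) : sys( s ) , eps( e ) {}

    void operator()( const vector_type &x , matrix_type &J ,
                     const double &t , vector_type &dfdt )
    {
        const std::size_t n = x.size();
        vector_type f0( n ) , f1( n );
        vector_type xp = x;

        sys( x , f0 , t );                        // f(x, t)

        for( std::size_t j = 0 ; j < n ; ++j )    // dF/dx, column by column
        {
            const double h = eps * std::max( 1.0 , std::abs( x[ j ] ) );
            xp[ j ] = x[ j ] + h;
            sys( xp , f1 , t );
            for( std::size_t i = 0 ; i < n ; ++i )
                J( i , j ) = ( f1[ i ] - f0[ i ] ) / h;
            xp[ j ] = x[ j ];
        }

        sys( x , f1 , t + eps );                  // dF/dt, same recipe
        for( std::size_t i = 0 ; i < n ; ++i )
            dfdt[ i ] = ( f1[ i ] - f0[ i ] ) / eps;
    }
};

Such a wrapper can then be passed to the stepper together with the system functor, e.g. std::make_pair( my_system() , numerical_jacobi< my_system >( my_system() ) ), following the pattern of the stiff_system example (my_system standing in for your own functor). Keep in mind that finite differences trade the exactness of a symbolic or AD Jacobian for simplicity, which can matter for very stiff problems.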

Wrzlprmft
  • Yes, it will not be a constant matrix, but I expected that I could pass some variables inside, which my program would update. I need the Jacobian for an ODE solver; see for example here: [link](https://github.com/headmyshoulder/odeint-v2/blob/master/examples/stiff_system.cpp). Lines 47-52 contain the Jacobian structure definition. As I have a complicated set of ODEs and may change it later, I need automatic generation of this. What method would you recommend for this specific problem? I also edited the question to be more specific. – Karls Sep 16 '17 at 18:20
  • @Karls: *What method would you recommend for this specific problem?* – That’s really hard to say and strongly depends on where your priorities are (runtime, usability, etc.). If you want maximum efficiency, you need code-writing routines or meta programming. (If you’re not fixed on C++, you might want to take a look at [this Python module](https://github.com/neurophysik/jitcode) of mine.) – Wrzlprmft Sep 16 '17 at 18:46
  • I'm fixed on C++ this time, so thank you for your module, but I can't use it here. I'm focused on making it work first, as right now I don't have any functional solution. I need a Jacobian-generating algorithm for the `rosenbrock4` ODE solver. I think it has to be a ready-to-use solution like in MATLAB, rather than manually writing another solving method. – Karls Sep 16 '17 at 19:17