When I use Numba in Python, I know that if I try to JIT-compile functions whose loops contain arbitrary-precision floats (mpmath), compilation in nopython mode fails and the speed ends up the same as the plain Python version. My question is about the Julia package DifferentialEquations.jl. Its main page says it supports BigFloats and ArbFloats, and I understand that this package also uses loops that Julia JIT-compiles by default. So my question is whether DifferentialEquations.jl functions are JIT-compiled when I pass it differential equations that use BigFloat numbers.
asked by Giorgi Bakhtadze
1 Answer
Yes, they are, via function auto-specialization. In Julia, functions auto-specialize on the concrete types of their arguments when they are JIT-compiled. This is true for all numbers; in fact, even Float64 is just a type defined in Julia itself and goes through these same mechanisms. This blog post describes the pattern in more detail.
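As an illustration of what that specialization means in practice, here is a minimal sketch using the standard ODEProblem/solve API; the exponential-decay equation and the tolerances are just illustrative choices. Passing BigFloat values for the state and timespan makes the solver compile a version specialized to BigFloat:

```julia
using DifferentialEquations

# Exponential decay du/dt = -u, written in the out-of-place form f(u, p, t)
f(u, p, t) = -u

# BigFloat initial condition and timespan; every state the solver touches is a BigFloat
u0    = big"1.0"
tspan = (big"0.0", big"1.0")

prob = ODEProblem(f, u0, tspan)
sol  = solve(prob, Tsit5(), abstol = big"1e-30", reltol = big"1e-30")

typeof(sol.u[end])  # BigFloat: the integration ran in arbitrary precision throughout
```

The first call to solve triggers compilation of the solver code for this particular problem type, including the BigFloat state; later calls with the same types reuse that specialized, compiled code.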

answered by Chris Rackauckas
- Wasn't the question about numba in Python? – user48956 May 30 '18 at 16:13
- No. The question starts by mentioning that Numba/Python cannot optimize these kinds of functions (pretty much anything with non-Float64 arrays, so arbitrary-precision arrays), and then asks whether Julia and DifferentialEquations.jl can. The answer is yes: Julia and DifferentialEquations.jl do optimize functions like this as part of their standard function evaluation. So Numba really only appears in the question as an opening statement. – Chris Rackauckas May 30 '18 at 16:37