Automated Sparse Analytical Jacobians

In many cases with large stiff differential equations, a sparse Jacobian can be essential for performance. In this tutorial we show how to use modelingtoolkitize to regenerate an ODEProblem with an analytical sparse Jacobian, along with the sparsity pattern that DifferentialEquations.jl's solvers use to specialize the solving process.
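
As a preview, here is a minimal sketch of that workflow on a toy ODE (a hypothetical Lotka-Volterra setup, not part of this tutorial's Brusselator example), using the same ODEProblem(sys, Pair[], tspan, jac=true, sparse=true) call demonstrated below:

using DifferentialEquations, ModelingToolkit

function lotka!(du, u, p, t)
  du[1] =  p[1]*u[1] - p[2]*u[1]*u[2]   # prey
  du[2] = -p[3]*u[2] + p[4]*u[1]*u[2]   # predator
end
toyprob = ODEProblem(lotka!, [1.0, 1.0], (0.0, 10.0), [1.5, 1.0, 3.0, 1.0])
toysys  = modelingtoolkitize(toyprob)   # trace the numerical RHS into a symbolic system
toyjacprob = ODEProblem(toysys, Pair[], (0.0, 10.0), jac=true, sparse=true)
sol = solve(toyjacprob, Rodas5())       # a stiff solver will use the attached analytical Jacobian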

First, let's start with an implementation of the 2-dimensional Brusselator partial differential equation, discretized using finite differences:

using DifferentialEquations, ModelingToolkit

const N = 32
const xyd_brusselator = range(0,stop=1,length=N)
brusselator_f(x, y, t) = (((x-0.3)^2 + (y-0.6)^2) <= 0.1^2) * (t >= 1.1) * 5.  # localized forcing, switched on at t >= 1.1
limit(a, N) = a == N+1 ? 1 : a == 0 ? N : a  # periodic boundary conditions
function brusselator_2d_loop(du, u, p, t)
  A, B, alpha, dx = p
  alpha = alpha/dx^2
  @inbounds for I in CartesianIndices((N, N))
    i, j = Tuple(I)
    x, y = xyd_brusselator[I[1]], xyd_brusselator[I[2]]
    ip1, im1, jp1, jm1 = limit(i+1, N), limit(i-1, N), limit(j+1, N), limit(j-1, N)
    du[i,j,1] = alpha*(u[im1,j,1] + u[ip1,j,1] + u[i,jp1,1] + u[i,jm1,1] - 4u[i,j,1]) +
                B + u[i,j,1]^2*u[i,j,2] - (A + 1)*u[i,j,1] + brusselator_f(x, y, t)
    du[i,j,2] = alpha*(u[im1,j,2] + u[ip1,j,2] + u[i,jp1,2] + u[i,jm1,2] - 4u[i,j,2]) +
                A*u[i,j,1] - u[i,j,1]^2*u[i,j,2]
  end
end
p = (3.4, 1., 10., step(xyd_brusselator))  # A, B, alpha, dx

function init_brusselator_2d(xyd)
  N = length(xyd)
  u = zeros(N, N, 2)
  for I in CartesianIndices((N, N))
    x = xyd[I[1]]
    y = xyd[I[2]]
    u[I,1] = 22*(y*(1-y))^(3/2)
    u[I,2] = 27*(x*(1-x))^(3/2)
  end
  u
end
u0 = init_brusselator_2d(xyd_brusselator)
prob = ODEProblem(brusselator_2d_loop,u0,(0.,11.5),p)
ODEProblem with uType Array{Float64, 3} and tType Float64. In-place: true
timespan: (0.0, 11.5)
u0: 32×32×2 Array{Float64, 3}:
[:, :, 1] =
 0.0  0.121344  0.326197  0.568534  …  0.568534  0.326197  0.121344  0.0
 0.0  0.121344  0.326197  0.568534     0.568534  0.326197  0.121344  0.0
 0.0  0.121344  0.326197  0.568534     0.568534  0.326197  0.121344  0.0
 0.0  0.121344  0.326197  0.568534     0.568534  0.326197  0.121344  0.0
 0.0  0.121344  0.326197  0.568534     0.568534  0.326197  0.121344  0.0
 0.0  0.121344  0.326197  0.568534  …  0.568534  0.326197  0.121344  0.0
 0.0  0.121344  0.326197  0.568534     0.568534  0.326197  0.121344  0.0
 0.0  0.121344  0.326197  0.568534     0.568534  0.326197  0.121344  0.0
 0.0  0.121344  0.326197  0.568534     0.568534  0.326197  0.121344  0.0
 0.0  0.121344  0.326197  0.568534     0.568534  0.326197  0.121344  0.0
 ⋮                                  ⋱                      ⋮         
 0.0  0.121344  0.326197  0.568534     0.568534  0.326197  0.121344  0.0
 0.0  0.121344  0.326197  0.568534     0.568534  0.326197  0.121344  0.0
 0.0  0.121344  0.326197  0.568534  …  0.568534  0.326197  0.121344  0.0
 0.0  0.121344  0.326197  0.568534     0.568534  0.326197  0.121344  0.0
 0.0  0.121344  0.326197  0.568534     0.568534  0.326197  0.121344  0.0
 0.0  0.121344  0.326197  0.568534     0.568534  0.326197  0.121344  0.0
 0.0  0.121344  0.326197  0.568534     0.568534  0.326197  0.121344  0.0
 0.0  0.121344  0.326197  0.568534  …  0.568534  0.326197  0.121344  0.0
 0.0  0.121344  0.326197  0.568534     0.568534  0.326197  0.121344  0.0

[:, :, 2] =
 0.0       0.0       0.0       0.0       …  0.0       0.0       0.0
 0.148923  0.148923  0.148923  0.148923     0.148923  0.148923  0.148923
 0.400332  0.400332  0.400332  0.400332     0.400332  0.400332  0.400332
 0.697746  0.697746  0.697746  0.697746     0.697746  0.697746  0.697746
 1.01722   1.01722   1.01722   1.01722      1.01722   1.01722   1.01722
 1.34336   1.34336   1.34336   1.34336   …  1.34336   1.34336   1.34336
 1.66501   1.66501   1.66501   1.66501      1.66501   1.66501   1.66501
 1.97352   1.97352   1.97352   1.97352      1.97352   1.97352   1.97352
 2.26207   2.26207   2.26207   2.26207      2.26207   2.26207   2.26207
 2.52509   2.52509   2.52509   2.52509      2.52509   2.52509   2.52509
 ⋮                                       ⋱            ⋮         
 2.26207   2.26207   2.26207   2.26207      2.26207   2.26207   2.26207
 1.97352   1.97352   1.97352   1.97352      1.97352   1.97352   1.97352
 1.66501   1.66501   1.66501   1.66501   …  1.66501   1.66501   1.66501
 1.34336   1.34336   1.34336   1.34336      1.34336   1.34336   1.34336
 1.01722   1.01722   1.01722   1.01722      1.01722   1.01722   1.01722
 0.697746  0.697746  0.697746  0.697746     0.697746  0.697746  0.697746
 0.400332  0.400332  0.400332  0.400332     0.400332  0.400332  0.400332
 0.148923  0.148923  0.148923  0.148923  …  0.148923  0.148923  0.148923
 0.0       0.0       0.0       0.0          0.0       0.0       0.0

Now let's use modelingtoolkitize to generate the symbolic version:

sys = modelingtoolkitize(prob);
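
If you'd like to see what was traced, you can inspect the resulting symbolic system; for example (equations is a standard ModelingToolkit accessor, and the printed form depends on your version):

equations(sys)[1:2]   # the first two of the 32*32*2 = 2048 generated symbolic ODEs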

Now we regenerate the problem, passing jac=true to generate the analytical Jacobian and sparse=true to make it sparse:

sparseprob = ODEProblem(sys,Pair[],(0.,11.5),jac=true,sparse=true)
ODEProblem with uType Vector{Float64} and tType Float64. In-place: true
timespan: (0.0, 11.5)
u0: 2048-element Vector{Float64}:
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0
 ⋮
 2.2620667554742258
 1.9735248771761977
 1.665005111992191
 1.3433640166822103
 1.0172186542655526
 0.6977464117458191
 0.4003323380813969
 0.14892258453196738
 0.0
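
Before solving, you can inspect the sparsity structure that was attached to the problem (assuming SciMLBase's standard ODEFunction fields; the exact printing varies by version):

using SparseArrays
sparseprob.f.jac_prototype        # sparse matrix carrying the Jacobian's structure
nnz(sparseprob.f.jac_prototype)   # number of structural nonzeros out of 2048^2 entries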

Hard? No! How much did that help?

using BenchmarkTools
@btime solve(prob,save_everystep=false) # 51.714 s (7317 allocations: 70.12 MiB)
@btime solve(sparseprob,save_everystep=false) # 2.880 s (55533 allocations: 885.09 MiB)
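
Timings will of course differ across machines and versions. If you want to control the stiff solver yourself rather than rely on the default algorithm choice, you can pass one explicitly, e.g. TRBDF2 from DifferentialEquations.jl; it should pick up the attached analytical Jacobian and sparsity pattern automatically:

@btime solve(sparseprob, TRBDF2(), save_everystep=false)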

Notice, though, that generating the analytical Jacobian can be quite expensive, so in some cases we may only want the sparsity pattern. In that case, we can simply do:

sparsepatternprob = ODEProblem(sys,Pair[],(0.,11.5),sparse=true)
@btime solve(sparsepatternprob,save_everystep=false) # 2.880 s (55533 allocations: 885.09 MiB)
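
A quick way to confirm the difference between the two variants (assuming ODEFunction's standard jac and jac_prototype fields; behavior may vary across ModelingToolkit versions):

sparsepatternprob.f.jac === nothing   # no analytical Jacobian was generated here
sparsepatternprob.f.jac_prototype     # but the sparsity pattern is still attached, so the
                                      # solver computes a sparse Jacobian numerically/with AD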