# Performing Global Sensitivity Analysis

JuliaSimModelOptimizer helps users explore their models further through Global Sensitivity Analysis (GSA). This approach investigates how the model output reacts to variance in the model input. For example, we can analyse quantitatively how the variance in single input variables, and in combinations of several input variables, affects the output variables, as shown in the figure below:

1. Influence of the first input variable,
2. Influence of the second input variable,
3. Influence of the third input variable,
4. Influence of the second and third input variables.

The GlobalSensitivity package provides a large variety of GSA algorithms via `gsa(f, method)`. The function used to compute the sensitivity can be based on `simulate`, so that one can take advantage of already specified experiments. [1]

As an example, let's consider the Lotka-Volterra predator-prey model and perform GSA using the Morris method.

```julia
using JuliaSimModelOptimizer
using GlobalSensitivity
using Statistics
using OrdinaryDiffEq
using ModelingToolkit
import ModelingToolkit: D_nounits as D, t_nounits as t
using Plots

function lotka()
    @variables x(t)=3.1 y(t)=1.5
    @parameters α=1.3 β=0.9 γ=0.8 δ=1.8
    eqs = [
        D(x) ~ α * x - β * x * y,
        D(y) ~ -δ * y + γ * x * y,
    ]
    @named sys = ODESystem(eqs, t)
end

sys = complete(lotka())

tspan = (0.0, 10.0)

experiment = Experiment(nothing, sys;
    tspan,
    saveat = range(0, stop = 10, length = 200),
    alg = Tsit5())

prob = InverseProblem(experiment,
    [
        sys.α => (1.0, 5.0),
        sys.β => (1.0, 5.0),
        sys.γ => (1.0, 5.0),
        sys.δ => (1.0, 5.0),
    ])

# f maps a parameter vector to the output quantities of interest
function f(x)
    sol = simulate(experiment, prob, x)
    [mean(sol[:x]), maximum(sol[:y])]
end

lb = lowerbound(prob)
ub = upperbound(prob)

param_range = [[l, u] for (l, u) in zip(lb, ub)]
```

```julia
m = gsa(f, Morris(total_num_trajectory = 1000, num_trajectory = 150), param_range)

m.means
# 2×4 Matrix{Float64}:
#  0.0607734  -0.0169506  -0.402126   0.374763
#  0.627995   -2.42045     1.10438   -0.582792

m.variances
# 2×4 Matrix{Float64}:
#  0.132838   0.00402553  0.257159  0.158157
#  0.595193  15.9638      0.465608  0.19761
```

Row `i` corresponds to the `i`-th output of `f` and column `j` to the `j`-th parameter. (The method is stochastic, so results will vary between runs; the per-trajectory elementary effects also stored in the result are omitted here for brevity.)
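For intuition about what these statistics summarize: a Morris elementary effect is a one-at-a-time finite difference of the model output along a single input. A minimal sketch of that quantity (a toy illustration with a hypothetical linear function `f_toy`, not GlobalSensitivity's internals):

```julia
# Toy quantity behind the Morris method: an "elementary effect" is the
# scaled change in the output when only input i is perturbed by Δ.
f_toy(x) = 2.0 * x[1] + 3.0 * x[2]  # hypothetical linear test function

function elementary_effect(f, x, i, Δ)
    xp = copy(x)
    xp[i] += Δ              # perturb only input i
    (f(xp) - f(x)) / Δ      # finite-difference sensitivity
end

elementary_effect(f_toy, [0.5, 0.5], 1, 0.1)  # ≈ 2.0, sensitivity to x[1]
elementary_effect(f_toy, [0.5, 0.5], 2, 0.1)  # ≈ 3.0, sensitivity to x[2]
```

Morris aggregates many such effects over random trajectories through the input space: a large mean flags an influential input, while a large variance flags nonlinearity or interactions with other inputs.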

We can plot the results for the first output (the mean of `x`) using

```julia
scatter(m.means[1, :], m.variances[1, :], series_annotations = [:α, :β, :γ, :δ], color = :gray)
```

and for the second output (the maximum of `y`) using

```julia
scatter(m.means[2, :], m.variances[2, :], series_annotations = [:α, :β, :γ, :δ], color = :gray)
```

For the Sobol method, we can similarly do:

```julia
m = gsa(f, Sobol(), param_range, samples = 1000)

m.S1   # first-order indices
# 2×4 Matrix{Float64}:
#  -0.000838943  -8.75667e-5  0.545348  0.354223
#   0.0640786     0.500445    0.235664  0.0656587

m.ST   # total-order indices
# 2×4 Matrix{Float64}:
#  0.00626573  0.00343128  0.631586  0.442661
#  0.1308      0.619485    0.286012  0.0945776
```
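The first-order Sobol index of input `i` is the fraction of output variance explained by that input alone. As a sketch of the idea (a toy Saltelli-style Monte Carlo estimator with a hypothetical test function `g`, not how GlobalSensitivity computes it):

```julia
using Statistics, Random

Random.seed!(1)
g(x) = x[1] + 0.5 * x[2]        # hypothetical test function, inputs ~ U(0, 1)

n = 200_000
A, B = rand(n, 2), rand(n, 2)   # two independent sample matrices
yA = map(i -> g(A[i, :]), 1:n)
yB = map(i -> g(B[i, :]), 1:n)

# First-order index S_i: substitute column i of B with column i of A,
# then estimate Cov(y_B, y_ABi - y_A) / Var(y).
function first_order(i)
    ABi = copy(B)
    ABi[:, i] = A[:, i]
    yABi = map(j -> g(ABi[j, :]), 1:n)
    mean(yB .* (yABi .- yA)) / var(yA)
end

first_order(1), first_order(2)  # ≈ (0.8, 0.2) since Var(y) = (1 + 0.25)/12
```

For this additive function the first-order indices sum to one; interactions between inputs would show up as total-order indices exceeding the first-order ones.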

See the [GlobalSensitivity documentation](https://docs.sciml.ai/GlobalSensitivity/stable/) for more details.

[1]: Derived from https://docs.sciml.ai/GlobalSensitivity/stable/tutorials/parallelized_gsa/ in https://github.com/SciML/SciMLDocs, MIT licensed; see the repository for details.