Performing Global Sensitivity Analysis
JuliaSimModelOptimizer supports additional model exploration based on Global Sensitivity Analysis (GSA). This approach helps to further investigate how the model output reacts to variance in the model input. For example, we can quantify how much of the variance of an output variable is explained by the variance of each single input variable and by combinations of input variables, as shown in the figure below (a sketch of the underlying variance decomposition follows this list):
- influence of the first input variable,
- influence of the second input variable,
- influence of the third input variable,
- joint influence of the second and third input variables.
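For reference (standard background, not part of the original text), this decomposition is usually written in Sobol form: the output variance is split into contributions of single inputs and of input combinations, and each contribution is normalized into a sensitivity index,

Var(Y) = Σᵢ Vᵢ + Σᵢ<ⱼ Vᵢⱼ + …,    Sᵢ = Vᵢ / Var(Y),    Sᵢⱼ = Vᵢⱼ / Var(Y),

where Vᵢ = Var(E[Y | Xᵢ]) is the variance explained by input Xᵢ alone and Vᵢⱼ is the additional variance explained jointly by Xᵢ and Xⱼ.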
The GlobalSensitivity package provides a large variety of algorithms via gsa(f, method). The function that is used to compute the sensitivity can be based on simulate, so that one can take advantage of already specified experiments. [1]
For example, let's consider the Lotka-Volterra predator-prey model and perform GSA using the Morris method.
using JuliaSimModelOptimizer
using GlobalSensitivity
using Statistics
using OrdinaryDiffEq
using ModelingToolkit
import ModelingToolkit: D_nounits as D, t_nounits as t
using Plots
# Lotka-Volterra predator-prey model
function lotka()
    @variables x(t)=3.1 y(t)=1.5
    @parameters α=1.3 β=0.9 γ=0.8 δ=1.8
    eqs = [
        D(x) ~ α * x - β * x * y,
        D(y) ~ -δ * y + γ * x * y,
    ]
    @named sys = ODESystem(eqs, t)
end
sys = complete(lotka())
tspan = (0.0, 10.0)
# Experiment with no data attached (nothing); it specifies how to simulate the model.
experiment = Experiment(nothing, sys;
    tspan,
    saveat = range(0, stop = 10, length = 200),
    alg = Tsit5())
# Inverse problem defining the search space: lower and upper bounds for each parameter.
prob = InverseProblem(experiment,
    [
        sys.α => (1.0, 5.0),
        sys.β => (1.0, 5.0),
        sys.γ => (1.0, 5.0),
        sys.δ => (1.0, 5.0),
    ])
# GSA objective: simulate the experiment at the parameter values x and
# return summary statistics of the solution (mean of x, maximum of y).
function f(x)
    sol = simulate(experiment, prob, x)
    [mean(sol[:x]), maximum(sol[:y])]
end
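Before running a full analysis, it can be useful to evaluate the objective once to confirm that a single simulation works. The parameter values below are purely illustrative points inside the (1.0, 5.0) bounds and are not part of the original tutorial:

f([1.3, 1.2, 1.1, 2.0])  # hypothetical spot check; returns [mean of x, maximum of y]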
# Parameter ranges taken from the search space of the inverse problem.
lb = lowerbound(prob)
ub = upperbound(prob)
param_range = [[l, u] for (l, u) in zip(lb, ub)]

# Morris screening method
m = gsa(f, Morris(total_num_trajectory = 1000, num_trajectory = 150), param_range)
GlobalSensitivity.MorrisResult{Matrix{Float64}, Vector{Any}}(
    # means (rows: the two outputs of f, columns: α, β, γ, δ)
    [0.047152903458654756 -0.020636082233168998 -0.42926488377613725 0.3353046058385953;
     0.5525995719236889 -1.7724960880345193 1.0887859290271187 -0.5995085873036511],
    # means_star
    [0.2197336964632243 0.03861491755306512 0.42926488377613725 0.33532139190326615;
     0.5636480644094775 1.7746285422616441 1.1113639664941108 0.6268428014736961],
    # variances
    [0.17832844200513742 0.00417213331966804 0.25499046460761626 0.11388811022063318;
     0.4903565459253122 8.556021691838858 0.5890221054814194 0.25369002893744463],
    # elementary effects for each trajectory (lengthy output truncated)
    Any[…])
We can plot the results for the two outputs of f using

scatter(m.means[1, :], m.variances[1, :], series_annotations = [:α, :β, :γ, :δ], color = :gray)

and

scatter(m.means[2, :], m.variances[2, :], series_annotations = [:α, :β, :γ, :δ], color = :gray)

In the Morris method, a parameter with a large mean (in absolute value) has a strong overall effect on the output, while a large variance indicates nonlinear effects or interactions with other parameters.
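The two scatter plots can also be combined into a single figure. The snippet below is a minimal sketch using the standard Plots.jl layout mechanism; the axis labels and panel titles are additions for illustration, not from the original example:

# Combine both Morris scatter plots into one figure (sketch; uses the same m as above).
p1 = scatter(m.means[1, :], m.variances[1, :], series_annotations = [:α, :β, :γ, :δ],
    color = :gray, xlabel = "mean", ylabel = "variance", title = "mean(x)", legend = false)
p2 = scatter(m.means[2, :], m.variances[2, :], series_annotations = [:α, :β, :γ, :δ],
    color = :gray, xlabel = "mean", ylabel = "variance", title = "maximum(y)", legend = false)
plot(p1, p2, layout = (1, 2))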
For the Sobol method, we can similarly do:
m = gsa(f, Sobol(), param_range, samples=1000)
GlobalSensitivity.SobolResult{Matrix{Float64}, Nothing, Nothing, Nothing}(
    # S1: first-order indices (rows: the two outputs of f, columns: α, β, γ, δ)
    [-0.0008389581111137501 -8.752702803115279e-5 0.5453489859004284 0.3542225683623781;
     0.06407841546888673 0.5004449333190693 0.2356628015046419 0.06565860013591748],
    nothing, nothing, nothing,
    # ST: total-order indices
    [0.00626573104376282 0.0034312794317180716 0.6315874448691761 0.4426611371958135;
     0.13079977803485324 0.6194845380453204 0.2860119585836818 0.09457766010315302],
    nothing)
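As a further sketch that is not part of the original tutorial, the first-order indices in m.S1 and the total-order indices in m.ST (one row per output of f, one column per parameter) can be compared with bar plots:

# Bar plots of the Sobol indices (sketch; assumes the m returned by the Sobol call above).
params = ["α", "β", "γ", "δ"]
p1 = bar(params, m.S1[1, :], title = "First order, mean(x)", legend = false)
p2 = bar(params, m.ST[1, :], title = "Total order, mean(x)", legend = false)
p3 = bar(params, m.S1[2, :], title = "First order, maximum(y)", legend = false)
p4 = bar(params, m.ST[2, :], title = "Total order, maximum(y)", legend = false)
plot(p1, p2, p3, p4, layout = (2, 2))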
See the documentation for GlobalSensitivity for more details.
- [1]: Derived from https://docs.sciml.ai/GlobalSensitivity/stable/tutorials/parallelized_gsa/ in https://github.com/SciML/SciMLDocs, MIT licensed; see the repository for details.