# Surrogatize

JuliaSim Surrogates provides a convenient wrapper function to *surrogatize* models. This wrapper allows users to apply various models to their `ExperimentData`. When working with noisy data, a regularization coefficient can be provided as a keyword argument to combat overfitting.

When calling `surrogatize` on a model, the result is a callable surrogate object. This new surrogate object can be used to make inferences on new data. The convention for calling a surrogate object is similar to `solve`; it requires initial conditions, parameter values, and a timespan. See a simplified example below.

```julia
# define the model
elm = ELM(2, 10)
# create the surrogate with regularization
surr = surrogatize(experiment_dat, elm; reg = 1e-5)
# call the surrogate
surr(initial_conditions, parameter_values, timespan)
```

`Surrogatize.surrogatize` — Function

```julia
surrogatize(ed::ExperimentData, model; verbose = false, reg = 0.)
```

Construct a surrogate for a given dataset using a model.

The dataset must be of type `ExperimentData`, and the model is an instance of a surrogate model, such as `CTESN`, `ELM`, or `AugmentedELM`.

The `surrogatize` function call returns a surrogate object that may be used to infer on incoming data. The API to infer using the trained surrogate is of the form

`surr(u0, p, ts)`

Here, `u0` is a vector representing the initial conditions, `p` is a vector of parameters, and `ts` represents the time steps at which to evaluate the surrogate. If `ts` is set to `nothing`, the function evaluation returns the value using the natural time steps the solver chooses.
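For instance, assuming a trained surrogate `surr` (such as the one constructed in the example above) and illustrative values for `u0` and `p`, both calling conventions might look like the following sketch:

```julia
# hypothetical initial conditions and parameters for a 2-state system
u0 = [1.0, 0.0]
p = [0.5, 1.2]

# evaluate the surrogate at 100 evenly spaced time steps
ts = range(0.0, 10.0; length = 100)
sol = surr(u0, p, ts)

# let the underlying solver choose its natural time steps
sol_adaptive = surr(u0, p, nothing)
```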

Keyword Arguments:

- `verbose`: Boolean to enable printing of additional logging information about the progress in training the surrogate model.
- `reg`: Sets the coefficient for L2 regularisation. `0.` implies no regularisation is performed.

NOTE: The surrogates are only trained for autonomous dynamical models; support for forced dynamical models is coming soon.

## Models

Various models can be used to create surrogates, and which one is best depends on the characteristics of the problem. The `Surrogatize` module aims to make applying any of the available models easy, so that users can determine which one best suits their particular problem.

### CTESN

The Continuous-Time Echo State Network (CTESN) was designed by Julia Computing in order to accelerate stiff, nonlinear systems^{[ctesn]}.

`Surrogatize.CTESN` — Type

```julia
CTESN(
    RSIZE;
    weight_initializer,
    input_weight_initializer,
    initial_condition_initializer,
    alpha,
    tau,
    driver_sol,
    activation,
    solver,
    solver_kwargs
) -> CTESN
```

Constructs an instance of a CTESN object.

It operates using the formula:

`((alpha - 1) * u) + σ.(W(u) + Win(sols(t)))`

`RSIZE` is an integer selecting the size of the reservoir used while training.

NOTE: The default is to produce a **linear projection** CTESN.

**Keyword Arguments**

- `weight_initializer`: function that takes in 2 arguments and generates the `W` matrix.
- `input_weight_initializer`: function that takes in 2 arguments and generates the `Win` matrix.
- `initial_condition_initializer`: function that generates the fixed initial condition vector used by the CTESN.
- `alpha`: value determining the extent to which example solutions are embedded into the dynamics of the reservoir. Also referred to as the *leakage rate*.
  - A value of `1.0` results in a strong memory of the driving solution/signal.
  - A value of `0.0` results in a weak memory of the driving solution/signal.
- `tau`: effective time scale of the dynamical system.
- `driver_sol`: NamedTuple of arguments used to generate the driver solution to train the CTESN.
  - All the keys as in the default configuration *must* be present.
  - The keys are `(:lb, :ub, :count, :order, :idxs)`.
- `activation`: nonlinearity shown as `σ` in the formula. **Note** that it operates in an elementwise fashion.
- `solver`, `solver_kwargs`: solver used to evolve the CTESN, along with any keyword arguments.
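As an illustrative sketch of combining these pieces (the dataset `experiment_data`, reservoir size, and leakage rate below are assumptions chosen for illustration, not recommended values):

```julia
# a CTESN with a reservoir of 1000 nodes and a moderate leakage rate
ctesn = CTESN(1000; alpha = 0.5, activation = tanh)

# train it on an existing dataset, as with any other surrogate model
surr = surrogatize(experiment_data, ctesn; verbose = true)
```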

### ELM

The Extreme Learning Machine (ELM) is able to achieve good generalization performance and learn much faster than many traditional network architectures.

`Surrogatize.ELM` — Type

```julia
ELM([tanh], in, hidden_size;
    initW = randn,
    initb = randn,
    radial_function = linearRadial(),
    solver = Rosenbrock23(),
    solver_kwargs = (;))
```

Constructs an instance of the Extreme Learning Machine (ELM), where `in` is an integer representing the input dimension and `hidden_size` is an integer representing the size of the hidden features.

Optionally, one can provide the activation function as the first argument. Defaults to `tanh`.

By default, the ELM model only operates on states and discards the controls and parameters. `AugmentedELM` can be used for training on the controls and parameters in addition to the states.

**Keyword Arguments**

- `initW`: function used to generate the weights.
- `initb`: function used to generate the bias.
- `radial_function`: the radial distance metric used to generate the `RadialBasis` hypernetwork.
- `solver`: the solver chosen to evolve the trained surrogate.
- `solver_kwargs`: keyword arguments to be set alongside the chosen solver.
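A minimal sketch of constructing and training an ELM, assuming a hypothetical 3-state dataset `experiment_data` (the dimensions and initializer below are illustrative):

```julia
# an ELM mapping 3 input states through 64 hidden features,
# with a custom small-scale weight initializer
elm = ELM(tanh, 3, 64; initW = (dims...) -> 0.1 .* randn(dims...))

# train with a small regularization coefficient to guard against overfitting
surr = surrogatize(experiment_data, elm; reg = 1e-6)
```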

`Surrogatize.AugmentedELM` — Type

```julia
AugmentedELM([tanh], in, hidden_size;
    initW = randn,
    initb = randn,
    solver = Rosenbrock23(),
    solver_kwargs = (;))
```

Constructs an instance of the augmented Extreme Learning Machine, where `in` is an integer representing the input dimension and `hidden_size` is an integer representing the size of the hidden dimension.

Optionally, one can provide the activation function as the first argument. Defaults to `tanh`.

It differs from the standard ELM by learning on the states as well as the parameters of a simulation.

NOTE: Since the `AugmentedELM` doesn't require a `RadialBasis` hypernetwork, it doesn't accept the `radial_function` keyword argument.

**Keyword Arguments**

- `initW`: function used to generate the weights.
- `initb`: function used to generate the bias.
- `solver`: the solver chosen to evolve the trained surrogate.
- `solver_kwargs`: keyword arguments to be set alongside the chosen solver.
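As a sketch of the parameter-aware workflow (again assuming an illustrative dataset `experiment_data` and hypothetical `u0` and `p` values):

```julia
# an AugmentedELM that also learns parameter dependence
aelm = AugmentedELM(3, 128; initb = zeros)

surr = surrogatize(experiment_data, aelm)

# evaluate at the solver's natural time steps
sol = surr(u0, p, nothing)
```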

## Initializers

Initialization can have a major impact on the training and performance of a surrogate model^{[init]}. For this reason, JuliaSim Surrogates provides various initializers to cover a broad range of use cases.

The available initializers include:

- `random_walk_initializer`
- `erdosrenyi_initializer`
- `asym_sprand_initializer`
- `qr_initializer`
- `negeigvals_initializer`