# Surrogate model for [parameter vector] to [time series]

## Question:

Say I have a function **F** that takes a parameter vector **P** (say, a 5-element vector) and produces a numerical time series **Y[t]** of length *T* (e.g. *T*=100, so *t*=1,…,100). The function could be complicated (e.g. an enzyme reaction model).

**I want to make a neural network that predicts the output (Y[t]) that would result from feeding a new parameter set (P’) into the function. How can this be done?**

A simple feed-forward network can work, but it requires a very large number of output nodes and does not take into account the temporal correlations between points. Would it be possible, or better, to use an RNN or a Transformer instead?
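For reference, the feed-forward baseline mentioned above might look like the following (a minimal sketch; the hidden-layer sizes and activations are illustrative assumptions, not a tuned design):

```python
import tensorflow as tf

param_length = 5   # size of the parameter vector P
time_length = 100  # length of the output series Y[t]

# Plain feed-forward surrogate: one output node per time point,
# so the network must learn every time step independently.
ff_model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=[param_length]),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(time_length),  # all T outputs at once
])
ff_model.compile(loss="mse", optimizer="adam")
```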

## Answers:

An RNN might work for you. Here is some example code in Keras to get you started (it assumes `train_x`, `train_y`, `val_x`, and `val_y` already hold your parameter vectors and the corresponding time series):

```
import tensorflow as tf

param_length = 5
time_length = 100
hidden_size = 20

model = tf.keras.Sequential([
    # Encode the input parameters into a hidden state.
    tf.keras.layers.Dense(hidden_size, input_shape=[param_length]),
    # Repeat the hidden state once per time step to form a sequence.
    tf.keras.layers.RepeatVector(time_length),
    # Unroll the sequence through the LSTM, keeping every time step.
    tf.keras.layers.LSTM(32, return_sequences=True),
    # Map each time step's LSTM output to a single value Y[t].
    tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(1))
])
model.compile(loss="mse", optimizer="nadam")

# train_x: (N, param_length), train_y: (N, time_length, 1)
model.fit(train_x, train_y, validation_data=(val_x, val_y), epochs=10)
```

The first Dense layer converts the input parameters into a hidden state, which `RepeatVector` copies across the time axis; the LSTM units then generate the time sequence. You will need to experiment with hyperparameters such as the number of Dense and LSTM layers, the size of the hidden layers, etc.
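To build the training set, you can sample parameter vectors, run the original function **F** on each, and stack the results. A sketch, where the uniform sampling ranges and the toy damped-oscillation `F` are placeholder assumptions (substitute your actual model and parameter ranges):

```python
import numpy as np

def F(p):
    # Placeholder for the real model: a damped oscillation whose
    # amplitude, decay, frequency, phase, and offset come from p.
    t = np.linspace(0, 1, 100)
    return p[0] * np.exp(-p[1] * t) * np.sin(p[2] * t + p[3]) + p[4]

rng = np.random.default_rng(0)
n_train, n_val = 1000, 200

# Sample parameter vectors uniformly in [0, 1); adjust to your ranges.
params = rng.uniform(size=(n_train + n_val, 5))
series = np.stack([F(p) for p in params])   # (N, 100)
series = series[..., np.newaxis]            # (N, 100, 1) for the LSTM

train_x, val_x = params[:n_train], params[n_train:]
train_y, val_y = series[:n_train], series[n_train:]
```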

One more thing you can try is a different loss function, such as the Huber loss, combined with early stopping so that long training runs halt once the validation error stops improving:

```
# Huber loss is less sensitive to outliers than MSE.
model.compile(loss=tf.keras.losses.Huber(), optimizer="nadam", metrics=["mae"])

# Stop when validation MAE has not improved for 50 epochs,
# and roll back to the best weights seen so far.
early_stopping_cb = tf.keras.callbacks.EarlyStopping(
    monitor="val_mae", patience=50, restore_best_weights=True)

history = model.fit(train_x, train_y, validation_data=(val_x, val_y),
                    epochs=500, callbacks=[early_stopping_cb])
```
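Once trained, the surrogate is queried by passing a batch of new parameter vectors to `model.predict`. A sketch, where `new_P` is a hypothetical 5-element parameter set and the model is rebuilt untrained only so the snippet is self-contained:

```python
import numpy as np
import tensorflow as tf

# Stand-in for the trained network from above (same architecture).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(20, input_shape=[5]),
    tf.keras.layers.RepeatVector(100),
    tf.keras.layers.LSTM(32, return_sequences=True),
    tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(1)),
])

new_P = np.array([[0.1, 0.5, 2.0, 0.0, 1.0]])  # shape (1, 5): one parameter set
Y_pred = model.predict(new_P)                   # shape (1, 100, 1)
Y_series = Y_pred[0, :, 0]                      # the predicted Y[t], length 100
```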