Surrogate model for [parameter vector] to [time series]


Say I have a function F that takes a parameter vector P (say, a 5-element vector) and produces a numerical time series Y[t] of length T (e.g. T=100, so t=1,…,100). The function itself can be complicated (e.g. an enzyme reaction model).

I want to train a neural network that predicts the output Y[t] that would result from feeding a new parameter set P' into the function. How can this be done?

A simple feed-forward network can work, but it requires a very large number of output nodes and doesn't account for the temporal correlations between points. Would it be possible/better to use an RNN or a Transformer instead?
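Whichever architecture is used, the first step is building a training set of (P, Y[t]) pairs by running F on many sampled parameter vectors. A minimal sketch, where F here is a hypothetical toy simulator (a damped oscillation) standing in for the real model:

```python
import numpy as np

rng = np.random.default_rng(0)

def F(P, T=100):
    """Toy stand-in for the real simulator: a damped oscillation whose
    amplitude, decay, frequency, phase and offset come from P.
    (Hypothetical -- replace with the actual enzyme reaction model.)"""
    a, k, w, phi, c = P
    t = np.arange(T)
    return a * np.exp(-k * t) * np.sin(w * t + phi) + c

# Sample parameter vectors and run the simulator once per sample.
n_samples = 1000
train_x = rng.uniform(0.0, 1.0, size=(n_samples, 5))
train_y = np.stack([F(p) for p in train_x])  # shape (n_samples, 100)
```

The same sampling procedure (with held-out parameter vectors) produces the validation set.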

Asked By: Mich55



An RNN might work for you. Here is some example code in Keras to get you started:

import tensorflow as tf

param_length = 5
time_length = 100
hidden_size = 20

model = tf.keras.Sequential([
    # Encode input parameters into a hidden state.
    tf.keras.layers.Dense(hidden_size, input_shape=[param_length]),

    # Repeat the hidden state once per time step, so the LSTM
    # receives the 3-D (batch, time, features) input it expects.
    tf.keras.layers.RepeatVector(time_length),

    # Generate a sequence.
    tf.keras.layers.LSTM(32, return_sequences=True),

    # Project each time step down to a single output value.
    tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(1)),
])

model.compile(loss="mse", optimizer="nadam")
history = model.fit(train_x, train_y, validation_data=(val_x, val_y), epochs=10)

The first Dense layer converts the input parameters to a hidden state, and the LSTM units then generate the time sequence from it. You will need to experiment with hyperparameters such as the number of Dense and LSTM layers, the size of the hidden layers, etc.
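Shape bookkeeping is the usual stumbling block with this setup: a Dense layer emits a 2-D (batch, hidden) tensor, while an LSTM expects 3-D (batch, time, features) input, so some layer has to tile the hidden state across time steps (a RepeatVector is used in this sketch; other bridges are possible). A quick sanity check on an untrained model:

```python
import numpy as np
import tensorflow as tf

param_length, time_length, hidden_size = 5, 100, 20

model = tf.keras.Sequential([
    tf.keras.layers.Dense(hidden_size, input_shape=[param_length]),
    tf.keras.layers.RepeatVector(time_length),        # -> (batch, 100, 20)
    tf.keras.layers.LSTM(32, return_sequences=True),  # -> (batch, 100, 32)
    tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(1)),
])

# One fake parameter vector in, one length-100 series out.
out = model.predict(np.zeros((1, param_length)))
print(out.shape)  # (1, 100, 1)
```

Running this before training catches dimension mismatches early, without waiting for a fit epoch to fail.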

One more thing you can try is a different loss function, such as the Huber loss, combined with early stopping:

  early_stopping_cb = tf.keras.callbacks.EarlyStopping(
      monitor="val_mae", patience=50, restore_best_weights=True)
  model.compile(loss=tf.keras.losses.Huber(), optimizer="nadam", metrics=["mae"])
  history = model.fit(train_x, train_y, validation_data=(val_x, val_y), epochs=500,
                      callbacks=[early_stopping_cb])
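The appeal of the Huber loss is that it is quadratic for small residuals but only linear beyond a threshold delta, so a few outlier time points do not dominate training the way they do under MSE. A small NumPy illustration, using delta=1.0 (the Keras default):

```python
import numpy as np

def huber(err, delta=1.0):
    """Per-element Huber loss: quadratic for |err| <= delta, linear beyond."""
    small = np.abs(err) <= delta
    return np.where(small,
                    0.5 * err ** 2,
                    delta * (np.abs(err) - 0.5 * delta))

errors = np.array([0.1, 0.5, 5.0])
mse_terms = 0.5 * errors ** 2   # the outlier contributes 12.5
huber_terms = huber(errors)     # the outlier contributes only 4.5
```

For small errors the two losses agree; only the outlier's contribution is damped.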
Answered By: Andreas Kaufmann