Keras: how to change random number every epoch?

Question:

I am generating a random number in my model using

rand_int = tf.random.uniform((), 0, 2, dtype=tf.int32)

However, the random number does not change every epoch. How would I do this every epoch or even every batch if that is easier?

Edit:

Here is some more information on what I would like to do with the random number.

def random_func(X):

    if rand_int == 0:
        # Do something X
    if rand_int == 1:
        # Do something else to X

    return X

X = random_func(X)

Every epoch I would like to change X randomly, and so I want a different random number every epoch.

Asked By: PiccolMan


Answers:

You can use callbacks to call a function at the end of each epoch (or batch), which generates a new random number each time. Read more about callbacks and the options they provide here.

You can declare xx as a global variable inside the callback function.

  1. rand_int is generated at the end of each epoch.
  2. Some conditions are run on rand_int, and xx is updated accordingly.
  3. xx is a global variable, so it is updated at the end of every epoch.
  4. The final value of xx is available after training.

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model, callbacks

xx = 0

class CustomCallback(callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        rand_int = tf.random.uniform((1,), 0, 1)
        global xx
        if rand_int < 0.5:
            xx = -4999
        else:
            xx = 4999
        print(rand_int.numpy()[0], xx)

X, y = np.random.random((10,5)), np.random.random((10,))

inp = layers.Input((5,))
x = layers.Dense(3)(inp)
x = layers.Dense(3)(x)
out = layers.Dense(1)(x)

model = Model(inp, out)

model.compile(loss='MAE', optimizer='adam')
model.fit(X,y,callbacks=[CustomCallback()], epochs=3, verbose=1)

print('')
print('random function output, final state:',xx)

Epoch 1/3
1/1 [==============================] - 0s 248ms/step - loss: 1.6208
0.53797233 4999
Epoch 2/3
1/1 [==============================] - 0s 5ms/step - loss: 1.6057
0.64474905 4999
Epoch 3/3
1/1 [==============================] - 0s 3ms/step - loss: 1.5907
0.05995667 -4999

random function output, final state: -4999

As you can see, rand_int causes xx to change value each epoch based on the conditions in the callback, and the final state of xx is available after training as well.

Answered By: Akshay Sehgal

So, it appears that this is not actually the expected behavior of tf.random.uniform. It should in fact generate a new random number each time. See this issue on tensorflow github: https://github.com/tensorflow/tensorflow/issues/36715

While some of the responses there recommend switching to the newer tf.random.Generator object for random number generation, in my case that did not fix it. The global variable approach was also not viable for my case.
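For reference, the Generator approach suggested in that issue looks roughly like this (a minimal sketch; the function name `draw` is illustrative). A stateful `tf.random.Generator` advances its internal state on every draw, even inside a `tf.function`, which is why the issue responses recommend it:

```python
import tensorflow as tf

# Stateful generator: each call to gen.uniform advances the generator's
# internal state, so repeated calls yield fresh values.
gen = tf.random.Generator.from_seed(42)

@tf.function
def draw():
    # Re-executed on every call, unlike a tf.random.uniform op that gets
    # baked into a graph at model-build time.
    return gen.uniform((), 0, 2, dtype=tf.int32)
```

Calling `draw()` repeatedly should produce a mix of 0s and 1s rather than one frozen value.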

What did fix it for me was encapsulating the code containing the random number generation within the call method of a custom Layer, as mentioned in this comment: https://github.com/tensorflow/tensorflow/issues/36715#issuecomment-586349200

Note that this will result in a new random number every batch, not every epoch, but based on your question it seems like that might be sufficient, and it does not require global variables, callbacks, etc. In any case, I thought others might benefit.
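A minimal sketch of the custom-Layer approach (the layer name and the two branch transformations are made up for illustration; substitute your own "do something to X" logic). Because the random op lives inside call(), it is re-executed on every forward pass:

```python
import tensorflow as tf
from tensorflow.keras import layers

class RandomBranch(layers.Layer):
    """Hypothetical layer: draws a fresh random integer on every call."""

    def call(self, x):
        # Drawn inside call(), so a new value is generated each batch.
        rand_int = tf.random.uniform((), 0, 2, dtype=tf.int32)
        # tf.cond selects a branch based on the freshly drawn number.
        return tf.cond(tf.equal(rand_int, 0),
                       lambda: -x,        # placeholder "do something to X"
                       lambda: x + 1.0)   # placeholder "do something else"
```

The layer can then be dropped into a functional model like any other layer, e.g. `x = RandomBranch()(x)`.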

Answered By: Hunter