Cannot generate random ints inside TensorFlow 2.0 "call" function

Question:

I am trying to implement a custom dropout layer. In this dropout layer I want to generate a random number and use it to turn that output on or off. The idea is simple, and I thought the implementation would be too.

I have tried using the regular Python ‘random’ function, but this just creates the same output over and over. I've tried using tf.random.uniform(), but this causes an error when trying to index into the NumPy array:

NotImplementedError: Cannot convert a symbolic tf.Tensor (add:0) to a numpy array. This error may indicate that you’re trying to pass a Tensor to a NumPy call, which is not supported.

I've tried extracting the value out of the tensor so that I could index into the array, but casting to int does nothing; its type stays tensor. However, when I run the code in a Jupyter notebook rather than inside an actual layer, it works fine and does what I want.
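The eager-versus-graph difference described above can be reproduced in isolation (a minimal sketch, not part of my layer code): in eager mode a random tensor has a concrete value and can index a NumPy array, but inside a tf.function it is symbolic and the conversion fails.

```python
import numpy as np
import tensorflow as tf

arr = np.zeros(5, dtype='float32')

# Eager mode: the tensor has a concrete value, so indexing works.
idx = tf.random.uniform(shape=(), minval=0, maxval=5, dtype='int32')
arr[int(idx)] = 1.0

# Graph mode: during tracing the tensor is symbolic, so NumPy cannot
# convert it to a concrete index and raises the error quoted above.
@tf.function
def index_in_graph():
    i = tf.random.uniform(shape=(), minval=0, maxval=5, dtype='int32')
    return arr[i]

try:
    index_in_graph()
except Exception as e:
    print(type(e).__name__)  # the symbolic-tensor conversion error
```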

def call(self, inputs, *args, **kwargs):
    if isinstance(self.level_size, numbers.Real) and self.level_size == 0:
        return tf.identity(inputs)

    return tf.math.multiply(tf.identity(inputs), self.create_mask(inputs[0]))

@tf.function    # <== I don't know what this does. I've tried adding and removing it
def create_mask(self, input_tensor):
    self.seed += 1
    self.rand.seed(self.seed)          #<== Regular python random
    arr = np.full(input_tensor.get_shape().as_list()[0], 0, dtype='float32')
    for i in range(0, len(arr), self.level_size):
        temp = self.rand.randint(0, self.level_size - 1)
        # temp = tf.random.uniform(shape=(), minval=0, maxval=self.level_size, dtype='int32')
        arr[i + temp] = self.scalar if not self.use_all_nodes else 1
        # arr[i + int(temp)] = self.scalar if not self.use_all_nodes else 1
    tf.print(arr)
    arr = arr[None, :]
    return tf.convert_to_tensor(arr)
Asked By: tensorDam


Answers:

Here is my solution; it is just one option. There is no dedicated function in TensorFlow for creating random integers, so I used tf.random.categorical. I did not test the speed; I just wanted to get a working solution.
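As a standalone illustration of that trick (a sketch, separate from the class below): with uniform logits, tf.random.categorical draws each sample uniformly from [0, step_size), which amounts to a batch of random integers.

```python
import tensorflow as tf

step_size = 5
# Uniform logits: all step_size classes are equally likely.
pseudo_logits = tf.ones(shape=(1, step_size))

# Draw 4 integers, each uniform in [0, step_size); result shape is (1, 4).
ids = tf.random.categorical(pseudo_logits, 4, dtype=tf.int32)
print(ids.numpy())
```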

NOTE: my input tensor is 2D with a shape of (num_samples, num_features). For your application you might need a tensor of shape (batch_size, num_samples, num_features), but you should be able to adapt the code accordingly.

Here is my code:

import tensorflow as tf


class SliceMasker:
    def __init__(self, num_samples, num_features, step_size):
        # Uniform logits: each of the step_size positions is equally likely.
        self.pseudo_logits = tf.ones(shape=(1, step_size))
        self.num_masked = num_samples // step_size
        # Starting row index of each slice: [0, step_size, 2*step_size, ...]
        self.offsets = tf.expand_dims(
            tf.range(self.num_masked) * step_size, axis=0)
        self.mask = tf.zeros((self.num_masked, num_features))

    def mask_slices(self, tensor):
        # One random offset per slice, drawn uniformly from [0, step_size).
        slice_ids = tf.random.categorical(
            self.pseudo_logits, self.num_masked, dtype=tf.int32)
        # Turn per-slice offsets into absolute row indices of shape (num_masked, 1).
        slice_ids = tf.reshape(slice_ids + self.offsets, (self.num_masked, 1))
        # Zero out the selected rows.
        return tf.tensor_scatter_nd_update(tensor, slice_ids, self.mask)


num_samples = 20
num_features = 3
step_size = 5

sm = SliceMasker(num_samples, num_features, step_size)


example = tf.random.normal(shape=(num_samples, num_features))

print("before masking:")
print(example.numpy())
masked = sm.mask_slices(example)
print("after masking:")
print(masked.numpy())

Output:

before masking:
[[ 0.2977529  -1.0069662   0.18448827]
 [ 1.4570498   0.416826   -1.9352742 ]
 [ 0.1615839  -0.0706018   0.79011333]
 [ 0.1402558  -1.285141   -0.2618603 ]
 [ 2.360426   -0.5306141   0.02061797]
 [ 0.6496891   0.02275689 -0.10618226]
 [ 0.5680644  -0.5271083   1.4394306 ]
 [-0.62021923 -0.49577036 -0.9902582 ]
 [-0.47593793  1.0605363  -0.13049011]
 [ 1.4284478   0.43213463 -0.55216306]
 [ 0.32337463  1.4737866   0.21306337]
 [-1.3215414   0.47005925 -0.7038767 ]
 [-1.2404966   1.548273   -0.77220184]
 [-1.0531787   0.17857324 -0.28259423]
 [ 0.30251816  0.2874606   0.31080458]
 [-0.4658653  -0.4285968   1.1454991 ]
 [-0.87822616 -1.8857303  -0.3963113 ]
 [ 0.2780536  -0.6733958   0.11874776]
 [-3.01591    -0.7286207  -1.9439988 ]
 [-0.01809683 -0.38306755 -1.4614321 ]]
after masking:
[[ 0.2977529  -1.0069662   0.18448827]
 [ 0.          0.          0.        ]
 [ 0.1615839  -0.0706018   0.79011333]
 [ 0.1402558  -1.285141   -0.2618603 ]
 [ 2.360426   -0.5306141   0.02061797]
 [ 0.6496891   0.02275689 -0.10618226]
 [ 0.          0.          0.        ]
 [-0.62021923 -0.49577036 -0.9902582 ]
 [-0.47593793  1.0605363  -0.13049011]
 [ 1.4284478   0.43213463 -0.55216306]
 [ 0.          0.          0.        ]
 [-1.3215414   0.47005925 -0.7038767 ]
 [-1.2404966   1.548273   -0.77220184]
 [-1.0531787   0.17857324 -0.28259423]
 [ 0.30251816  0.2874606   0.31080458]
 [-0.4658653  -0.4285968   1.1454991 ]
 [-0.87822616 -1.8857303  -0.3963113 ]
 [ 0.2780536  -0.6733958   0.11874776]
 [-3.01591    -0.7286207  -1.9439988 ]
 [ 0.          0.          0.        ]]
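To connect this back to the original custom-dropout question, the same masking could be wrapped into a Keras layer roughly as follows (a hedged sketch; the layer name and the training-flag handling are my own assumptions, not part of the answer itself):

```python
import tensorflow as tf

class SliceDropout(tf.keras.layers.Layer):
    """Zeroes one random row per slice of `step_size` rows (sketch)."""

    def __init__(self, num_samples, num_features, step_size, **kwargs):
        super().__init__(**kwargs)
        self.pseudo_logits = tf.ones(shape=(1, step_size))
        self.num_masked = num_samples // step_size
        self.offsets = tf.expand_dims(
            tf.range(self.num_masked) * step_size, axis=0)
        self.mask = tf.zeros((self.num_masked, num_features))

    def call(self, inputs, training=None):
        if not training:
            return inputs  # like dropout, do nothing at inference time
        slice_ids = tf.random.categorical(
            self.pseudo_logits, self.num_masked, dtype=tf.int32)
        slice_ids = tf.reshape(slice_ids + self.offsets, (self.num_masked, 1))
        return tf.tensor_scatter_nd_update(inputs, slice_ids, self.mask)

layer = SliceDropout(num_samples=20, num_features=3, step_size=5)
out = layer(tf.random.normal((20, 3)), training=True)
print(out.shape)  # (20, 3)
```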

I hope this is the functionality you desired. Let me know if it fits; if not, I will try to change it.

Answered By: Blindschleiche