ValueError: in user code: ValueError: No gradients provided for any variable for Autoencoder Tensorflow
Question:
I have the code below, and I get the following error during `fit`:

```
ValueError: No gradients provided for any variable: (['dense_12/kernel:0', 'dense_12/bias:0', 'dense_13/kernel:0', 'dense_13/bias:0'],). Provided grads_and_vars
is ((None, <tf.Variable 'dense_12/kernel:0' shape=(784, 2) dtype=float32>), (None, <tf.Variable 'dense_12/bias:0' shape=(2,) dtype=float32>), (None, <tf.Variable 'dense_13/kernel:0' shape=(2, 784) dtype=float32>), (None, <tf.Variable 'dense_13/bias:0' shape=(784,) dtype=float32>)).
```
I tried changing the encoder and decoder, but it didn't solve the problem. This is an autoencoder, which is why I use `X_train` and `X_test` as targets instead of `y_train` and `y_test`.
How can I solve this error?
Thank you for your feedback!
```python
import tensorflow as tf
from tensorflow.keras import Model, layers, losses
from tensorflow.keras.layers import Flatten

class autoencoder(Model):
    def __init__(self, output_dim=2):
        super(autoencoder, self).__init__()
        self.output_dim = output_dim
        self.encoder = tf.keras.Sequential([
            layers.Flatten(),
            layers.Dense(2, activation='relu')
        ])
        self.decoder = tf.keras.Sequential([
            layers.Dense(784, activation='relu')
        ])

    def encoder_call(self, inputs):
        inputs = Flatten()(inputs)
        return self.encoder(inputs)

    def call(self, x, train=None):
        x = Flatten()(x)
        encoded = self.encoder(x)
        decoded = self.decoder(encoded)
        return tf.reshape(x, (-1, 28, 28))

model = autoencoder()
model.compile(optimizer='adam', loss=losses.MeanSquaredError())
model.fit(X_train, X_train,
          epochs=1,
          shuffle=True,
          validation_data=(X_test, X_test))
```
Answers:
```python
return tf.reshape(x, (-1, 28, 28))
```
should be
```python
return tf.reshape(decoded, (-1, 28, 28))
```
Cause of the error: your input `x` is first passed through a `Flatten` layer, then through the `encoder` and `decoder` in turn, but `call` returns a reshaped version of `x` itself, not `decoded`. The model's output therefore never depends on any trainable variable, so TensorFlow cannot build a computational graph through the `Dense` layers, and backpropagation produces a `None` gradient for every variable, which is exactly what the error message reports.
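To illustrate, here is a minimal self-contained sketch (class and variable names are my own, and I use random data in place of the question's `X_train`) showing that once `call` returns `decoded`, every trainable variable receives a real gradient:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import Model, layers, losses

class FixedAutoencoder(Model):
    def __init__(self):
        super().__init__()
        self.encoder = tf.keras.Sequential([
            layers.Flatten(),
            layers.Dense(2, activation='relu'),
        ])
        self.decoder = tf.keras.Sequential([
            layers.Dense(784, activation='relu'),
        ])

    def call(self, x):
        encoded = self.encoder(x)
        decoded = self.decoder(encoded)
        # Returning `decoded` (not `x`) keeps the output connected
        # to the trainable variables, so gradients can flow.
        return tf.reshape(decoded, (-1, 28, 28))

# Random stand-in for MNIST-shaped input data.
x = np.random.rand(4, 28, 28).astype('float32')

model = FixedAutoencoder()
with tf.GradientTape() as tape:
    out = model(x)
    loss = losses.MeanSquaredError()(x, out)

# With the fix, no entry in `grads` is None.
grads = tape.gradient(loss, model.trainable_variables)
print(all(g is not None for g in grads))
```

With the original `return tf.reshape(x, ...)`, the same `tape.gradient` call returns a list of `None`s, reproducing the error from the question.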