This model has not yet been built error on model.summary()

Question:

I have a Keras model defined as follows:

import tensorflow as tf
from tensorflow.keras.layers import Layer, Conv2D, AveragePooling2D, Dense

# GeneralReLU is a custom activation layer defined elsewhere in my code.
class ConvLayer(Layer):
    def __init__(self, nf, ks=3, s=2, **kwargs):
        self.nf = nf
        self.grelu = GeneralReLU(leak=0.01)
        self.conv = (Conv2D(filters     = nf,
                            kernel_size = ks,
                            strides     = s,
                            padding     = "same",
                            use_bias    = False,
                            activation  = "linear"))
        super(ConvLayer, self).__init__(**kwargs)

    def rsub(self): return -self.grelu.sub
    def set_sub(self, v): self.grelu.sub = -v
    def conv_weights(self): return self.conv.weights[0]

    def build(self, input_shape):
        # No weight to train.
        super(ConvLayer, self).build(input_shape)  # Be sure to call this at the end

    def compute_output_shape(self, input_shape):
        output_shape = (input_shape[0],
                        input_shape[1] // 2,
                        input_shape[2] // 2,
                        self.nf)
        return output_shape

    def call(self, x):
        return self.grelu(self.conv(x))

    def __repr__(self):
        return f'ConvLayer(nf={self.nf}, activation={self.grelu})'
class ConvModel(tf.keras.Model):
    def __init__(self, nfs, input_shape, output_shape, use_bn=False, use_dp=False):
        super(ConvModel, self).__init__(name='mlp')
        self.use_bn = use_bn
        self.use_dp = use_dp
        self.num_classes = output_shape

        # backbone layers
        self.convs = [ConvLayer(nfs[0], s=1, input_shape=input_shape)]
        self.convs += [ConvLayer(nf) for nf in nfs[1:]]
        # classification layers
        self.convs.append(AveragePooling2D())
        self.convs.append(Dense(output_shape, activation='softmax'))

    def call(self, inputs):
        for layer in self.convs: inputs = layer(inputs)
        return inputs

I’m able to compile this model without any issues

>>> model.compile(optimizer=tf.keras.optimizers.Adam(lr=lr), 
              loss='categorical_crossentropy',
              metrics=['accuracy'])

But when I query the summary for this model, I see this error

>>> model = ConvModel(nfs, input_shape=(32, 32, 3), output_shape=num_classes)
>>> model.summary()
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-220-5f15418b3570> in <module>()
----> 1 model.summary()

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/network.py in summary(self, line_length, positions, print_fn)
   1575     """
   1576     if not self.built:
-> 1577       raise ValueError('This model has not yet been built. '
   1578                        'Build the model first by calling `build()` or calling '
   1579                        '`fit()` with some data, or specify '

ValueError: This model has not yet been built. Build the model first by calling `build()` or calling `fit()` with some data, or specify an `input_shape` argument in the first layer(s) for automatic build.

I’m providing input_shape for the first layer of my model, so why is it throwing this error?

Asked By: bachr


Answers:

The error says what to do:

This model has not yet been built. Build the model first by calling build()

model.build(input_shape) # `input_shape` is the shape of the input data
                         # e.g. input_shape = (None, 32, 32, 3)
model.summary()
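
For the model in the question, that would look roughly like this (a sketch, assuming nfs and num_classes are defined as in your session):

model = ConvModel(nfs, input_shape=(32, 32, 3), output_shape=num_classes)
model.build((None, 32, 32, 3))  # None is the batch dimension
model.summary()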
Answered By: Vlad

Another method is to pass an input_shape argument to the first layer, like this:

model = Sequential()
model.add(Bidirectional(LSTM(n_hidden, return_sequences=False, dropout=0.25,
                             recurrent_dropout=0.1),
                        input_shape=(n_steps, dim_input)))
Answered By: Rxma
# X is the training dataset (features only, excluding the target variable)

input_shape = X.shape  
model.build(input_shape) 
model.summary()
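
Equivalently, you can leave the batch dimension unspecified (a sketch, assuming X is a NumPy array of training features):

model.build((None,) + X.shape[1:])  # None lets the model accept any batch size
model.summary()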
Answered By: B. Kanani

There is a very big difference between a Keras subclassed model and the other Keras models (Sequential and Functional).

Sequential and Functional models are data structures that represent a DAG of layers. In simple words, a Functional or Sequential model is a static graph of layers built by stacking one on top of another like LEGO. So when you provide input_shape to the first layer, these (Functional and Sequential) models can infer the shapes of all the other layers and build the model. Then you can print the input/output shapes using model.summary().

On the other hand, a subclassed model is defined via the body of a call method written in Python. For a subclassed model there is no graph of layers: we cannot know how the layers are connected to each other (because that is defined in the body of call, not as an explicit data structure), so we cannot infer the input/output shapes. For a subclassed model, the input/output shapes are unknown until the model is first called with proper data. compile() therefore does a deferred compile and waits for real data. To infer the shapes of the intermediate layers, the model has to be run on real data; only then will model.summary() work. Without running the model on data, it throws the error you noticed. Please check the GitHub gist for the complete code.
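
As a quick, self-contained sketch of that point (TinyModel is just an illustrative name, not from the original code), calling a subclassed model once on a dummy batch is enough to build it:

import tensorflow as tf

class TinyModel(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.dense = tf.keras.layers.Dense(4)

    def call(self, x):
        return self.dense(x)

m = TinyModel()
# m.summary() here would raise "This model has not yet been built."
_ = m(tf.zeros((1, 8)))  # calling on a dummy batch builds every layer
m.summary()              # now works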

The following is a fuller example from the TensorFlow website.

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

class ThreeLayerMLP(keras.Model):

  def __init__(self, name=None):
    super(ThreeLayerMLP, self).__init__(name=name)
    self.dense_1 = layers.Dense(64, activation='relu', name='dense_1')
    self.dense_2 = layers.Dense(64, activation='relu', name='dense_2')
    self.pred_layer = layers.Dense(10, name='predictions')

  def call(self, inputs):
    x = self.dense_1(inputs)
    x = self.dense_2(x)
    return self.pred_layer(x)

def get_model():
  return ThreeLayerMLP(name='3_layer_mlp')

model = get_model()

(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(60000, 784).astype('float32') / 255
x_test = x_test.reshape(10000, 784).astype('float32') / 255

model.compile(loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              optimizer=keras.optimizers.RMSprop())

model.summary() # This will throw an error as follows
# ValueError: This model has not yet been built. Build the model first by calling `build()` or calling `fit()` with some data, or specify an `input_shape` argument in the first layer(s) for automatic build.

# Need to run with real data to infer shape of different layers
history = model.fit(x_train, y_train,
                    batch_size=64,
                    epochs=1)

model.summary()

Thanks!

Make sure you create your model properly. A small typo like the following may also cause a problem:

model = Model(some_input, some_output, "model-name")

while the correct code should be:

model = Model(some_input, some_output, name="model-name")
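
For instance, a minimal Functional-API sketch (the input and layer sizes here are arbitrary):

from tensorflow import keras

inputs = keras.Input(shape=(32,))
outputs = keras.layers.Dense(1)(inputs)
model = keras.Model(inputs, outputs, name="model-name")  # name passed as a keyword
model.summary()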
Answered By: Thanh Le

If your TensorFlow/Keras version is 2.5.0, import the Keras packages through tensorflow rather than standalone keras.

Not this:

from tensorflow import keras
from keras.models import Sequential
import tensorflow as tf

Like this:

from tensorflow import keras
from tensorflow.keras.models import Sequential
import tensorflow as tf
Answered By: Hsw Hesure

Recently changed versions accept:
from tensorflow.python.keras.models import Model

I was also facing the same error, so I removed model.summary(), and the issue was resolved, since the error arises if summary() is called before the model is built.

Here is the LINK to the documentation, which states:

Raises:
    ValueError: if `summary()` is called before the model is built.
Answered By: TariqS

Version issues with your TensorFlow/Keras installation can also be the reason for this.

I encountered the same problem while training an LSTM model for regression.

Error:

ValueError: This model has not yet been built. Build the model first
by calling build() or by calling the model on a batch of data.

Earlier:

from tensorflow.keras.models import Sequential

from tensorflow.python.keras.models import Sequential

Corrected:

from keras.models import Sequential
Answered By: Saurav Kumar