Keras TimeDistributed input_shape mismatch

Question:

I’m trying to build a model with a TimeDistributed Dense layer, but I keep getting this error:

ValueError: `TimeDistributed` Layer should be passed an `input_shape` with at least 3 dimensions, received: (None, 16)

Am I missing something? The data has the format shown in the snippet below. The model is simplified, but the error is the same.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, TimeDistributed, Dense
import numpy as np


data = np.random.random((100, 10, 32))
labels = np.random.randint(2, size=(100, 10, 1))

model = Sequential()

model.add(LSTM(16, input_shape=(10, 32)))
model.add(TimeDistributed(Dense(10, activation='sigmoid')))

model.compile(loss='binary_crossentropy', optimizer='adam')
model.fit(data, labels, epochs=10, batch_size=32)
Asked By: Mateusz Dorobek


Answers:

Try changing the number of units in your output layer to 1 (to match your labels, which have shape (100, 10, 1)) and setting return_sequences=True in your LSTM layer so it returns a 3D output with one vector per timestep, which is what TimeDistributed expects:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, TimeDistributed, Dense
import numpy as np


data = np.random.random((100, 10, 32))
labels = np.random.randint(2, size=(100, 10, 1))

model = Sequential()

model.add(LSTM(16, input_shape=(10, 32), return_sequences=True))  # output: (None, 10, 16)
model.add(TimeDistributed(Dense(1, activation='sigmoid')))  # one prediction per timestep: (None, 10, 1)

model.compile(loss='binary_crossentropy', optimizer='adam')
model.fit(data, labels, epochs=10, batch_size=32)
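
As a quick sanity check, you can print the model's output shape to confirm that each of the 10 timesteps now gets its own sigmoid prediction:

model.summary()            # LSTM now outputs (None, 10, 16) because return_sequences=True
print(model.output_shape)  # (None, 10, 1), matching the labels of shape (100, 10, 1)

Note that in TensorFlow 2.x, Dense applied to a 3D input already operates on the last axis, so Dense(1, activation='sigmoid') without the TimeDistributed wrapper would produce the same output shape; the wrapper mainly makes the per-timestep intent explicit.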
Answered By: AloneTogether