This code causes a kernel restart, but it runs fine if I remove the "recurrent_dropout" argument from the LSTM layer or set it to zero.
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(sequence_length, raw_data.shape[-1]))
x = layers.LSTM(32, recurrent_dropout=0.25)(inputs)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(1)(x)
model = keras.Model(inputs, outputs)

callbacks = [
    keras.callbacks.ModelCheckpoint("jena_lstm_dropout.keras",
                                    save_best_only=True)
]
model.compile(optimizer="rmsprop", loss="mse", metrics=["mae"])
history = model.fit(train_dataset,
                    epochs=50,
                    validation_data=val_dataset,
                    callbacks=callbacks)
The code is taken straight from the book Deep Learning with Python and works in Google Colab.
Environment: macOS 12.1, tensorflow-macos with the Metal plugin, JupyterLab 3.2.8.
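For reference, here is a minimal, self-contained sketch of the variant that does not crash for me (recurrent_dropout omitted). The sequence length and feature count are placeholder values standing in for the book's sequence_length and raw_data.shape[-1], and the random input is only a smoke test, not the Jena dataset:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Placeholder values; the book derives these from the Jena weather data.
sequence_length = 120
num_features = 14

inputs = keras.Input(shape=(sequence_length, num_features))
x = layers.LSTM(32)(inputs)   # recurrent_dropout left out entirely
x = layers.Dropout(0.5)(x)    # plain Dropout after the LSTM is unaffected
outputs = layers.Dense(1)(x)
model = keras.Model(inputs, outputs)
model.compile(optimizer="rmsprop", loss="mse", metrics=["mae"])

# Smoke test on dummy data: one scalar prediction per sample.
pred = model.predict(np.zeros((2, sequence_length, num_features)), verbose=0)
print(pred.shape)  # (2, 1)
```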
The Jupyter server logs show:
2022-01-28 12:08:38.703448: I tensorflow/core/grappler/optimizers/custom_graph_optimizer_registry.cc:112] Plugin optimizer for device_type GPU is enabled.
[I 2022-01-28 12:08:41.871 ServerApp] AsyncIOLoopKernelRestarter: restarting kernel (1/5), keep random ports
[W 2022-01-28 12:08:41.872 ServerApp] kernel 7e7dc757-87dc-426a-8769-2e152e81d7b4 restarted
[W 2022-01-28 12:08:41.872 ServerApp] kernel 7e7dc757-87dc-426a-8769-2e152e81d7b4 restarted
[W 2022-01-28 12:08:41.873 ServerApp] kernel 7e7dc757-87dc-426a-8769-2e152e81d7b4 restarted