TensorFlow-Metal Breaks Random Initialization

Normally, when a model is instantiated multiple times, it gets different weights each time (important for statistical significance testing). The current version of tensorflow-metal (0.4) appears to reuse the same initial weights across instantiations, so every model comes out identical:

from tensorflow.keras import Sequential, layers

def get_model():
    model = Sequential()
    model.add(layers.Dense(5, activation='relu', input_shape=(4, 4)))
    return model

if __name__ == "__main__":
    model1 = get_model()
    model2 = get_model()
    # If initialization works correctly, this difference is non-zero.
    print(model1.weights[0] - model2.weights[0])

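A possible workaround until a fix ships (a sketch, not an official recommendation) is to bypass the global RNG path entirely by passing an explicitly seeded initializer per instantiation; the `seed` argument here is a standard parameter of `tf.keras.initializers.GlorotUniform`:

```python
import tensorflow as tf
from tensorflow.keras import Sequential, layers

def get_model(seed):
    # Explicitly seeded initializer: distinct seeds yield distinct weights
    # regardless of how the backend handles the global RNG state.
    init = tf.keras.initializers.GlorotUniform(seed=seed)
    model = Sequential()
    model.add(layers.Dense(5, activation='relu', input_shape=(4, 4),
                           kernel_initializer=init))
    return model

model1 = get_model(seed=1)
model2 = get_model(seed=2)
print(model1.weights[0] - model2.weights[0])
```

The trade-off is that seeds must be managed by the caller, but that is often desirable anyway when running repeated trials for significance testing.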
Output without tensorflow-metal (as desired, the two models' weights differ):

tf.Tensor(
[[ 1.0636648  -0.10816181  0.8423695   1.3752697   0.38691664]
 [ 0.2402662   0.38139135 -0.19254395 -0.24551326  0.13166189]
 [-0.24854952  1.3374841   0.9716329  -0.21249878 -0.34604508]
 [ 0.5040202   0.120031    0.13515717 -0.40721053  0.29544616]], shape=(4, 5), dtype=float32)

Output with tensorflow-metal (the difference is all zeros, i.e. the weights are identical):

tf.Tensor(
[[0. 0. 0. 0. 0.]
 [0. 0. 0. 0. 0.]
 [0. 0. 0. 0. 0.]
 [0. 0. 0. 0. 0.]], shape=(4, 5), dtype=float32)

Any ideas on the root cause or timeline for a fix?

Answered by Frameworks Engineer in 717135022

This fix has been included in tensorflow-metal==0.5.0
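To pick up the fix, upgrading the package in the active environment should suffice; something like:

```shell
# Upgrade tensorflow-metal to a version containing the fix (>= 0.5.0)
python -m pip install --upgrade 'tensorflow-metal>=0.5.0'
# Confirm the installed version
python -m pip show tensorflow-metal
```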
