Tensorflow-metal for M2 Max chip

Hi,

I have installed tensorflow-metal on my MacBook Pro (38-core GPU / 32 GB RAM), and the training time is acceptable when I run my first experiment (a transformer model using only the encoder block):

Batches: 4
Data X training: (32, 1024, 512), EEG data of 1 subject
Epochs: 200
Time? It takes around 40 minutes to finish.

But when I run my second experiment:

Batches: 72
Data X training: (1024, 1024, 512), EEG data of 32 subjects
Epochs: 1000
Time? After 6 hours, it is still on epoch 1!

Is this normal? Is the tensorflow-metal plugin not using the 38-core GPU of the computer?
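
In case it is useful, this is the quick check I can run to confirm whether TensorFlow sees the Metal GPU at all (just a minimal sketch):

```python
# Quick check: if the tensorflow-metal plugin is registered, the M2 Max GPU
# should appear in TensorFlow's physical device list.
import tensorflow as tf

print(tf.__version__)
print(tf.config.list_physical_devices("GPU"))
# On a working tensorflow-metal install this should print something like:
# [PhysicalDevice(name='/physical_device:GPU:0', device_type='GPU')]
```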

Hi @rick_licona!

Are you able to provide us with a small code snippet that reproduces this issue? You can replace the training data with random arrays of the same shape, since that would not affect the compute performance during training, which is what we are interested in investigating here. Also let us know which versions of the tensorflow-macos and tensorflow-metal wheels you have installed on your machine and which macOS version you are on.
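
Something along the lines of the sketch below would be enough. The encoder block here is a generic Keras placeholder rather than your actual model, the head count, feed-forward width, and batch size are assumptions, and the training data is just random arrays with the shape you reported. The prints at the top cover the version information we are asking about.

```python
# Rough sketch of a reproduction script with placeholder model settings and
# random data of the reported shape (1024, 1024, 512).
import platform
from importlib.metadata import version  # assumes pip-installed wheels

import numpy as np
import tensorflow as tf

# Version information to include alongside the snippet.
print("macOS:", platform.mac_ver()[0])
print("tensorflow-macos:", version("tensorflow-macos"))
print("tensorflow-metal:", version("tensorflow-metal"))
print("GPUs:", tf.config.list_physical_devices("GPU"))

NUM_SAMPLES, SEQ_LEN, D_MODEL = 1024, 1024, 512   # reported data shape
BATCH_SIZE = 32                                   # placeholder

def encoder_block(x, num_heads=8, ff_dim=2048, dropout=0.1):
    # Self-attention sublayer with residual connection and layer norm.
    attn = tf.keras.layers.MultiHeadAttention(
        num_heads=num_heads, key_dim=D_MODEL // num_heads)(x, x)
    x = tf.keras.layers.LayerNormalization()(
        x + tf.keras.layers.Dropout(dropout)(attn))
    # Position-wise feed-forward sublayer.
    ff = tf.keras.layers.Dense(ff_dim, activation="relu")(x)
    ff = tf.keras.layers.Dense(D_MODEL)(ff)
    return tf.keras.layers.LayerNormalization()(
        x + tf.keras.layers.Dropout(dropout)(ff))

inputs = tf.keras.Input(shape=(SEQ_LEN, D_MODEL))
x = encoder_block(inputs)
x = tf.keras.layers.GlobalAveragePooling1D()(x)
outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy")

# Random stand-in for the EEG data: same shape, same compute cost per step.
x_train = np.random.rand(NUM_SAMPLES, SEQ_LEN, D_MODEL).astype("float32")
y_train = np.random.randint(0, 2, size=(NUM_SAMPLES, 1)).astype("float32")

# A couple of epochs are enough to time the training steps.
model.fit(x_train, y_train, batch_size=BATCH_SIZE, epochs=2)
```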

In any case, the training still being in the first epoch after 6 hours sounds like something might be going wrong there.
