Reply to Issues and Errors when running tensorflow on GPU, but not CPU.
Hi Apple support,

Checking to see if there is any resolution to this problem. I have the same issue (details below) when training an LSTM-based seq2seq model on an M3 Pro with tensorflow 2.15.1 and tensorflow-metal 1.1.0. The error message goes away if I upgrade to tensorflow 2.16 or 2.17, but training accuracy drops drastically, so I had to switch back to 2.15. With tensorflow 2.15, training accuracy and inference are much better on the CPU (i.e., after uninstalling tensorflow-metal) than on the GPU, but training takes about 3x longer on the CPU. There is no benefit to using the M3 Pro GPU if it is faster but produces poor results. Is the error message below perhaps responsible for the poor results?

2024-08-06 15:27:22.619169: W tensorflow/core/common_runtime/type_inference.cc:339] Type inference failed. This indicates an invalid graph that escaped type checking. Error message: INVALID_ARGUMENT: expected compatible input types, but input 1: type_id: TFT_OPTIONAL args { type_id: TFT_PRODUCT args { type_id: TFT_TENSOR args { type_id: TFT_INT32 } } } is neither a subtype nor a supertype of the combined inputs preceding it: type_id: TFT_OPTIONAL args { type_id: TFT_PRODUCT args { type_id: TFT_TENSOR args { type_id: TFT_FLOAT } } } for Tuple type infernce function 0 while inferring type of node 'cond_36/output/_23'
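In case it helps anyone comparing CPU vs GPU results, here is a minimal sketch of how one could check whether the Metal GPU is visible and force TensorFlow to run on the CPU for a single run, without uninstalling tensorflow-metal. This is just how I would try it with the standard tf.config APIs, not a fix for the type inference warning itself.

```python
import tensorflow as tf

# Show whether the Metal GPU is registered with TensorFlow.
print(tf.config.list_physical_devices('GPU'))

# Hide the GPU for this process so all ops fall back to the CPU.
# This must be called before any op runs (i.e. before building the model).
tf.config.set_visible_devices([], 'GPU')

# Model building and training would go here; everything now executes on CPU.
```

Running the same training script with and without the set_visible_devices call makes it easier to compare accuracy and speed between CPU and GPU under the same package versions.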