Hello, I have a pre-trained DNN with eight binary output labels. It uses one output node with a Sigmoid activation function per output label. I want to use on-device personalisation to fine-tune this DNN. Is that possible?
I have found that only categorical cross entropy is available as a loss function for updatable classifiers in Core ML, but set_categorical_cross_entropy_loss()
requires a Softmax layer as its input, while my output layers use Sigmoid activations (because each classification is binary).
Is there a way to use the DNN as it is, or do I have to remove the final Sigmoid layer and instead add, per output label, two nodes followed by a Softmax layer?
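For context, the rewrite I'm considering should be lossless: a sigmoid over a single logit z is mathematically identical to a two-class softmax over the logits [z, 0], and binary cross entropy on the sigmoid then matches categorical cross entropy on that softmax. A quick stdlib-only sketch (plain Python, not Core ML code) showing the equivalence I'm relying on:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

z = 1.7  # an arbitrary logit for one binary output label
p_sigmoid = sigmoid(z)
# A two-class softmax over [z, 0] gives the same "positive" probability:
p_softmax = softmax([z, 0.0])[0]
assert abs(p_sigmoid - p_softmax) < 1e-12

# Consequently, binary cross entropy on the sigmoid output equals
# categorical cross entropy on the two-class softmax for the same target:
y = 1  # ground-truth label (1 = positive class)
bce = -(y * math.log(p_sigmoid) + (1 - y) * math.log(1.0 - p_sigmoid))
cce = -math.log(softmax([z, 0.0])[y == 0])  # index 0 if y == 1, else 1
assert abs(bce - cce) < 1e-12
```

So the two-node-plus-Softmax replacement would preserve the model's behaviour; my question is whether Core ML forces me to do it, or whether the Sigmoid outputs can be kept.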