Hey, I was having the same issue when converting a PyTorch model over (softmaxND). I think I found a workaround that may or may not work for you. I'm not a Keras user, so apologies for my lack of knowledge of that framework. In PyTorch you're encouraged not to add the softmax layer at the end of the model; instead you use one of the built-in loss functions (e.g. `CrossEntropyLoss`), which applies it as part of the loss evaluation for numerical-stability reasons. So it's actually better for me if my original torch model doesn't include that layer. Either way, if you can peel that softmax layer off, you can just add it back afterwards to make coremltools happy.
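To unpack the "numerical reasons" bit: the fused losses compute log-softmax with the max-subtraction trick, which is why the model itself doesn't need a softmax layer. Here's a minimal pure-Python sketch of that trick (no torch required; the function name is mine, not from any library):

```python
import math

def log_softmax(logits):
    # Subtracting the max before exponentiating keeps every exp() argument
    # <= 0, so nothing overflows; the result is mathematically identical
    # to log(softmax(logits)).
    m = max(logits)
    log_sum = m + math.log(sum(math.exp(x - m) for x in logits))
    return [x - log_sum for x in logits]

# A naive math.exp(1000.0) overflows a Python float, so computing
# log(softmax) directly would raise OverflowError. The stable version is fine:
print(log_softmax([1000.0, 1001.0, 1002.0]))
```

Exponentiating the returned values gives probabilities that sum to 1, same as if the softmax layer were still in the model, which is why peeling it off and adding it back later is safe.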
Here is a snippet of my function.
```python
import coremltools as ct
from coremltools.models.neural_network import NeuralNetworkBuilder, SgdParams


def make_updateable(mlmodel, update_layer_names):
    model_spec = mlmodel.get_spec()
    builder = NeuralNetworkBuilder(spec=model_spec, mode='classifier')
    # Manually add a softmax at the end
    builder.add_softmax(name='output_prob', input_name='output',
                        output_name='output_prob')
    # Mark your trainable layers (pass their names in update_layer_names)
    builder.make_updatable(update_layer_names)
    print(builder.inspect_layers())
    print(builder.layers)
    print(builder.inspect_output_features())
    builder.set_sgd_optimizer(SgdParams(lr=0.01, batch=10))
    builder.set_epochs(10)
    # The loss reads from the softmax output we just added
    builder.set_categorical_cross_entropy_loss(name='lossLayer', input='output_prob')
    # Save the updatable Core ML model
    updateable_model = ct.models.MLModel(model_spec)
    updateable_model.save('classifier_updateable.mlmodel')
```
Hope that helps.