I was inspired by Swift going open source, and by the prospect of Swift's Foundation library becoming platform neutral, so I took a look at one of my flagship apps and contemplated making portions of it open source as well. Though I started nobly, I became dismayed.
I fell into a kind of Double vs. CGFloat "****" that I had been conveniently avoiding until now.
First, I refactored a ton of shipping code into a framework (you know, because iOS 8, Apple Watch, etc.). That framework is capable of being mostly platform neutral, like Swift itself is becoming, with just a small shim for each platform (for the moment: iOS, OS X, watchOS, and tvOS). But now I have Linux in my sights, with a dependency on Foundation only. I have been able to more or less factor out UIFont, UIColor, and the like into small, separate extensions that complete the circle for the existing platforms. What I haven't been able to do is reconcile the floating-point type for the new platform. The default in Swift is Double, which is great, because it's the same as CGFloat.
Except that it isn't.
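Here's the trap in miniature. CGFloat is a distinct struct whose underlying storage is Double on 64-bit architectures but Float on 32-bit ones, so it only looks interchangeable with Double (in the Swift I'm on, there is no implicit conversion between them):

```swift
import CoreGraphics

// 8 bytes on 64-bit platforms, 4 bytes on 32-bit ones (e.g. armv7, i386)
print(MemoryLayout<CGFloat>.size)

let d: Double = 0.1
// let c: CGFloat = d      // error: no implicit conversion between the two types
let c = CGFloat(d)         // explicit conversion always compiles
let roundTrip = Double(c)  // lossy on 32-bit, where CGFloat wraps a Float
```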
Here's the trap I fell into. Initially, to convert to Swift faster, I polluted my framework source code with CGFloat in order to eliminate conversion ****, but that requires importing CoreGraphics everywhere, and CoreGraphics is not part of Foundation. Now, faced with the prospect of removing CoreGraphics, I see myself adding one conversion function after another.
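Concretely, this is the sort of boilerplate I mean (`midpoint` and `centerX` are just illustrative helpers, not real API):

```swift
import CoreGraphics

// A routine written against Double, as a Foundation-only framework would want it...
func midpoint(_ a: Double, _ b: Double) -> Double {
    return (a + b) / 2
}

// ...forces an explicit conversion at every CoreGraphics boundary.
func centerX(of rect: CGRect) -> CGFloat {
    return CGFloat(midpoint(Double(rect.minX), Double(rect.maxX)))
}
```

Multiply that by every geometry-touching function in the framework and the shape of the problem becomes clear.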
This feels wrong.
Apple engineers and third party developers:
What is the general guidance for converting between the default Swift Double floating point type and CGFloat? Am I resigned to implementing converters and type coercions for every little thing?
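For context, the least-bad workaround I've sketched so far is a single alias in the per-platform shim (`NativeFloat` is a name I made up for illustration); I'd love to hear whether there's an intended approach that's better:

```swift
#if os(Linux)
    // Foundation-only platform: no CoreGraphics, so fall back to Double.
    public typealias NativeFloat = Double
#else
    import CoreGraphics
    public typealias NativeFloat = CGFloat
#endif

// Shared framework code is then written entirely against the alias:
public func lerp(from a: NativeFloat, to b: NativeFloat, by t: NativeFloat) -> NativeFloat {
    return a + (b - a) * t
}
```

This keeps the conversions at the platform boundary instead of scattered through the framework, but it still means touching every floating-point declaration once.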