Objective-C NSDecimalNumber imports into Swift as a Double, not Decimal?

I work on an app with currency calculations based on some old Objective-C code that represents prices as NSDecimalNumber.

Recently Swift 3 changed to import NSNumber as a native number type such as Int or Double. It appears that NSDecimalNumber is now forcibly imported as Double, losing all precision in our financial calculations. I have been unable to find a way to force Swift to import the number as a Decimal or NSDecimalNumber instead of a Double.

Is there any way to keep my financial calculations from being corrupted by rounding errors when mixing Objective-C and Swift, or do I need to convert all of my model objects from Objective-C to Swift to achieve that?

Replies

I am new to Swift, so please excuse me if I am misinterpreting your post.


Can you post a short example?

Are there specific functions or initialization calls that are not working for you?


I am monkeying with a Swift 3 playground, and don't see a problem using NSDecimalNumber.

With these lines:

import Foundation

// Construct from a string so the full 51-digit value reaches NSDecimalNumber intact.
var blarg: NSDecimalNumber = NSDecimalNumber(string: "1.2345678901234567890123456789012345678901234567890")
print(blarg)


I get the result:

1.23456789012345678901234567890123456789

The variable keeps only the leading significant digits of the original 51-digit value, but that matches the 38-digit mantissa limit described in the documentation.

If I try to assign "blarg" to a variable of type Double, I get the error:

cannot convert value of type 'NSDecimalNumber' to specified type 'Double'

...so it doesn't seem like "NSDecimalNumber is now being forcibly imported as Double".
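
As a cross-check, here is a minimal sketch (same playground, same blarg value) showing that getting a Double out of an NSDecimalNumber has to be an explicit, and lossy, step, while decimalValue preserves the precision:

import Foundation

let blarg = NSDecimalNumber(string: "1.2345678901234567890123456789012345678901234567890")

// let oops: Double = blarg        // error: cannot convert value of type 'NSDecimalNumber' to specified type 'Double'
let lossy = blarg.doubleValue      // explicit and lossy: a Double keeps only ~15-17 significant digits
let exact = blarg.decimalValue     // Decimal, preserving the 38-digit mantissa
print(lossy)
print(exact)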

(Edit: I also put the variable into an Obj-C class, and then read it into a Swift NSDecimalNumber. Same results.)

Please show some code which can reproduce your issue.

Recently Swift 3 changed to import NSNumber as a native number type such as Int or Double.

That’s not my experience. In Xcode 8.1 I set up an Objective-C header like this:

@import Foundation;

NS_ASSUME_NONNULL_BEGIN

@interface Test : NSObject

@property (nonatomic, strong, readwrite) NSDecimalNumber * test1;
@property (nonatomic, strong, readwrite) NSNumber * test2;

@end 

NS_ASSUME_NONNULL_END

The interface seen by Swift is this:

import Foundation

open class Test : NSObject {

    open var test1: NSDecimalNumber

    open var test2: NSNumber
}

NSDecimalNumber and NSNumber come across as objects.
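
If the goal is to keep the whole calculation out of Double, one option, just a sketch assuming the Test class above holds a price in test1 and some rate in test2, is to bridge to Decimal via decimalValue and do the arithmetic there:

import Foundation

let item = Test()
item.test1 = NSDecimalNumber(string: "19.99")
item.test2 = NSDecimalNumber(string: "0.075")   // hypothetical tax rate, also stored exactly

// NSNumber's decimalValue bridges to Swift's Decimal, so no Double is involved.
let price = item.test1.decimalValue
let rate  = item.test2.decimalValue

let total = price * (1 + rate)                  // exact decimal arithmetic
print(total)                                    // 21.48925

Decimal gets the usual arithmetic operators from the Foundation overlay, so the model objects can stay in Objective-C while the maths stays decimal.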

Share and Enjoy

Quinn “The Eskimo!”
Apple Developer Relations, Developer Technical Support, Core OS/Hardware

let myEmail = "eskimo" + "1" + "@apple.com"