Bug in NumberFormatter..?

Please tell me if I'm missing something here - run these three lines of code in a Playground:


import UIKit

let numberFormatter = NumberFormatter()

let myformattedNumber = "\(numberFormatter.string(from: NSNumber(value: 10660066111620287))!)"


gives me this output for myformattedNumber:


"10660066111620288"


See the difference: 7 vs. 8. Can anybody tell me what is going on here..?


(I found this problem in my app and first thought I had a bug in my algorithm, but the above is where the problem actually sits. I do need the formatting - the related code is omitted here so as not to distract from the issue - so please don't suggest just not using the formatter 🙂)

You are asking for 16-digit accuracy. That's a pretty big number to handle on a 64-bit machine, maybe too big. Once you need that much accuracy you are subject to the whims of the compiler. Try creating the NSNumber specifying that it is an NSInteger. Or try dividing the work into two pieces, one operating on the top half of the number and the other on the bottom half: you can get the top half with bigNumber/100000000 and the bottom half with bigNumber%100000000. You can't format a 16-digit number reliably in one go.
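Not part of the reply above, but here is a minimal sketch of that split-and-format idea. The divisor of 100000000 comes from the suggestion; the .decimal style and the zero-padding of the lower half are my own assumptions:

Code Block
import Foundation

let formatter = NumberFormatter()
formatter.numberStyle = .decimal

let bigNumber: Int64 = 10_660_066_111_620_287

// Split into two halves that each fit comfortably within Double precision.
let top = bigNumber / 100_000_000      // high-order digits
let bottom = bigNumber % 100_000_000   // low-order 8 digits

// Format the top half normally; the bottom half must keep its leading zeros,
// so pad it to 8 digits (group separators across the seam are not handled here).
let topString = formatter.string(from: NSNumber(value: top)) ?? "\(top)"
let bottomString = String(format: "%08lld", bottom)

print(topString + bottomString)   // "106,600,661" + "11620287" - separators only in the top half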

I played around with this a little, and the value 10660066111620287 turns out to be a 54-bit number (0x25DF460DDD84BF in hex), which is one bit more than the 53-bit significand of a Double, so it can't be stored exactly if the formatter goes through a Double internally. Interestingly, the value 106600661116202871 (that is, the original value multiplied by 10 with 1 added) is displayed correctly by the number formatter even though it's bigger.


So there is still something odd going on here (in particular, why the larger value comes out right), and I don't know the full story. A bug report is probably the next step, with some workaround such as displaying the value in two parts, as you suggested, in the interim.
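If that Double-rounding explanation is right, it is easy to check in a Playground; here is my own quick sketch (not from the posts above):

Code Block
import Foundation

let exact: Int64 = 10_660_066_111_620_287   // needs 54 bits

// Round-trip the value through a Double, which has a 53-bit significand.
let viaDouble = Int64(Double(exact))

print(exact)       // 10660066111620287
print(viaDouble)   // 10660066111620288 - the same off-by-one the formatter shows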

I don't think I'm asking for 16-digit accuracy, as the number I tried this with is an integer; no decimal fractions are involved.


Anyway, I found this discussion at https://stackoverflow.com/questions/42046106/nsnumberformatter-stringfrom-maximum-possible-value-exceeded-does-not-result which appears to suggest that NumberFormatter can't handle numbers beyond about 2^53 properly.


Meanwhile I found out that NSLocalizedString doesn't have that issue, so I'm using it as a workaround in order to avoid adding the thousand separators myself (which is not a big problem, but likely slower). So it looks like at least Apple's localization folks got that fixed - good for them (and for me 🙂)

>I don't think I'm asking for 16-digit accuracy, as the number I tried this with is an integer; no decimal fractions are involved.


The difference between the 7 and the 8 is one part in 10^16, so you are asking for exactly that level of accuracy.

Apple - this is still an issue; I just tried it again (in XC 10.1).

>just tried it again (in XC 10.1).


Xcode 10.1 was released on the date this thread started, so I'm not sure what you mean by 'still', unless you meant today's 10.2 beta?

I ran a little test. The maximum number that NumberFormatter handles correctly is 9007199254740992, or 0x20000000000000, i.e. 2^53, which happens to be the largest value up to which a Double can represent every integer exactly. Apparently JavaScript's Number.MAX_SAFE_INTEGER (2^53 - 1) exists for the same reason.
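For anyone who wants to reproduce that, here is roughly what such a test can look like (my own sketch; the exact procedure of the original test wasn't given):

Code Block
import Foundation

let formatter = NumberFormatter()

// 9007199254740992 is 2^53; everything up to and including it formats exactly.
for value in Int64(9_007_199_254_740_992)...Int64(9_007_199_254_740_995) {
    let formatted = formatter.string(from: NSNumber(value: value)) ?? "nil"
    print("\(value) -> \(formatted)")
}
// On affected versions the output diverges right after 2^53,
// e.g. 9007199254740993 comes back as 9007199254740992.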

Please try the .string(for:) method with the Decimal type, and voilà.

Code Block
let myFormattedNumber = numberFormatter.string(for: Decimal(10660066111620287))
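// should give "10660066111620287" - the Decimal keeps all the digits, so the last one is no longer rounded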
