Switching to Xcode 13, we got an issue in one of our unit tests that checks the proper conversion from JSON to our object model.
In our JSON we have some monetary amounts like {"amount": 90.21}. Until now we parsed them as Decimal without any issues.
But now we don't get a Decimal 90.21 anymore; we get Decimal 90.20999999...
This is reproducible only on iOS 15 devices, and only for values between 64 and 99 with 21 after the decimal point.
It looks like there is a conversion to double somewhere in between?
"It looks like there is a conversion to double somewhere in between?"

True. JSON stands for JavaScript Object Notation, and numeric values in JavaScript are represented in 64-bit binary floating point, that is, Double. So many JSON libraries may use Double as the internal representation of a JSON number, which means you had better not rely on the result of a conversion to Decimal, even in Xcode 12.
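As a quick illustration (independent of JSONDecoder, just showing the Double round trip), compare a Decimal created from a Double with one parsed from the decimal text; the exact trailing digits may differ, but the two values are not equal:

import Foundation

// A Decimal created from a Double already carries the binary rounding error;
// a Decimal parsed from the text keeps the exact value.
let viaDouble = Decimal(90.21)             // goes through Double
let viaString = Decimal(string: "90.21")!  // parsed from the decimal text

print(viaDouble)               // something like 90.20999999999999...
print(viaString)               // 90.21
print(viaDouble == viaString)  // false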
Please try this:
import Foundation

struct DTO: Decodable {
    let amount: Decimal

    // Uncomment the following lines and try it later...
    // enum CodingKeys: CodingKey {
    //     case amount
    // }
    //
    // init(from decoder: Decoder) throws {
    //     let container = try decoder.container(keyedBy: CodingKeys.self)
    //     let doubleValue = try container.decode(Double.self, forKey: .amount)
    //     amount = Decimal(doubleValue)
    // }
}
var value = Decimal(string: "0.00")!
while value <= Decimal(string: "99.99")! {
    defer { value += Decimal(string: "0.01")! }
    let backendResponseText = """
    {"amount": \(value)}
    """
    let backendResponse = backendResponseText.data(using: .utf8)!
    print(backendResponseText)
    let decoded: DTO = try! JSONDecoder().decode(DTO.self, from: backendResponse)
    debugPrint(decoded.amount, value == decoded.amount)
}
Result (Xcode 12.5.1)
{"amount": 0}
0 true
{"amount": 0.01}
0.01 true
{"amount": 0.02}
0.02 true
{"amount": 0.03}
0.03 true
{"amount": 0.04}
0.04 true
{"amount": 0.05}
0.05 true
{"amount": 0.06}
0.06 true
{"amount": 0.07}
0.07000000000000001024 false
{"amount": 0.08}
0.08 true
{"amount": 0.09}
0.09 true
{"amount": 0.1}
0.1 true
...
Result (Xcode 13)
{"amount": 0}
0 true
{"amount": 0.01}
0.01 true
{"amount": 0.02}
0.02 true
{"amount": 0.03}
0.03 true
{"amount": 0.04}
0.04 true
{"amount": 0.05}
0.05 true
{"amount": 0.06}
0.06 true
{"amount": 0.07}
0.07000000000000001024 false
{"amount": 0.08}
0.08 true
{"amount": 0.09}
0.09 true
{"amount": 0.1}
0.1 true
...
There are some differences in the details, but it shows that JSONDecoder has cases where it cannot decode values to Decimal as expected, in both Xcode 12.5.1 and Xcode 13.0.
You may need to choose a good workaround and test it with all possible values, or avoid using a JSON number and represent decimal values as JSON strings instead.
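If you can change the backend contract, the JSON-string approach can look like the sketch below (assuming a payload such as {"amount": "90.21"}; StringAmountDTO is just a hypothetical name for illustration):

import Foundation

// Minimal sketch: the backend sends the amount as a string, and we parse it
// into Decimal ourselves, bypassing the Double representation entirely.
struct StringAmountDTO: Decodable {
    let amount: Decimal

    enum CodingKeys: CodingKey {
        case amount
    }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        let text = try container.decode(String.self, forKey: .amount)
        guard let decimal = Decimal(string: text) else {
            throw DecodingError.dataCorruptedError(
                forKey: .amount,
                in: container,
                debugDescription: "Invalid decimal string: \(text)"
            )
        }
        amount = decimal
    }
}

let json = #"{"amount": "90.21"}"#.data(using: .utf8)!
let dto = try! JSONDecoder().decode(StringAmountDTO.self, from: json)
print(dto.amount)  // 90.21

This keeps the exact decimal digits because the value never exists as a Double anywhere in the pipeline, but it only works if the backend agrees to send strings for monetary amounts.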