iOS 15 - New JSONDecoding Decimals issue

After switching to Xcode 13, one of our unit tests that checks the conversion from JSON to our object model started failing.

In our JSON we have monetary amounts such as {"amount": 90.21}. Until now we parsed them as Decimal without any issues.

But now we no longer get the Decimal 90.21; we get the Decimal 90.20999999...

This is reproducible only on iOS 15 devices, and only for values between 64 and 99 with 21 after the decimal point.

It looks like there is a conversion to Double somewhere in between?

Answered by OOPer in 690564022 (see the Accepted Answer below).

Can you share the relevant parts of your code (e.g. the model and the decoding technique), and a sample of your JSON?

Are you sure the problem is new in Xcode 13? There are discussions on the Swift forums about this that are not tied to recent Swift versions: https://forums.swift.org/t/parsing-decimal-values-from-json/6906

Maybe you could implement the solution described here by defining your own extensions:

https://stackoverflow.com/questions/55131400/swift-decode-imprecise-decimal-correctly
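
For reference, a minimal sketch of one common variant of the approach discussed there: decode the value as a Double, then round-trip through the Double's description, which is the shortest string that parses back to the same Double. The helper name decodeLossyDecimal is made up for illustration:

import Foundation

extension KeyedDecodingContainer {
    // Decode a Double, then convert to Decimal via its shortest
    // round-trip string form, e.g. "90.21" instead of "90.20999999999999".
    // decodeLossyDecimal is a hypothetical name, not a Foundation API.
    func decodeLossyDecimal(forKey key: Key) throws -> Decimal {
        let doubleValue = try decode(Double.self, forKey: key)
        guard let decimalValue = Decimal(string: "\(doubleValue)") else {
            throw DecodingError.dataCorruptedError(
                forKey: key, in: self,
                debugDescription: "Cannot represent \(doubleValue) as Decimal")
        }
        return decimalValue
    }
}

It would then be called from a custom init(from:), e.g. amount = try container.decodeLossyDecimal(forKey: .amount).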

Moved to a separate answer:

import Foundation

struct DTO: Decodable {
    let amount: Decimal
}

let backendResponse = """
{"amount": 90.21}
""".data(using: .utf8)!

let decoded: DTO = try! JSONDecoder().decode(DTO.self, from: backendResponse)

debugPrint(decoded.amount)

This results in "90.20999999999999" in a playground.

Hmm, executing my unit tests on older (14.3) simulators works; the issue only comes up when running tests on iOS 15 simulators (both with Xcode 13). Our code hasn't changed for a year or so :D

I just ran my playground snippet with Xcode 12.5.1 and it worked there (output: 90.21).

Also interesting:

import Foundation

struct DTO: Decodable {
    let amount: Double
    var decimal: Decimal {
        // Convert the decoded Double to Decimal after the fact.
        Decimal(amount)
    }
}

amount and decimal both print as 90.21.

Or (better) try this:

import Foundation

struct DTO: Decodable {
    let amount: Decimal

    enum CodingKeys: CodingKey {
        case amount
    }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        // Decode the JSON number as Double first, then convert to Decimal.
        let doubleValue = try container.decode(Double.self, forKey: .amount)
        amount = Decimal(doubleValue)
    }
}

This basically keeps things simple on the JSON side (JSON can do Doubles), and then we convert to the Decimal that we really want.

Plug this updated DTO into your project, and the rest of your code remains as-is.
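
As a quick sanity check, the earlier playground snippet can be rerun with this DTO (assuming the struct above is in scope):

let backendResponse = """
{"amount": 90.21}
""".data(using: .utf8)!

let decoded = try! JSONDecoder().decode(DTO.self, from: backendResponse)
debugPrint(decoded.amount)   // per the replies above, this prints 90.21

Note the caveat in the accepted answer below, though: whichever workaround you choose, test it against all the values you expect to receive.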

Accepted Answer

It looks like there is a conversion to Double somewhere in between?

True. As you know, JSON is JavaScript Object Notation, and numeric values in JavaScript are represented in 64-bit binary floating point (Double). So many JSON libraries may use Double as the internal representation of JSON numbers.

So you had better not rely on the result of decoding to Decimal, even in Xcode 12.
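
To illustrate (my own standalone snippet, not from the original post): 90.21 has no exact binary floating-point representation, so the nearest Double is slightly below it.

import Foundation

print(String(format: "%.17f", 90.21))   // prints something like 90.20999999999999375
print(Decimal(string: "90.21")!)        // prints 90.21; exact as a Decimal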

Please try this:

import Foundation

struct DTO: Decodable {
    let amount: Decimal

    // Uncomment the following lines to try the Double-based workaround later...
//    enum CodingKeys: CodingKey {
//        case amount
//    }
//
//    init(from decoder: Decoder) throws {
//        let container = try decoder.container(keyedBy: CodingKeys.self)
//        let doubleValue = try container.decode(Double.self, forKey: .amount)
//        amount = Decimal(doubleValue)
//    }
}

// Sweep amounts from 0.00 to 99.99 in 0.01 steps and check whether each
// value survives the JSON round trip into Decimal unchanged.
var value = Decimal(string: "0.00")!
while value <= Decimal(string: "99.99")! {
    defer { value += Decimal(string: "0.01")! }
    let backendResponseText = """
    {"amount": \(value)}
    """
    let backendResponse = backendResponseText.data(using: .utf8)!
    print(backendResponseText)

    let decoded: DTO = try! JSONDecoder().decode(DTO.self, from: backendResponse)

    debugPrint(decoded.amount, value == decoded.amount)
}

Result (Xcode 12.5.1)

{"amount": 0}
0 true
{"amount": 0.01}
0.01 true
{"amount": 0.02}
0.02 true
{"amount": 0.03}
0.03 true
{"amount": 0.04}
0.04 true
{"amount": 0.05}
0.05 true
{"amount": 0.06}
0.06 true
{"amount": 0.07}
0.07000000000000001024 false
{"amount": 0.08}
0.08 true
{"amount": 0.09}
0.09 true
{"amount": 0.1}
0.1 true
...

Result (Xcode 13)

{"amount": 0}
0 true
{"amount": 0.01}
0.01 true
{"amount": 0.02}
0.02 true
{"amount": 0.03}
0.03 true
{"amount": 0.04}
0.04 true
{"amount": 0.05}
0.05 true
{"amount": 0.06}
0.06 true
{"amount": 0.07}
0.07000000000000001024 false
{"amount": 0.08}
0.08 true
{"amount": 0.09}
0.09 true
{"amount": 0.1}
0.1 true
...

There are some differences in the details, but this shows that JSONDecoder has cases in which it cannot decode values to Decimal as expected, in both Xcode 12.5.1 and Xcode 13.0.

You may need to choose a good workaround and test it with all possible values, or avoid JSON numbers entirely and represent decimal values as JSON strings (see the sketch below).
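
A minimal sketch of the string-based approach, assuming the backend can be changed to send {"amount": "90.21"}; the StringDTO name and the field name just mirror the examples above:

import Foundation

struct StringDTO: Decodable {
    let amount: Decimal

    enum CodingKeys: CodingKey {
        case amount
    }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        // The amount arrives as a JSON string, so no binary floating point
        // is involved anywhere in the round trip.
        let text = try container.decode(String.self, forKey: .amount)
        guard let decimal = Decimal(string: text) else {
            throw DecodingError.dataCorruptedError(
                forKey: .amount, in: container,
                debugDescription: "Invalid decimal string: \(text)")
        }
        amount = decimal
    }
}

let data = """
{"amount": "90.21"}
""".data(using: .utf8)!
let dto = try! JSONDecoder().decode(StringDTO.self, from: data)
debugPrint(dto.amount)   // 90.21, exactly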
