1 Reply
      Latest reply: Jan 13, 2017 2:09 AM by eskimo
      eliog Level 1 (0 points)

        I'm having an issue when I convert a double to a decimal:

        import Foundation
        var double: Double = 0.23
        var decimal = Decimal(double)
        print(double)
        print(decimal)
        Line 4 will print out 0.23, which is the value I've set.

        However, line 5 prints out "0.2300000000000000512".

        Why is this happening?

        It works correctly if I do this:

        import Foundation
        var double: Double = 0.23
        let decimal = Decimal(string: double.description)!
        print(decimal)
        • Re: Swift 3, Double -> Decimal rounding error?
          eskimo Apple Staff (6,995 points)

          Binary floating point types, like Double, are unable to exactly represent many decimal numbers.  Decimal exists precisely because it can represent such numbers exactly.  What you’re seeing here is the inexactness of Double being faithfully propagated to Decimal.  It doesn’t show up when you go via a string because the double-to-string code rounds the value.
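          To make the propagation concrete, here’s a small sketch (the exact digits printed can vary, so treat the comments as illustrative):

```swift
import Foundation

let double: Double = 0.23

// Double stores the nearest binary fraction to 0.23, not 0.23 itself;
// printing with extra digits exposes that.
print(String(format: "%.20f", double))

// Decimal(double) converts that already-inexact binary value, so the
// inexactness survives the conversion.
print(Decimal(double))

// double.description rounds to a short decimal string first, so the
// resulting Decimal is exactly 0.23.
print(Decimal(string: double.description)!)
```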

          How you should handle this issue depends on the context, so it’s hard to offer concrete suggestions without more info.
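          For example, one option (assuming your use case is that you know how many fractional digits are meaningful) is to round the converted value to a fixed scale with NSDecimalRound; the helper below is hypothetical, not part of Foundation:

```swift
import Foundation

// Hypothetical helper: converts a Double to Decimal, then rounds the
// result to `scale` fractional digits with NSDecimalRound.
func decimal(from value: Double, scale: Int) -> Decimal {
    var input = Decimal(value)
    var output = Decimal()
    NSDecimalRound(&output, &input, scale, .plain)
    return output
}

print(decimal(from: 0.23, scale: 2))  // prints 0.23
```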

          Share and Enjoy

          Quinn “The Eskimo!”
          Apple Developer Relations, Developer Technical Support, Core OS/Hardware
          let myEmail = "eskimo" + "1" + "@apple.com"