Precision issue with 'Double' data type in Swift

I'm working on an application where users can create their own equations using different math functions, such as sqrt, log, pow, exp, etc. The app calculates them with the provided values and displays the result. Users also have the option to select the number of digits they want the result rounded to. I'm using the Double data type and am running into precision issues with some of the equation results. Below are some examples:

Example 1:

let a = 0.2
let b = 0.1
let result = a + b
print(result)

Printed result: 0.30000000000000004
Expected result: 0.3

Example 2:

let a = 9.495
let b = 0.2
let result = a + b
print(result)

Printed result: 9.694999999999999
Expected result: 9.695

Once the equation is calculated, there is a requirement to match the result against a provided text value, which fails in the cases above.

One possible solution is to use the Decimal data type, but math functions like square root and log are not available for Decimal. Please let me know if there is a way to resolve this issue while continuing to use Double. Otherwise, please help with writing custom implementations of the math functions for Decimal, or suggest a third-party library if one exists.
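For what it's worth, here is a minimal sketch of one possible workaround, assuming the only requirement is to round the final result to the user-selected number of digits before comparing: keep computing with Double, so sqrt, log, pow, and exp remain available, and round through Decimal at the very end. The helper name roundedDecimal and the .plain rounding mode are illustrative choices, not part of any standard API:

import Foundation

// Sketch: round a Double to `scale` fraction digits by way of Decimal,
// so 9.694999999999999 compares equal to 9.695.
func roundedDecimal(_ value: Double, scale: Int) -> Decimal {
    var input = Decimal(value)
    var output = Decimal()
    NSDecimalRound(&output, &input, scale, .plain)
    return output
}

let result = 9.495 + 0.2                  // stored as 9.694999999999999
let expected = Decimal(string: "9.695")!  // parsed exactly from the provided text
print(roundedDecimal(result, scale: 3) == expected)  // true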

Thanks in advance.

Answered by MobileTen in 754551022

Here you go:

let a = 9.495
let b = 0.2
let result = a + b
let formatted = String(format: "%0.3f", result)

If localization of the numbers is required, make use of the locale parameter:

let formatted = String(format: "%0.3f", locale: Locale.current, result)
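
Continuing the snippet above, one way to satisfy the text-matching requirement (an assumption about how the comparison is done) is to format the computed value to the user-selected number of digits and compare strings; expectedText and digits below are illustrative names:

let expectedText = "9.695"
let digits = 3
let matches = String(format: "%0.\(digits)f", result) == expectedText
print(matches)  // true: 9.694999999999999 formats as "9.695"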

Sean!
