How to Store 1.66 in NSDecimalNumber

How to store 1.66 in NSDecimalNumber

In

let number:NSDecimalNumber = 1.66

the right-hand side is a floating point number which cannot represent
the value "1.66" exactly. One option is to create the decimal number
from a string:

let number = NSDecimalNumber(string: "1.66")
print(number) // 1.66

Another option is to use arithmetic:

let number = NSDecimalNumber(value: 166).dividing(by: 100)
print(number) // 1.66

With Swift 3 you may consider using the "overlay value type" Decimal instead, e.g.

let num = Decimal(166)/Decimal(100)
print(num) // 1.66

Yet another option:

let num = Decimal(sign: .plus, exponent: -2, significand: 166)
print(num) // 1.66
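
For comparison, a minimal sketch of the Double-backed route (the exact digits printed may vary, but the value is not exactly 1.66):

import Foundation

let fromDouble: NSDecimalNumber = 1.66  // the right-hand side is a Double literal
print(fromDouble)                       // close to, but not exactly, 1.66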

Addendum:

Related discussions in the Swift forum:

  • Exact NSDecimalNumber via literal
  • ExpressibleByFractionLiteral

Related bug reports:

  • SR-3317
    Literal protocol for decimal literals should support precise decimal accuracy, closed as a duplicate of
  • SR-920
    Re-design builtin compiler protocols for literal convertible types.

NSDecimalNumber usage for precision with currency in Swift

The problem is that you're effectively doing floating point math (with the problems a Double has in faithfully capturing fractional decimal values) and then creating a Decimal (or NSDecimalNumber) from a Double value that has already introduced this discrepancy. Instead, you want to create your Decimal values before doing your division (or before having a fractional Double value at all, even as a literal).

So, the following is equivalent to your example, in that it builds a Double representation of 0.07 (with the limitations that entails), and you end up with a value that is not exactly 0.07:

let value = Decimal(7.0 / 100.0)                                // or NSDecimalNumber(value: 7.0 / 100.0)

Whereas this does not suffer this problem because we are dividing a decimal 7 by a decimal 100:

let value = Decimal(7) / Decimal(100)                           // or NSDecimalNumber(value: 7).dividing(by: 100) 

Other ways to create the value 0.07 while avoiding Double in the process include using strings:

let value = Decimal(string: "0.07")                             // or NSDecimalNumber(string: "0.07")

Or specifying the mantissa/significand and exponent:

let value = Decimal(sign: .plus, exponent: -2, significand: 7)  // or NSDecimalNumber(mantissa: 7, exponent: -2, isNegative: false)

Bottom line, avoid Double representations entirely when using Decimal (or NSDecimalNumber), and you won't suffer the problem you described.
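
For instance, a minimal sketch of a currency calculation (the price and tax rate are hypothetical values) that stays in Decimal from start to finish:

import Foundation

let price = Decimal(string: "19.99")!   // exact, no Double involved
let taxRate = Decimal(7) / Decimal(100) // exactly 0.07
let tax = price * taxRate               // exactly 1.3993
print(price + tax)                      // 21.3893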

Understanding NSDecimalRound Behaviour

The problem comes from the literal, which is a Double and has to be converted to a Decimal, so the error comes from the conversion:

var unrounded: Decimal = 2341.2143
//  ^-- Decimal var      ^-- Double literal

As you know, floating point encoding is error-prone. You can check that:

  • 2341.2143 is encoded as 2341.21430000000009385985322296619415283203125, which rounded down makes 2341.2143
  • 2341.2142 is encoded as 2341.21419999999989158823154866695404052734375, which rounded down makes 2341.2141
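
For comparison, a minimal sketch of the problematic path (the printed result assumes the literal is converted through Double as described above):

import Foundation

var fromLiteral: Decimal = 2341.2142    // the literal goes through Double first
var roundedDown = Decimal()
NSDecimalRound(&roundedDown, &fromLiteral, 4, .down)
print(roundedDown)                      // 2341.2141 rather than 2341.2142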

You need to initialize the Decimal without going through a Double to benefit from its superior precision. Unfortunately, this is not intuitive. You can, for example, use the following, which behaves as expected:

var unrounded3: Decimal = Decimal(sign:.plus, exponent:-4, significand:23412142)
var rounded3 = Decimal()
NSDecimalRound(&rounded3, &unrounded3, 4, .down)
print(rounded3) // 2341.2142

How to round an NSDecimalNumber in Swift?

You can do it like this:

let x = NSDecimalNumber(value: 5)
let y = NSDecimalNumber(value: 2)
let behavior = NSDecimalNumberHandler(roundingMode: .up, scale: 0, raiseOnExactness: false, raiseOnOverflow: false, raiseOnUnderflow: false, raiseOnDivideByZero: false)
let total = x.dividing(by: y).rounding(accordingToBehavior: behavior)
print(total) // 3 (5 / 2 = 2.5, rounded up with scale 0)
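
The same handler approach works for currency-style rounding; a sketch assuming a scale of 2 and banker's rounding (pick whichever mode your use case needs):

import Foundation

let price = NSDecimalNumber(string: "2.675")
let currencyHandler = NSDecimalNumberHandler(roundingMode: .bankers, scale: 2, raiseOnExactness: false, raiseOnOverflow: false, raiseOnUnderflow: false, raiseOnDivideByZero: false)
print(price.rounding(accordingToBehavior: currencyHandler)) // 2.68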

How can I initialize Decimal without losing precision in Swift

The problem is that floating point literals are represented as Double (even when the target type is Decimal, the literal value passes through Double), which results in a loss of precision. Unfortunately Swift can't initialise a Decimal from a floating point literal exactly.

If you want to keep precision, you need to initialise Decimal from a String literal rather than a floating point literal.

let decimalA = Decimal(string: "3.24")!
let double = 3.24
let decimalC: Decimal = 3.0 + 0.2 + 0.04
print(decimalA) // Prints 3.24
print(double) // Prints 3.24
print(decimalC) // Prints 3.24

Bear in mind this issue only happens with floating point literals, so if your floating point numbers are generated or parsed at runtime (such as reading from a file or parsing JSON), you shouldn't face the precision loss issue.
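
For example, a sketch (with a hypothetical JSON payload) that keeps the amount as a string all the way to the Decimal(string:) conversion, so no Double is ever involved:

import Foundation

let json = "{\"price\": \"3.24\"}".data(using: .utf8)!
if let object = (try? JSONSerialization.jsonObject(with: json)) as? [String: String],
   let text = object["price"],
   let price = Decimal(string: text) {
    print(price) // 3.24
}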


