How do I re-create a Decimal from a string originally produced by Decimal.description (in a locale-independent way)?

// from a Decimal
let d: Decimal = 4444444.5555555
let s = d.description
// now how to take `s` to create the same Decimal back?
let d2 = Decimal(....)  // which init to use?
// I don't think
// init?(string: String, locale: Locale? = nil)
// is correct, as it takes a `Locale`,
// because Decimal.description is locale independent: the decimal point is always "."

The reason I ask is that I want to store a Decimal with @AppStorage by conforming it to RawRepresentable with RawValue == String, and I would like the persisted string to be locale independent going out and coming back, so that even if the user changes their locale, things still work.

I believe Decimal.description is locale independent: for a value like 4444444.5555555, it always produces 4444444.5555555, never 4444444,5555555, even if the user's locale uses "," as the decimal point.

But Decimal(string:locale:) uses the locale to parse the string, so it could treat "," as the decimal point and parse incorrectly? I don't want that.

One way to do this:

let d2 = Decimal(Double(s)!)

will this lose precision?
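For what it's worth, going through Double can indeed drop digits. A minimal self-contained sketch (my own check, not from the thread):

```swift
import Foundation

// Decimal holds up to 38 significant digits; Double only about 15-17.
let long = "0.33333333333333333333333333333333333333"  // 38 digits
let exact = Decimal(string: long)!      // keeps all digits
let viaDouble = Decimal(Double(long)!)  // rounded through Double first
print(exact == viaDouble)               // false: precision was lost
```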

Ok, just force a locale that matches what .description uses:

let d2 = Decimal(string: s, locale: Locale(identifier: "en_US"))  // I think "en_US" does the job?

I assume that Decimal(string: String, locale: Locale? = nil) uses the system locale if locale is nil, similar to other methods such as compare(_:options:range:locale:), so that would be a suitable counterpart for description.

It worked in my test:

print(Locale.current.decimalSeparator!) // ,

let d: Decimal = 4444444.5555555
let s = d.description
print(s) // 4444444.555555500032
let d2 = Decimal(string: s)!
print(d, d2) // 4444444.555555500032 4444444.555555500032
print(d == d2) // true

Although the decimal separator of the current locale is a comma, the description string (using the period as decimal separator) is correctly converted to the original number.

No, it's not. Decimal.description always uses "." as the decimal separator, while Decimal(string:locale:) parses the string input with the locale-specific decimal separator:

import Foundation

let french = Locale(identifier: "fr_FR")

print(Decimal(string: "0,3", locale: french)!)  // 0.3
print(Decimal(string: "0.3", locale: french)!)  // 0

Passing nil to locale: uses the system locale, so the decimal separator can be anything.

This is how I ensure the round trip is correct, independent of the user's system default:

let d: Decimal = 0.3
// now parse the .description to get the same value back regardless of user's locale preference:
let d2 = Decimal(string: d.description, locale: Locale(identifier: "en_US"))   // always use "." for decimal separator
// if you do this:
let d3 = Decimal(string: d.description)
// if the user's locale is "fr_FR", you get 0, not 0.3

I don't know what the answer is, but I only ever see Decimal.description use "." as the decimal separator.

I set the Run scheme to use the French locale and set .environment(\.locale, french); Decimal.description still always uses ".":

import SwiftUI

struct DecimalAndLocale2: View {
    static let french = Locale(identifier: "fr_FR")
    let d: Decimal = 0.3

    var body: some View {
        VStack {
            let _ = print(d.description)    // 0.3
            Text("\(d.description)")        // 0.3
        }
        .environment(\.locale, Self.french)
    }
}

But if what you say is true, that Decimal.description uses the system locale's decimal separator, then I'll have to write my own Decimal description that uses the "en_US" locale, so I know for sure it only uses "." as the decimal separator.

But I see Decimal.description only ever use "." as the decimal separator, never ",", even when "fr_FR" is set everywhere.

From the source code one can see that Decimal.description calls NSDecimalString(&value, nil), and that sets the decimal separator to "." if the locale is nil.

For the other direction: Decimal(string:) creates a Scanner and then calls scanDecimal. And here I get confused: from the source code it looks as if passing nil for the locale makes the scanner use the current locale:

let ds = (locale as? Locale ?? Locale.current).decimalSeparator?.first ?? Character(".")

and with my German locale that would be a comma. On the other hand, Decimal(string: "123.456") treats the period as the decimal separator.

Hmmm...

I'm also seeing the same with the system locale set to French:

Text("Locale: \(Locale.current.identifier)")    // Locale: fr
Text("\(Decimal(string: "123.456")!.description)")  // 123.456  <=== WTF? Not using french locale?
Text("\(Decimal(string: "123.456", locale: Self.french)!.description)") // 123
Full test:
import SwiftUI

struct DecimalAndLocale2: View {
    static let french = Locale(identifier: "fr_FR")
    var d: Decimal = 0.3

    var body: some View {
        VStack {
            Text("Locale: \(Locale.current.identifier)")    // Locale: fr
            let _ = print(d.description)    // 0.3
            Text("\(d.description)")        // 0.3
            Text("\(Decimal(string: "123.456")!.description)")  // 123.456  <=== WTF? Not using french locale?
            Text("\(Decimal(string: "123.456", locale: Self.french)!.description)") // 123
        }
        .environment(\.locale, Self.french)
    }
}

My observations:

  1. Decimal.description always uses "." as the decimal separator.
  2. Decimal(string: something, locale: nil) also uses "." to parse the input, so it can parse the description output back into the same value.
  3. Decimal(string: something, locale: someNonNilLocale) parses with that locale's decimal separator, so for "fr_FR" it uses ",".

So what you say is true, except that nil doesn't seem to use the system locale, but rather some locale that uses "." as the decimal separator.
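These observations can be checked outside SwiftUI with a few lines (a sketch; the expected outputs match the tests above):

```swift
import Foundation

// 1. description always outputs "."
print(Decimal(string: "0,3", locale: Locale(identifier: "fr_FR"))!.description) // 0.3

// 2. a nil locale parses with "."
print(Decimal(string: "0.3")!)   // 0.3
print(Decimal(string: "0,3")!)   // 0   (scanning stops at the comma)

// 3. an explicit locale parses with that locale's separator
print(Decimal(string: "0.3", locale: Locale(identifier: "fr_FR"))!) // 0
```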

This correctly stores/retrieves from @AppStorage even with the system locale changing between runs:

extension Decimal: RawRepresentable {
    public var rawValue: String {
        // this always outputs using "." as the decimal separator
        description
    }

    public init?(rawValue: String) {
        // this correctly parses the `description` output no matter what the system locale preference is
        self.init(string: rawValue)
    }
}
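A quick self-contained round-trip check of this conformance (the extension is repeated here so the snippet compiles on its own):

```swift
import Foundation

extension Decimal: RawRepresentable {
    public var rawValue: String { description }                     // always "."
    public init?(rawValue: String) { self.init(string: rawValue) }  // parses "."
}

// 1/3 exercises the full 38-digit precision
let original = Decimal(string: "1")! / Decimal(string: "3")!
let restored = Decimal(rawValue: original.rawValue)!
print(original == restored)   // true
```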

It's so odd:

From the source of Decimal(string:locale:), with locale == nil, it then calls scanner.scanDecimal(_:), which calls scanner.scanDecimal(); the decimal separator is determined by this:

let ds = (locale as? Locale ?? Locale.current).decimalSeparator?.first ?? Character(".")

which means that when locale == nil, it uses Locale.current. But it doesn't work like that: it's always ".".

I copied that line into my test, and when the default locale is French, the result is "," (comma):

let locale: Locale? = nil
let ds = (locale as? Locale ?? Locale.current).decimalSeparator?.first ?? Character(".")
Text("ds: \"\(String(ds))\"")

Why is this? It doesn't match the observed behavior. Is this a bug?

As for Decimal.description: it calls NSDecimalString(_:_:) with the 2nd parameter nil (see swift-corelibs-foundation/Decimal.swift on GitHub), and this line results in "." being used:

guard locale != nil else { return "." } // Defaulting to decimal point

So Decimal.description definitely always uses ".".

My simple test:
import SwiftUI

struct DecimalLocaleParse: View {
    var body: some View {
        VStack {
            // We set the locale in the Run scheme; this shows it
            Text("The system locale: \(Locale.current.identifier)")
            let locale: Locale? = nil
            let ds = (locale as? Locale ?? Locale.current).decimalSeparator?.first ?? Character(".")
            Text("ds: \"\(String(ds))\"")


            VStack {
                Text("locale: nil << omitted, use the default:").foregroundColor(.red).underline()
                // regardless of what the system default is, the default nil
                // always parse with "." as decimal separator
                Text("0.3 >> \(Decimal(string: "0.3")!.description)") // 0.3 <= no matter what the system locale
                Text("0,3 >> \(Decimal(string: "0,3")!.description)") // 0   <= no matter what the system locale

                Text("locale: Locale.current:").foregroundColor(.red).underline()
                // Explicitly set to system default, then depends on system locale
                Text("0.3 >> \(Decimal(string: "0.3", locale: Locale.current)!.description)")   // en_US: 0.3, fr: 0
                Text("0,3 >> \(Decimal(string: "0,3", locale: Locale.current)!.description)")   // en_US: 0, fr: 0.3

                Text("locale: french").foregroundColor(.red).underline()
                // explicitly use French locale: always use "," as decimal separator
                Text("0.3 >> \(Decimal(string: "0.3", locale: Locale(identifier: "fr_FR"))!.description)")  // 0
                Text("0,3 >> \(Decimal(string: "0,3", locale: Locale(identifier: "fr_FR"))!.description)")  // 0.3
                // ^^^ the above also shows that Decimal.description always outputs "."
                //     as the decimal separator, regardless of the system locale
                // so:
                // `Decimal(string: aDecimal.description)`
                // reliably recreates the same value as `aDecimal`,
                // regardless of the system default, because "." is used in both directions!
            }
        }
    }
}

I'm not able to step into Decimal(string:locale:).

So, I take it that:

Decimal(string: s, locale: Locale(identifier: "en_US"))

works for you, right? What's the problem then?

BTW, to convert to a string, perhaps it's better to use a more explicit (albeit not so Swift-friendly) function rather than the generic description (I didn't try that):

public func NSDecimalString(_ dcm: UnsafePointer<Decimal>, _ locale: Any?) -> String
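For reference, a quick sketch of calling it directly (my own check): a nil locale yields ".", an explicit locale yields that locale's separator:

```swift
import Foundation

var d = Decimal(string: "0.3")!
print(NSDecimalString(&d, nil))                          // 0.3
print(NSDecimalString(&d, Locale(identifier: "fr_FR"))) // 0,3
```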

It works, but the problem is: 1) the lack of documentation, so you don't know what to expect; 2) the behavior doesn't match the source, which puzzles me; 3) with no documentation, you can't tell whether what it actually does is correct or a bug.

That's what Decimal.description calls, with the locale parameter set to nil.

See this post for what's going on :).

Yep, the documentation could be better. In such cases I'd refer to the documentation of the corresponding NSDecimalNumber, to which Swift's Decimal is bridged:

Don’t use this initializer if numericString has a fractional part, since the lack of a locale makes handling the decimal separator ambiguous. The separator is a period in some locales (like in the United States) and a comma in others (such as France).

In swift terms this is about passing nil to the locale parameter of:

init?(string: String, locale: Locale? = nil)

To parse a numeric string with a fractional part, use init(string:locale:) instead. When working with numeric representations with a known format, pass a fixed locale to ensure consistent results independent of the user’s current device settings. For localized parsing that uses the user’s current device settings, pass `current`.

As per this advice, pass en_US explicitly in your "fixed format" case.

Interestingly, the header has this about Decimal.description:

Calling this property directly is discouraged.

I don't know exactly why, though. I always treated description as something whose format can change on a whim (extra spaces or quotes could be added, etc.), but maybe I'm just too paranoid and it is in fact more stable.

Much appreciated, this is some very useful info!

I need to make Decimal work with @AppStorage by conforming it to RawRepresentable with RawValue == String. Using Decimal.description and Decimal(string:locale:) is the only way I have found that makes the persistence round trip produce the same value. Using Decimal.FormatStyle, the value that comes back from the string form is not the same:

let ddd = Decimal(string: "1")! / Decimal(string: "3")!

// using FormatStyle to convert to a string and parse back results in a different value!!!
let formatted = ddd.formatted(.number.locale(Locale(identifier: "en_US")))   // err, how to make unlimited precision?
print(formatted, ddd.description)   // 0.333333 0.33333333333333333333333333333333333333
let comeback1 = try Decimal(formatted, format: .number.locale(Locale(identifier: "en_US")))
print("equal or not: ", ddd == comeback1)  // false

let comeback2 = Decimal(string: ddd.description)!
print("equal or not: ", ddd == comeback2)  // true

I don't think Decimal.FormatStyle is appropriate for persistence; its purpose is localized UI.

Decimal adopts Codable, so shouldn't that work for this?

Edit: using Codable to convert to/from String:

extension Decimal: RawRepresentable {
    public var rawValue: String {
        String(data: try! JSONEncoder().encode(self), encoding: .utf8)!
    }

    public init?(rawValue: String) {
        guard let data = rawValue.data(using: .utf8), let decoded = try? JSONDecoder().decode(Self.self, from: data) else {
            return nil
        }
        self = decoded
    }
}

So many !s all over. But I don't see what can be done about try and optional unwraps failing, other than simply crashing?

Except you will suffer from the going-via-Double issue when using JSONEncoder/JSONDecoder:

let d = Decimal(rawValue: "3.133")!
print(d) // 3.132999999999999488

I read that thread and forgot. Thanks for reminding me.

So Decimal JSON encode/decode can lose precision, and it's slow. Is this something that needs fixing?

Edit: not sure why I get the exact value back:

extension Decimal: RawRepresentable {
    public var rawValue: String {
        String(data: try! JSONEncoder().encode(self), encoding: .utf8)!
    }

    public init?(rawValue: String) {
        guard let data = rawValue.data(using: .utf8), let decoded = try? JSONDecoder().decode(Decimal.self, from: data) else {
            return nil
        }
        self = decoded
    }
}


let dxxx = Decimal(rawValue: "3.133")!
print(dxxx) // 3.133

Accessing the description property of a type directly is discouraged, as noted within the CustomStringConvertible protocol's documentation. Instead, use the String(describing:) initializer, which works with any type, even types that don't conform to CustomStringConvertible.
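For example, a sketch (for Decimal the result is the same string as description):

```swift
import Foundation

let d = Decimal(string: "0.3")!
print(String(describing: d))   // 0.3
```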
