Why does Decimal(string:) allow alphabetic characters?

See the example code below. I would expect a and b to be nil, but it seems the rule is that the initializer stops parsing when it finds the first invalid character and keeps whatever it has parsed up to that point.

let a = Decimal(string: "1234abc")
let b = Decimal(string: "1234abc56")
let c = Decimal(string: "abc1234abc")

print("\(a)") // Output: Optional(1234)
print("\(b)") // Output: Optional(1234)
print("\(c)") // Output: nil

I searched for this behavior but didn't find anything. My intuition (as an average user) is that a and b should be nil. I currently work around the issue by checking that the string contains only decimal digits before I call Decimal(string:), as sketched below. But what is the rationale behind this behavior?
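For reference, here is a minimal sketch of that pre-check (the function name strictDecimal and the allowance for a "." separator are my own choices, not anything from Foundation):

func strictDecimal(from input: String) -> Decimal? {
    // Allow only digit characters and ".", so trailing garbage like
    // "1234abc" returns nil instead of parsing as 1234.
    guard !input.isEmpty,
          input.allSatisfy({ $0.isNumber || $0 == "." }) else {
        return nil
    }
    return Decimal(string: input)
}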

Background: I find the currency formatting support in SwiftUI's TextField quite buggy, so I'm trying to implement a simple one myself. I use Decimal(string:) to validate and convert the user's input from the TextField, and that's how I ran into this issue.

This is a known issue — SR-5326. This initializer is pretty lenient, and doesn't fail when hitting invalid characters. There may be room to change it, but backwards-compatibility with existing cases that "work" (although maybe they shouldn't) makes things tricky.


My practical approach to issues like these: I'd put this workaround in my own library of things:

import Foundation

extension Decimal {
    init?(_ string: String, _ locale: Locale? = nil) {
        // Only accept digits, a sign, and the locale's decimal separator,
        // so trailing garbage like "1234abc" fails instead of parsing as 1234.
        let separator = locale?.decimalSeparator ?? "."
        let allowed = CharacterSet(charactersIn: "0123456789+-" + separator)
        guard !string.isEmpty,
              string.unicodeScalars.allSatisfy(allowed.contains) else {
            return nil
        }
        self.init(string: string, locale: locale)
    }
}
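Roughly how that stricter init would handle the strings from the question (a sketch of the expected results, assuming the digit/sign/separator check above):

Decimal("1234abc")     // nil
Decimal("1234abc56")   // nil
Decimal("abc1234abc")  // nil
Decimal("1234.56")     // Optional(1234.56)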

After that I'd treat this init as if it were part of Foundation and effectively stop worrying about it (perhaps until a few years in the future, when I'd revisit it and possibly throw it away if it has been fixed in Foundation/Swift/the standard library by then).
