Give Int an initializer that takes a single Character

Int has a failable init that takes a String and attempts to parse an integer value out of it. However, if you have a Character, you have to build a String out of it before you can parse out the potential Int it represents.

Current:

let string = "1 is the loneliest number"
let loneliestNumber = Int(String(string.first!))

I propose we add a convenience init to Int that takes a Character and a defaulted radix.

init?(asciiDigit character: Character, radix: Int = 10)

Proposed:

let loneliestNumber = Int(asciiDigit: string.first!)

You can also parse digit characters in other radices:

let string = "ABBA is my favorite band!"
let oneOfUs = Int(asciiDigit: string.first!, radix: 16)
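A minimal sketch of how such an initializer might be implemented (hypothetical, not part of the standard library; the isASCII guard is an assumption suggested by the asciiDigit label):

```swift
// Hypothetical implementation of the pitched initializer: forward to
// the existing StringProtocol-based parser after wrapping the Character.
extension Int {
    init?(asciiDigit character: Character, radix: Int = 10) {
        // Restrict to ASCII so non-ASCII numerals such as "二" fail.
        guard character.isASCII else { return nil }
        self.init(String(character), radix: radix)
    }
}

print(Int(asciiDigit: "1") == 1)             // true
print(Int(asciiDigit: "A", radix: 16) == 10) // true
print(Int(asciiDigit: "x") == nil)           // true
```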

Alternatives considered:
Make Character conform to StringProtocol. You cannot (currently) make a type a collection whose element is Self.

Aside:
There may be other odd gaps in the API where we currently only take Strings but it would also make sense to add an overload for Character, and it would be good to lump those in with this once we get to the proposal stage. If there are any you are aware of, please make it known and we can consider adding them along with this!


I may be misunderstanding this pitch, but if I read this right:

let one: Character = "1"
let ten: Character = "a"
Int(one) == Int?.some(1)
Int(ten, radix: 16) == Int?.some(10)

If this is the case, I’m not sure this is something that comes up very often, in my experience. Plus, this initializer may add confusion for those coming from languages where char is an integral type: some may be misled into believing it gives the ASCII value of the Character instead of the value of the digit it represents. Because of this, I think forcing a conversion to String is a great way to document what you’re doing.


Your interpretation is correct. This would do the exact same thing as the init that takes a StringProtocol.

I’ll admit, I don’t hit this a lot. But it is one of those things where it seems silly to have to make a heap allocation for a String just to parse the Int out of a Character you already have.

I think the potential for confusion is real. However, if I were trying to get the ASCII value of a character, my natural inclination would be to look for some asciiValue property on the character rather than passing it to an Int init. This also behaves identically to API we already have, so I feel that helps reduce the confusion.
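(Incidentally, Character did later gain exactly such a property, asciiValue, in Swift 5; it returns the code point, which is distinct from the digit’s numeric value:)

```swift
let ch: Character = "1"
// asciiValue yields the code point (49 for '1'), not the digit value 1.
print(ch.asciiValue == 49) // true
```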

I don’t suggest this. I was considering, the other day, that we don’t have a great story for converting non-ASCII/Roman numerals. I think that Int(one) == Int?.some(1) would worsen that issue. Int(ten, radix: 16) == Int?.some(10) is not an issue because of the label.

To be clear

let ni: Character = "二"
Int(ni) == Int?.some(2)

and the like are what I am talking about.

Once we get short-string optimizations as discussed in this thread (similar to NSString tagged pointers), there won’t be any heap allocation for this scenario. Hopefully that will land by Swift 5.


We have some discussion going on over in this thread about adding various properties (like numeric value) to Unicode.Scalar (and eventually, likely to Character as well).

(I really need to finish the actual proposal…)

You may be interested in reading over that discussion and commenting if those plans don’t satisfy your needs.

In this scenario, where you extract the Character in question from a given string, you could work with a Slice (a.k.a. String.SubSequence) to avoid the intermediate allocation of a new string, e.g.:

let string = "1 is the loneliest number"
let loneliestNumber = Int(string[string.startIndex...string.startIndex])
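A slightly shorter spelling of the same workaround, since prefix(1) also returns a Substring that shares the original string’s storage:

```swift
let string = "1 is the loneliest number"
// prefix(1) produces a Substring, so no new String is allocated.
let loneliestNumber = Int(string.prefix(1))
print(loneliestNumber == 1) // true
```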

We have some discussion going on over in this thread about adding various properties (like numeric value) to Unicode.Scalar (and eventually, likely to Character as well).

I think the numeric properties thread is serving a different need. An ultra-optimized Int.init(_: Character) (or rather, on the relevant protocol) that was ASCII-only and followed the same rules as Int.init(_: String) would be useful independent of that (also very useful) feature.

In this scenario where you extract the Character in question from a given string you could work with a Slice

Yes, but often you’re handed a Character rather than a String to extract one from. And even if you’ve got the string and indices too, re-slicing can be an expensive activity. I think there’s a good performance/readability justification for adding an init from Character too.

The existing failable inits with unlabeled parameters are a source of exponential type checking time.

I would not want to see more added without first addressing that issue.

I think we need to transition off of the existing failable inits in favor of a labeled form, and then work toward requiring the label as part of the function name when referring to the function without directly invoking it (e.g. something like ["1", "2"].map(Int.init) would no longer type check, since it doesn’t explicitly name the overload to use).
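As an illustration of the workaround available today, an explicit closure names the chosen overload directly rather than making the type checker resolve a point-free Int.init reference:

```swift
let strings = ["1", "2", "3"]
// The closure pins down the String-taking overload explicitly,
// instead of asking the compiler to pick among all of Int's inits.
let numbers = strings.compactMap { Int($0) }
print(numbers) // [1, 2, 3]
```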


So Int.init?(asciiDigit: Character, radix: Int = 10) would probably work, and also help clarify the behavior.

edit: sorry, been leaving my ? off all over the place…

SSO would remove the allocation concern, but I still feel that constructing a String just to parse the digit out of a Character is overkill.

This might be obvious to someone who’s worked with Swift, because they’d know how the standard library is organized, but to someone coming to Swift from, say, C or Java, I think the thought process may look more like: “Hmm, Int(2.5) looks and acts like a cast, so I’m sure Int(string.first!) is similar. If I cast a Character, I should get an ASCII value, right?”

The user has to cross the “initializers aren’t casts” bridge at some point so it might as well be here. They’ll try the wrong thing, see the result then know what it does from then on.

That said, adding an argument label as Ben suggested would help clear this up for new users, and helps with the typechecker speed problem. I’ve revised the pitch accordingly.


Is this going anywhere?

Because I ran into an educational example where the input numeric values are given as a string, and this does not work:

let numbers: [Int] = digits.map{$0}.map {
     return Int($0)!
}

So it has to be this?

let numbers: [Int] = digits.map{$0}.map {
     return Int(String($0))!
}

Which is a horrifying solution because of the allocation overhead. Is this proposal dead?

Character now has a wholeNumberValue property that you can use:

let chars: [Character] = ["4", "④", "万", "a"]
for ch in chars {
    // Unwrap before printing; otherwise the output shows Optional(4).
    print(ch, "-->", ch.wholeNumberValue.map { String($0) } ?? "nil")
}
// 4 --> 4
// ④ --> 4
// 万 --> 10000
// a --> nil

OK, I’ll live with it. Thanks.

Oh, and by the way, you don’t have to use map twice; this will work:

let chars: [Character] = ["4", "④", "万", "a"]
let numbers = chars.compactMap(\.wholeNumberValue)
print(numbers) // [4, 4, 10000]
let numbersAndNils = chars.map(\.wholeNumberValue)
print(numbersAndNils) // [Optional(4), Optional(4), Optional(10000), nil]

+1 to this idea. I have encountered this numerous times, most notably when trying to convert a string to an array of digits:

let reallyLongNumber = "12940214297391293"
let digits = reallyLongNumber.map { Int("\($0)")! }

While this can also be done with the .wholeNumberValue property on Character, I think an Int initializer is probably the first thing people instinctively reach for, considering that Int has one that does the same for a String.
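For comparison, the same conversion via the property that eventually shipped, with no per-character String construction:

```swift
let reallyLongNumber = "12940214297391293"
// wholeNumberValue reads the digit directly from each Character;
// force-unwrap is safe here because every character is a digit.
let digits = reallyLongNumber.map { $0.wholeNumberValue! }
print(digits) // [1, 2, 9, 4, 0, 2, 1, 4, 2, 9, 7, 3, 9, 1, 2, 9, 3]
```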

Just to be clear, is this what people are asking for?

extension FixedWidthInteger {
  init?(_ c: Character, radix: Int = 10) {
    self.init(String(c), radix: radix)
  }
}
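If so, a quick self-contained check of that sketch (the extension repeated so the snippet runs on its own; behavior mirrors the existing String-taking init):

```swift
extension FixedWidthInteger {
    // Forwards to the existing StringProtocol parser; this still builds
    // a one-character String, so it shows behavior, not the fast path.
    init?(_ c: Character, radix: Int = 10) {
        self.init(String(c), radix: radix)
    }
}

print(Int("7" as Character) == 7)               // true
print(UInt8("f" as Character, radix: 16) == 15) // true
print(Int("二" as Character) == nil)            // true
```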