Character vs. String literals

What are you looking to solve for here? Have you encountered specific bugs because of this?

You can usually disambiguate by inspecting where the literal actually resolves (an IDE like Xcode makes this a lot easier right now).

    let str = "a" // String takes precedence. Option-click `str` in Xcode to see that it's a String
    let chr: Character = "a" // It's a character :)

    funcTakingCharacter("a") // Option-click `funcTakingCharacter` to see that it's takes a Character
    funcTakingString("a") // Same

Anything conforming to StringLiteralConvertible can be disambiguated the same way. Where there _is_ genuine ambiguity, an explicit `as T` coercion is required.
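To make that concrete, here is a minimal sketch of what the coercion looks like (the variable names are just illustrative):

    let s = "a"                        // no context, so the literal defaults to String
    let c = "a" as Character           // explicit coercion, equivalent to `let c: Character = "a"`
    let cs = ["a", "b"] as [Character] // without the coercion, this array literal would be [String]

The same spelling works for any other StringLiteralConvertible type.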

Stephen

···

On Dec 20, 2015, at 4:44 PM, Andrew Duncan via swift-evolution <swift-evolution@swift.org> wrote:

We have all watched the evolution of the String class, and the amazing jockeying required to accommodate Unicode combining characters and the like. At this point, the distinction between Characters and Strings and Arrays of Characters (myString.characters) is becoming clearer.

But in the source code, there is ambiguity. What is "a", a character literal or a string of length 1? I suggest reinstating the C convention of using single-quotes for delimiting Characters.