> Yes. Because StringLiteralConvertible inherits from
> ExtendedGraphemeClusterLiteralConvertible, and
> ExtendedGraphemeClusterLiteralConvertible inherits from
> UnicodeScalarLiteralConvertible.
> Yes, I know. But I wonder why that is, considering that something that is
> StringLiteralConvertible will seemingly never use the required initializer
> from UnicodeScalarLiteralConvertible.
I don't know either. Maybe it is because when you convert a String to
Unicode you need the other two init()s. As your examples show, there may be
one or two Unicode scalars making up one Character.
> let a : Int = "\u{65}" // e
> let b : Int = "\u{65}\u{0301}" // é
> let c : Int = "hello"
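For instance, your second example is a single Character made of two Unicode
scalars -- a quick illustration (Swift 2 syntax, the names are just for the
example):
let eAcute = "\u{65}\u{0301}"       // é
print(eAcute.characters.count)      // 1, one Character
print(eAcute.unicodeScalars.count)  // 2, two Unicode scalars: U+0065 and U+0301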
> I wanted to know if there was a way to write something like (for example)
> uscalar”\u{65}”, that would only be a unicode scalar literal, and not a
> string literal. Or if there was any other way to tell the compiler “please
> use the initializer from UnicodeScalarLiteralConvertible instead of the
> one from StringLiteralConvertible”.
> I guess the more fundamental question I wanted to ask was “why must
> StringLiteralConvertible conform to UnicodeScalarLiteralConvertible?”
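As far as I know there is no separate literal syntax for a Unicode scalar,
but the literal-convertible inits are ordinary initializers, so you can call
the one you want by its argument label. A sketch, assuming your Int
extensions are in scope (not carefully tested, the names are placeholders):
let viaScalar = Int(unicodeScalarLiteral: "\u{65}")                    // 1, forces the UnicodeScalarLiteralConvertible init
let viaCluster = Int(extendedGraphemeClusterLiteral: "\u{65}\u{0301}") // 2, forces the ExtendedGraphemeClusterLiteralConvertible init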
For "\u{65}" , in your words, you call it a Unicode Scalar Literal. But in
formal, it is a String Literal that include special characters.
String literals can include the following special characters:
- The escaped special characters \0 (null character), \\ (backslash),
  \t (horizontal tab), \n (line feed), \r (carriage return), \" (double
  quote) and \' (single quote)
- An arbitrary Unicode scalar, written as \u{n}, where n is a 1–8 digit
  hexadecimal number with a value equal to a valid Unicode code point
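For example, both kinds can appear together in an ordinary string literal
(just an illustration):
let example = "She said \"\u{65}\u{0301}\"\n"  // escaped double quotes, a scalar pair, and a line feed
print(example)                                 // She said "é"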
If you want a UnicodeScalar, you should use the struct UnicodeScalar (see
the UnicodeScalar reference in the Apple Developer Documentation). Here are
some examples:
let uScalar = "a".unicodeScalars.first! // 97
print(uScalar.dynamicType) // UnicodeScalar, not an Int
import Foundation // String(format:) needs Foundation

extension UInt32 {
    func hex() -> String {
        return String(format: "0x%2X", self)
    }
}
var aUnicodeScalar = UnicodeScalar(97)
print(aUnicodeScalar.value) //97
aUnicodeScalar = UnicodeScalar(0x61)
print(aUnicodeScalar.value) //97
print(aUnicodeScalar.value.hex()) //0x61
aUnicodeScalar = UnicodeScalar("a")
print(aUnicodeScalar.value) //97
aUnicodeScalar = UnicodeScalar("e")
print(aUnicodeScalar.value.hex()) // 0x65
aUnicodeScalar = UnicodeScalar("é")
print(aUnicodeScalar.value.hex()) // 0xE9
aUnicodeScalar = UnicodeScalar("\u{65}")
print(aUnicodeScalar.value.hex()) // 0x65
// aUnicodeScalar = UnicodeScalar("\u{65}\u{0301}")
// The line above doesn't compile: that literal contains two Unicode scalars,
// not one, and struct UnicodeScalar only conforms to UnicodeScalarLiteralConvertible.
extension UnicodeScalar: ExtendedGraphemeClusterLiteralConvertible {
    public init(extendedGraphemeClusterLiteral value: String) {
        self = String(value.characters.first!).unicodeScalars.first!
    }
}
aUnicodeScalar = UnicodeScalar(extendedGraphemeClusterLiteral: "\u{65}\u{0301}")
print(aUnicodeScalar.value.hex()) // 0x65, only the first scalar of the cluster is kept
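A quick sanity check (just to illustrate, not carefully tested) that what we
get back really is the plain "e" scalar:
print(aUnicodeScalar.value == UnicodeScalar("e").value) // true, the U+0301 combining accent is dropped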
zhaoxin
···
On Mon, Jan 11, 2016 at 10:08 PM, Loïc Lecrenier <loiclecrenier@icloud.com> wrote:
> Yes. Because StringLiteralConvertible inherits from
> ExtendedGraphemeClusterLiteralConvertible, and
> ExtendedGraphemeClusterLiteralConvertible inherits from
> UnicodeScalarLiteralConvertible.
Yes, I know. But I wonder why that is, considering that something that is
StringLiteralConvertible will seemingly never use the required initializer
from UnicodeScalarLiteralConvertible.
> Is there a way to write something that is a unicode scalar literal, but
> not a string literal?
> Yes. You have already done it by adopting only
> UnicodeScalarLiteralConvertible in your extension. A string literal is
> what people read; Unicode is the encoding the computer uses to store and
> read the string.
>
> for example:
>
> let uScalar = "a".unicodeScalars.first! // 97
> print(uScalar.dynamicType) // UnicodeScalar. NOT an Int
Sorry, I am not sure what this answers :(
I wanted to know if there was a way to write something like (for example)
uscalar”\u{65}”, that would only be a unicode scalar literal, and not a
string literal. Or if there was any other way to tell the compiler “please
use the initializer from UnicodeScalarLiteralConvertible instead of the one
from StringLiteralConvertible”.
I guess the more fundamental question I wanted to ask was “why must
StringLiteralConvertible conform to UnicodeScalarLiteralConvertible?”
Thanks,
Loïc
>
> On Mon, Jan 11, 2016 at 4:54 AM, Loïc Lecrenier <swift-users@swift.org> wrote:
> Hi :)
>
> I have been trying to understand the StringLiteralConvertible protocol,
> but there is something that I still can’t explain.
>
> //-----------------------------
>
> extension Int : UnicodeScalarLiteralConvertible {
> public init(unicodeScalarLiteral value: UnicodeScalar) {
> self = 1
> }
> }
>
> extension Int : ExtendedGraphemeClusterLiteralConvertible {
> public init(extendedGraphemeClusterLiteral value: Character) {
> self = 2
> }
> }
>
> extension Int : StringLiteralConvertible {
> public init(stringLiteral value: String) {
> self = 3
> }
> }
>
> let a : Int = "\u{65}" // e
> let b : Int = "\u{65}\u{0301}" // é
> let c : Int = "hello"
>
> //-----------------------------
>
> If I only write the first extension: I can only initialize a, and its
> value will be 1.
> If I write the first two extensions: I can initialize a and b, and their
> values will be 2.
> And if I keep the three extensions: a, b, and c will all have a value of
> 3.
>
> So it seems like the compiler prefers calling the initializer from (in
> order of preference):
> 1. StringLiteralConvertible
> 2. ExtendedGraphemeClusterLiteralConvertible
> 3. UnicodeScalarLiteralConvertible
>
> But for something to be StringLiteralConvertible, it needs to be
> ExtendedGraphemeClusterLiteralConvertible and
> UnicodeScalarLiteralConvertible, which means I have to define two
> initializers that will never be called.
>
> Is that correct?
> Is there a way to write something that is a unicode scalar literal, but
> not a string literal?
>
> Thank you,
>
> Loïc
>
> _______________________________________________
> swift-users mailing list
> swift-users@swift.org
> https://lists.swift.org/mailman/listinfo/swift-users
>
> --
>
> Owen Zhao
--
Owen Zhao