When checking integer literals, it seems that MyType(literal) is evaluated at compile time, since the code doesn't compile when the literal is too large to fit in the integer type.
My question is whether there is a difference between the expressions
let x: MyType = literal // first expression
let x = MyType(literal) // second expression
let x = literal as MyType // third expression
Are these expressions equivalent (checked at compile time, treating the right-hand side as a constant) and merely a matter of style, or did they differ in the past? If there are subtle differences, could someone explain them or point to the relevant place in the docs?
Thanks.
As of Swift 5.0, due to SE-0213, MyType(<#literal#>) and <#literal#> as MyType are equivalent when <#literal#> is some literal and MyType conforms to the ExpressibleBy* protocol matching that literal.
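For example (using Int8 as a stand-in for MyType), all three spellings from the question type-check the literal at compile time and reject values that overflow:

```swift
// All three forms coerce the literal at compile time (Swift 5.0+):
let a: Int8 = 127        // contextual typing of the literal
let b = Int8(127)        // since SE-0213, same as `127 as Int8`
let c = 127 as Int8      // explicit literal coercion

// Each of these fails to compile, because 128 overflows Int8:
// let x: Int8 = 128
// let y = Int8(128)
// let z = 128 as Int8
```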
The complete rule:
Given a function call expression of the form A(B) (that is, an expr-call with a single, unlabelled argument) where B is a literal expression, if A has type T.Type for some type T and there is a declared conformance of T to an appropriate literal protocol for B, then T is directly constructed using the initializer witness to the literal protocol (as if the expression were written "B as A").
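A concrete illustration of what changed (this is, if I recall correctly, close to the motivating example in the SE-0213 proposal): before Swift 5.0, the argument of an initializer call like UInt64(...) was first type-checked on its own as the default integer literal type Int, so an overflowing literal was rejected even though it fits the target type. The coercion form never had that problem.

```swift
// Pre-Swift 5.0 this was a compile-time error, because the literal
// was first given the default type `Int` and overflows Int64:
let x = UInt64(0xffff_ffff_ffff_ffff)   // OK in Swift 5.0+ (SE-0213)

// The coercion form worked even before Swift 5.0, because the literal
// is typed as UInt64 directly:
let y = 0xffff_ffff_ffff_ffff as UInt64
```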