TL;DR
Protocols express the requirements to be something. Types express the notion that a thing is something. Constraints (constexpr or otherwise) express the requirements to be used for something. Depending on what you want in any specific situation, you choose the appropriate one of those three mechanisms. Examples like the aforementioned "Height" one do indeed tend to want [constexpr] constraints.
Reasons
It's interesting (and complicating) that what this is trying to express is not just typy nor just protocolly. In some sense you're just trying to say "I want an `Int` that meets these requirements" - which sounds protocolly, but then when you think about it, does it make sense to have a `Height` protocol in the traditional sense, i.e. where technically it can be any type, not just an `Int`? In a pure Protocol world that could make sense, since `Int` would merely be a protocol that expresses the idea of a thing which has the set of properties & capabilities that distinguish an integer. But that's nowhere near what Swift is today, and seems obviously not a place Swift will ever go.
If you nonetheless try pragmatically to just add the ability to require specific typeness to protocols, e.g.:
```swift
protocol Height: Int where self > 0, self < 10 {}
```
Or:
```swift
protocol Height where Self is Int, self > 0, self < 10 {}
```
There's all sorts of awkward things about that. It blurs the lines between types & protocols, for one. You haven't actually conformed `Int` to the protocol yet, either - you'd have to add the extension to `Int` as well, which is a lot of redundancy & boilerplate. And in principle, since it's a protocol, you can put all the normal protocol stuff between the curly braces, which… well, probably that's actually useful in some cases, but it doesn't negate the prior concerns and might amount to feature creep.
So that suggests this should be a bit more typy, e.g.:
```swift
subtype Height is Int where self > 0, self < 10
```
That seems much cleaner at the outset (even though it still blurs typiness & protocolness a bit).
A new keyword & associated concept would be required, as opposed to repurposing `typealias` or similar, because of the need to support any type, not just classes, as well as the fact that this is not merely an alias (as @CTMacUser pointed out). I think it's also best to be distinct, syntactically, vs subclassing - i.e. not go down the route of allowing:
```swift
struct Foo {
    ...
}

struct SubFoo: Foo where self > 0
```
It's too close to subclassing, and thus people will forever be misguided into wanting support for all the rest of subclassing's capabilities (new properties, new functions, overrides, etc), which don't make sense by definition for a constraining subtype (nor for value types in general, in the presence of the `extension` keyword & mechanism).
Alternatively there is a "wrapper" approach, where you declare e.g. "Height" as a new type which stores an `Int` internally and applies the conformances through constexpr constraints specifically on its `init` method(s), but that's an impractical amount of boilerplate for any non-trivial base type, e.g.:
```swift
struct Height {
    private var actualValue: Int

    init(_ value: Int) where value > 0, value < 10 {
        self.actualValue = value
    }
}

// And now a bajillion extensions to make Height conform to all the protocols
// expected of an Int, assuming there even are protocols to fully cover what
// an Int is (hint: there aren't, today)...
// ...and you still have to do `myHeight.asInt` or somesuch everywhere you need
// to use virtually any existing code, since `Int` is a type, not a protocol.
// This approach is very hostile to the extensions paradigm.
```
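For comparison, here's roughly what the wrapper approach looks like in today's Swift, without constexpr constraints - a minimal sketch where the invariant is checked at runtime via a failable initializer (the `asInt` accessor name is just illustrative):

```swift
// A runtime-checked version of the Height wrapper, expressible today.
// The invariant lives in a failable init instead of a constexpr constraint.
struct Height {
    private let actualValue: Int

    init?(_ value: Int) {
        guard value > 0 && value < 10 else { return nil }
        self.actualValue = value
    }

    // The explicit escape hatch back to Int, needed at every call site
    // that expects an Int.
    var asInt: Int { return actualValue }
}
```

Every construction now yields an `Optional`, and every use of existing `Int` APIs has to go through `asInt` - exactly the boilerplate and friction described above.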
So, given all that, a `subtype` keyword et al seems like the best approach (at least vs protocols or wrappers).
Then, several obvious questions are:
Do `subtype`s manifest as new `Metatype`s?
My intuition is that they should, for several reasons:
- Code that calls `type(of: ...)` and the like, and certainly anything that conveys the type of the thing to a human, would be more intuitive if it reported a `Height` rather than an `Int`. It would also be a valuable distinction in reasoning about code using generics or compound types, since while technically e.g. an array of `Height`s and an array of `Int`s would be equivalent at runtime, semantically they are distinct.
- Being able to runtime-conditionalise behaviour based on compile-time-ensured and stated compliance to a subtype could be useful, just as it can be useful to downcast class instances. e.g.:
```swift
subtype EvenInt is Int where self % 2 == 0

func halve(_ number: Int) -> Int {
    if number is EvenInt {
        // Fast path.
        return number >> 1
    } else {
        // Some other, more complicated code path to handle Ints that
        // might not divide evenly.
    }
}
```
That example is deliberately "contrived" at first glance, to illustrate a subpoint - yes, the "slow path" still has to handle all the cases anyway, and you could get this specific example's behaviour in principle by just writing `if number % 2 == 0` directly (or maybe via generics and specialisation), but having it expressed via typing:

- Makes the code's intent clearer, just as using types & protocols in general does.
- Is a convenient shorthand for what amounts to a constexpr that can be applied rigorously and consistently across a codebase (though I guess we'll see how constexprs ultimately get implemented, w.r.t. whether they end up with an equivalently convenient stamping mechanism anyway).
- Is more flexible, or at least more concise, than generic specialisation in many cases (e.g. where it's a two-hundred-line method of which only one line changes depending on the true type of the argument(s)).
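In today's Swift, the closest equivalent is an ordinary named predicate, which at least gets the "applied rigorously and consistently across a codebase" benefit, if not the typing one (a sketch; `isEven` is just an illustrative name):

```swift
// A reusable predicate standing in for the hypothetical EvenInt subtype;
// the check happens at runtime rather than in the type system.
func isEven(_ n: Int) -> Bool {
    return n % 2 == 0
}

func halve(_ number: Int) -> Int {
    if isEven(number) {
        // Fast path: exact halving via a shift.
        return number >> 1
    } else {
        // Slower path: round down explicitly for odd values.
        return (number - 1) >> 1
    }
}
```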
However, this does again highlight how from a lot of angles this feels very protocolly.
- You can then more sensibly require explicit upcasting (i.e. requiring `someHeight as Int` whenever you pass the value to a parameter expecting an `Int`), because the true type would be preserved under the covers, allowing for downcasting back to the leaf type at any later stage (if needed).
Iff, of course, it's actually possible to track all that at runtime (my presumption is that value types don't have any kind of "isa" attribute under the covers, and class types do, which is presumably how Swift actually does downcasting today…?). And it's not a given that such functionality is required - though IMO it's very nice, in a type system, for it to dissuade "escapes" (the careless or unintentional loss of typing information, especially in a way that propagates that loss forwards, as implicit conversions do).
It's not a logical requirement that implicit upcasting be disallowed, but it would be consistent with the premise that these subtypes are semantically distinct enough to warrant dedicated syntax. It also fits well enough with (IMO) Swift's intent that functionality be explicitly attached to intended types, not "floating" - since an extension to `Int` would still apply to `Height`.
That also leads to the next obvious big question...
What does `<some Height> + <some Int>` do?
The resulting type cannot unconditionally meet the requirements, since the addition could produce a value that's outside the requirements of `Height`. A reasonable default would therefore be to pick the least specific type, i.e. `Int` in this case, but that means we've essentially added implicit type conversion (back) into Swift (granted it's an upcast, not a lateral cast, but still).
But furthermore, consider, instead of an operator, a method on `Int` that returns `Int` - what does that do when called on an `Int` that is specifically a `Height`? Does it still return an `Int`? It would have to - we can't unilaterally guarantee that the result of all methods on `Int` that return `Int` (or `Self`, if expressed that way) would actually return a value conforming to `Height`. And it would be a bit absurd to have to do `(myHeight as Int).someMethodOrProperty` for all such methods - you should just use the "wrapper" pattern at that point, without any new language support needed.
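This upcast-on-use behaviour is already visible with today's wrapper pattern: any operation defined on `Int` immediately produces a plain `Int`, dropping the wrapper (a sketch, using a trivial illustrative `Height` wrapper):

```swift
// Any Int operation on the stored value yields an Int, not a Height;
// the semantic tag is lost at the first touch.
struct Height {
    var rawValue: Int
}

let h = Height(rawValue: 5)
let doubled = h.rawValue * 2   // type is Int, not Height
let negated = -h.rawValue      // type is Int, not Height
```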
And any mechanism that would require `Height` to override every such method in order to ensure the returned values are also `Height`s is impractical if not impossible - for example, how would it sanely handle a `negated` method when the result is naturally not a valid `Height`, but the method on `Int` is declared neither with `throws` nor as returning `Optional` or the like? Etc.
So logically any method on the parent type:

- Cannot implicitly return the subtype (not even via `Self` or any other such genericness).
- Must nonetheless be supported on instances of the subtype without any extra boilerplate.
So at that point, what's the point? Anything you do to a `Height` will turn it "back" into a raw `Int`, and that tiny remaining functionality - of semantic distinction in passing values around unmodified - can easily be achieved using a wrapper type.
Alternatively, if the compiler were willing to sort of "dynamically" change the method signatures - e.g. automatically make throwable, or make the return value `Optional` for, any methods which aren't provably going to return a valid `Height` - then I guess that could work… but that'd be a pretty complicated task for the compiler (and might require constexpr and similar advancements first, anyway)...
Conclusion (?)
...all of which seems to lead to the apparent conclusion that this should not be implemented by actual subtyping (nor protocols), but instead "merely" via constraints on values at their individual use sites (e.g. via constexprs).
What we then need is a way to declare a "macro" for those constraints, to get comparable convenience & benefits to the `subtype` approach, e.g.:
```swift
constexpr func IsValidHeight(_ value: Int) -> Bool {
    return value > 0 && value < 10
}

// ...

func someFunc(_ value: Int) where IsValidHeight(value) {
    // ...
}
```
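Until constexpr exists, the same shape can be written today with the check moved to runtime, e.g. via `precondition` (a sketch; the names are illustrative):

```swift
// Runtime analog of the constexpr "macro": the predicate is ordinary code,
// applied at each use site instead of being verified at compile time.
func isValidHeight(_ value: Int) -> Bool {
    return value > 0 && value < 10
}

func someFunc(_ value: Int) {
    precondition(isValidHeight(value), "value is not a valid height")
    // ...
}
```

The obvious loss is that violations surface as runtime traps rather than compile-time errors.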
Though naturally you still want to be able to apply the same constraints to stored values' types, and now we discover an infinite loop, since the apparent way to do that would be something like:
```
<some keyword> Height = Int where IsValidHeight(self)
```
That's really just a convenient "macro" for, at every stored value declaration:
```swift
var height: Int where IsValidHeight(height)
```
And then if you think about it that way, maybe `typealias` really is essentially all that's needed (given constexprs)…?
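Incidentally, today's property wrappers can already approximate the "constraint at every stored value declaration" idea, albeit with the check at runtime (a sketch; `Constrained` and `Measurement` are illustrative names):

```swift
// A property wrapper that enforces a predicate on every assignment -
// a runtime stand-in for `var height: Int where IsValidHeight(height)`.
@propertyWrapper
struct Constrained {
    private var value: Int
    private let predicate: (Int) -> Bool

    var wrappedValue: Int {
        get { value }
        set {
            precondition(predicate(newValue), "constraint violated")
            value = newValue
        }
    }

    init(wrappedValue: Int, _ predicate: @escaping (Int) -> Bool) {
        precondition(predicate(wrappedValue), "constraint violated")
        self.value = wrappedValue
        self.predicate = predicate
    }
}

struct Measurement {
    @Constrained({ $0 > 0 && $0 < 10 }) var height: Int = 5
}
```

Assignments outside the range trap via `precondition` at runtime, where the proposed feature would reject them at compile time.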
Addendum
In any case, what actually interests me - beyond just "plain" constexprs - is the idea that the Swift compiler could automatically de-`try` and de-`Optional` based on parameter analysis & the function's implementation, where appropriate (i.e. not across boundaries that are supposed to represent stable APIs). This wouldn't necessarily be a replacement for any of the above, but could be complementary, e.g.:
```swift
func setHeight(_ value: Int) throws {
    guard value > 0 && value < 10 else {
        throw SomeError()
    }
    // ...
}

// ...

setHeight(5) // Look ma, no 'try'!
```
Swift knows, by static analysis, that the argument '5' can't cause an error to be thrown, so it allows the `try` to be elided. That's the "carrot" side - the "stick" side could be similarly enhanced vs today:
```swift
try! setHeight(11) // Compile-time error: Swift knows it will always throw and therefore always abort.
try? setHeight(11) // Compile-time warning: Swift knows it will always throw.
// The try? case could even detect that `setHeight` has no side-effects, omit
// code for it entirely, plus warn that the statement has literally no effect.
```
To me that seems like a much bigger win in practice (even if it only works within a single module, as my intuition suggests, without more effort from the compiler and/or programmer).
Then, adding constexpr to the mix generalises this fully.