This is discussed in the proposal text, under "Alternatives Considered."
I agree with Avi: I find it difficult to support this proposal. The only motivating example in the proposal that is not solved by a future direction is the one with a clamping property wrapper, but even in that case the problem it solves is so minor that it doesn't seem worth adding to the language to fix.
Yes, you can guarantee that the bounds you initialize with are known at compile time, but the bounds themselves are still runtime properties. I don't see the point.
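For reference, a clamping property wrapper of the kind under discussion looks roughly like the sketch below (illustrative, not the proposal's exact code). Note that the bounds end up as ordinary stored properties, i.e. runtime values, even if the arguments used to initialize them were compile-time known:

```swift
// Illustrative sketch of a clamping property wrapper. The range is a
// stored property, so the bounds remain runtime values regardless of
// whether the initializing arguments were compile-time constants.
@propertyWrapper
struct Clamping<Value: Comparable> {
    private var value: Value
    private let range: ClosedRange<Value>

    init(wrappedValue: Value, _ range: ClosedRange<Value>) {
        self.range = range
        self.value = min(max(wrappedValue, range.lowerBound), range.upperBound)
    }

    var wrappedValue: Value {
        get { value }
        set { value = min(max(newValue, range.lowerBound), range.upperBound) }
    }
}

struct Color {
    @Clamping(0...255) var red: Int = 127
}
```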
Without a new motivating example which is actually solved by the proposed feature, I cannot support this proposal. -1
I appreciate the potential parallel, but it strikes me that `@frozen` is making a much stronger, and higher-level, promise than the proposed `@const`.
The `@frozen` annotation makes a promise at the level of library evolution; that is, it makes a promise that is relevant across different versions of the same library. It is a promise about how the code itself will change, not just a promise about something that happens at the instant of a single compilation. Notably, `@const` is not relevant only for resilient libraries.
I could see an argument that `@const` values should also be required to be marked `@frozen` in resilient libraries (as per the Effect on ABI stability and API resilience section), but I think it's better if `@frozen` remains a concept that is only relevant in library evolution mode.
I'm not sure how much stronger `@frozen` is for `public` types than `@const` is for `public` properties:

> The new function parameter attribute is a part of name mangling. The value of `public @const` properties is a part of a module's ABI. See discussion on Memory placement for details.
At the level of library evolution, this promise not to change the value of an ABI-visible `@const` member seems ironclad (deep-frozen?); indeed, there are a restricted but nonzero number of things one can change about a `@frozen` type, which is more than what I can see changing about a `@const` value.
Beyond guarantees regarding how types may change in the future, `@frozen` also enables exhaustive switching for an enum in the here-and-now. Although the attribute `@frozen` is only used in resilient libraries, the concept of a frozen enum exists in the language whether or not you're dealing with a resilient library. It's in fact nonfrozen enums that exist as a concept only in library evolution mode.
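To make the here-and-now consequence concrete: a client of a resilient library can switch exhaustively over a `@frozen` enum, while a nonfrozen enum forces an `@unknown default`. The sketch below shows the same behavior in a single module, where exhaustiveness is the default:

```swift
// In a resilient library this enum would be marked @frozen to let
// clients switch exhaustively; within one module, every enum is
// already exhaustive.
public enum Direction { case north, south, east, west }

func describe(_ d: Direction) -> String {
    switch d { // exhaustive: no `default` (or `@unknown default`) needed
    case .north: return "N"
    case .south: return "S"
    case .east:  return "E"
    case .west:  return "W"
    }
}
```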
I agree with you that, in terms of how these two features have been presented in their respective proposals, there has been a difference in emphasis on the evolution versus here-and-now consequences of using these attributes. But I'm not convinced that, when we peel back to study the features themselves, `@const` is really distinguished from `@frozen` on this basis.
Agreed that for resilient libraries, `@const` is highly analogous to `@frozen`. But that doesn't, in my view, translate into an argument that we should bring `@frozen` into the realm of non-resilient libraries.
I'd view this in the other direction: an exhaustive enum is the underlying concept here, and in situations where source is available, every enum is exhaustive because every compilation is, in a sense, independent. I don't think it's really reasonable to call enums in non-resilient libraries "frozen", since they are of course allowed to change, add cases, remove cases, etc.
Agreed, though I'd again shift the framing to "nonexhaustive enums" being the concept that only exists (and is indeed the default) in resilient libraries. Then, `@frozen` is the way for resilient libraries to recover the (non-resilient default) exhaustive behavior for enums.
Aside from concerns about diluting the seriousness of the promise made by `@frozen` by bringing it to non-resilient libraries, I think it's just the wrong word to describe what's going on. IMO it fails the same test you mentioned above for `@const`:
Whereas `@const` may be difficult to distinguish in meaning from the existing "cannot change across a single execution of the program", `@frozen` goes too far in the other direction and (IMO) expresses "cannot change across multiple compilations of the library." We're looking for a label that expresses something in the middle: roughly, "cannot change across multiple executions of the same compilation of the program."
If I understand correctly, the current feature (and its implementation) do not change the "physical" ABI for variables declared `@const`; such variables are accessed by calling a function that returns the value, i.e. with a getter. At most, `@const` becomes a semantic guarantee that the function has no significant side effects and always returns semantically-equivalent values, which we could certainly use during optimization by e.g. coalescing redundant calls or removing unused ones. However, it would still be necessary to destroy the returned value (if it isn't trivial), because the return values wouldn't necessarily be permanently allocated, which would complicate that optimization.
This is clearly not the optimal ABI for accessing constant variables, which would be either:
- to make the variable simply resolve to an address known to be initialized prior to the access, or if not that,
- to call an "addressor" function which returns a consistent address of an immutable value.
In the first option, a `@const` global variable would define a symbol that simply resolves to the address of an immutable object known to be initialized at load time, and a `@const` protocol requirement would cause the protocol witness table to contain a pointer to an immutable object initialized either at load time or, at worst, during the initialization of the witness table (e.g. if the conformance were generic and the value was dependent on generic parameters). In the second option, a `@const` global variable would define a global function symbol for the addressor, and a `@const` protocol requirement would cause the protocol witness table to include an entry for an addressor (rather than for a getter, as it would today). The addressor would then either just return the address of an immutable object, if one can be emitted statically, or else memoize the allocation and initialization of that object.
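A rough model of the memoizing addressor in ordinary Swift (purely illustrative; the real addressor would be compiler-emitted, and the `Addressor` name is invented here):

```swift
import Foundation

// Sketch of a memoizing "addressor": returns a consistent address of an
// immutable value, allocating and initializing it on first access.
final class Addressor<T> {
    private var storage: UnsafeMutablePointer<T>?
    private let lock = NSLock()
    private let initializer: () -> T

    init(_ initializer: @escaping () -> T) {
        self.initializer = initializer
    }

    func address() -> UnsafePointer<T> {
        lock.lock()
        defer { lock.unlock() }
        if storage == nil {
            let p = UnsafeMutablePointer<T>.allocate(capacity: 1)
            p.initialize(to: initializer())
            storage = p
        }
        return UnsafePointer(storage!)
    }
}
```

Every call returns the same address, so clients may treat the pointee as permanently allocated; the direct-address option would eliminate even the function call.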
The first of these is clearly more efficient for accesses. It would also allow the value to be fairly easily recovered by binary analysis (as opposed to either source analysis or running code — all of these options have their own trade-offs for different applications). However, it would require the memory to be eagerly initialized before access, which in the most general case would require load-time execution in order to compute type layouts and produce unique metadata. To avoid this and guarantee that the initialization could be done "statically" (which is to say, within the limitations of what common program loaders can do automatically without running any code from the loaded image), the following restrictions would be required:
- Initialization would have to be resolved to a fully concrete initializer value. This means that all of the semantics of initialization for the initializing expression would have to be known to the variable's defining module and, furthermore, be constant-evaluable. Among other things, this would imply that all the types involved are `frozen` (or defined in the module), all the initializers are `inlinable` (or defined in the module), etc. We seem to want this restriction regardless, and in the initial proposal the restrictions are much stricter than this and exclude all user-defined types; I mention it only for clarity.
- The internal layout of every component value in the initializer value would have to be known statically. This is almost implied by the restriction above, since resilient types cannot have `inlinable` initializers. However, the internal layout of an `enum` includes direct cases that aren't necessarily part of the value and therefore do not need to be initialized; if such a case included a non-`frozen` type, this would preclude the direct-address implementation because the internal layout of the enum would not be known, even if that case were not chosen for the actual initializing value. Again, I believe this is implied by the current restrictions in the proposal, but it should be noted for future directions.
- Any class instance or metatype value appearing in the initializing value would have to fall into one of the cases where Swift's type metadata system can guarantee complete emission at compile time. (This is an implementation detail we haven't previously needed to expose to programmers.) For example, if the initializing value includes an instance of a generic class `MyClass<MyX>`, both `MyClass` and the concrete generic argument `MyX` would be heavily restricted. Some of the conditions for this overlap with the restrictions above, but not all of them. I believe this restriction probably wouldn't extend to indirect enum cases. Once more, I believe this is implied by the current restrictions in the proposal, unless perhaps we need unique metadata for array and dictionary buffers.
These restrictions would not be necessary with the "addressor" approach, which allows the variable to be emitted lazily at the cost of making accesses somewhat more expensive. But someone might say that achieving the direct-address implementation would be desirable enough to design these extra restrictions in. I'm not sure I would agree, but it's not completely unreasonable.
The most important thing here is that, if we want to be able to use either of these better ABIs for `@const` variables, we do actually need to do that ABI work in the first release. Otherwise, `@const` alone won't be enough, and we'll need to introduce a new attribute in the future which actually requests the new ABI treatment. That seems to me like it would partially undermine the story laid out in this proposal for how the proposed attribute will be gradually generalized to address more and more constant-evaluation needs. `@const` would enable some semantic optimization and source-tool analysis, but we'd be fundamentally limited by these early ABI decisions about what low-level features we could build on top of it. For example, we would not be able to say that a pointer to a `@const` variable has global lifetime.
If we want the direct-address ABI specifically, then as part of that ABI work, we will also need to ensure that we can actually emit `String`, `Array`, and `Dictionary` literals statically, since those are the only complex types allowed by the current proposal. The optimizer would then presumably be free to rely on a guarantee that any allocated objects in the immutable object are in fact permanently allocated and have trivial reference counting. This work could be elided if we just use addressors, although of course the optimizer would then lose its ability to rely on permanent allocation. But permanent allocation might not actually be feasible in the long run if we hope to include general class types in the set of things that can be constant-emitted, since a class instance stored in a `@const` generic variable would semantically need to be unique for a set of generic arguments.
Is this essentially talking about the cost of the call to `swift_once`?
Because in all the benchmarking I've done, `swift_once` is always negligible. Like, 0.0%.
Currently, there is also a lot of setup code for globals in `main`. I've never understood why we have both that setup code and thread-safe lazy initialization. Why both?
It would be the dynamic cost of `swift_once` plus the code-size and optimization-barrier costs of doing the setup for it, yes.
I'm not aware of any eager setup costs for normal globals that would go into `main`. But globals in script files are strange and have their own, not always sound, rules.
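For illustration, the lazy pattern whose `swift_once` cost is being discussed, in ordinary Swift:

```swift
// A Swift global is initialized thread-safely on first access; the
// compiler guards the initializer with swift_once, so it runs at most
// once per process. (In script files, top-level globals are instead
// initialized eagerly in order, as noted above.)
var initializerRuns = 0

let squares: [Int] = {
    initializerRuns += 1
    return (0..<5).map { $0 * $0 }
}()
```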
Regarding the "Protocol `@const` property" aspect of the proposal, I'm struggling to see the motivation.
If the intent is to enforce that the property's value is "static" (e.g., make sure the database keys or printf-string are not runtime-able values), then that should be expressed in the type, as you'd want to propagate this quality through function calls, even if you select among values at runtime (`flag ? key1 : key2`). Expressing frozen/static typing is interesting and useful independent of this proposal.
If the intent is to enforce that the value doesn't change for an instance/class (e.g., the property will be used as a key in a dictionary or an input to a hash that should be stable), then that is an interesting direction! But I would propose that we should implement that independently of any "build-time-ness" requirement, as it's more generally useful.
For example, the proposal offers an example:
```swift
protocol NeedsConstGreeting {
    @const static let greeting: String
}
```
It's not clear what `@const` actually adds here. Wouldn't just plain `static let greeting: String` express the interesting part? You could even use `StaticString` to cover the other axis too.
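A sketch of the suggested alternative, with no `@const` involved (the conforming type here is invented for illustration):

```swift
// Expressing "compile-time string" through the type instead: a
// StaticString can only be built from a literal, so conformers cannot
// compute the greeting at runtime.
protocol NeedsConstGreeting {
    static var greeting: StaticString { get }
}

struct Morning: NeedsConstGreeting {
    static var greeting: StaticString { "Good morning" }
}
```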
I suggest the authors expand the proposal with a more concrete example that actually relies on the compiler doing something interesting at build time for the protocol property use-case.
I assume there's something which will be announced in two weeks which relies on this feature? If so, the timing of this review is really awkward. If not, then I really struggle to see the point. The examples of uses for it are extremely underwhelming and very much look like a solution in search of a problem. "The compiler might find a way to optimize based on this" is the sort of thing that makes me think that we'll have a `@const2` in the future with slightly different semantics that the compiler actually can use for optimizations.
The rules around this feature seem to resemble a hypothetical ‘pure’ function annotation, which would denote functions free of side-effects.
I'd like to see the proposal authors consider the interplay with pure functions as a future direction. It seems there's an opportunity for a call to a pure function with only const parameters to itself be a valid const expression, if we do this right.
For me, the logical use case for build-time constant values is performance-sensitive data structures. For example, a Vector or Matrix with a compile-time-known size lets the compiler unroll loops for numerical operations on these data types; the compiler could even auto-vectorize expressions using such data types, leading to massive perf improvements.
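One way to surface a size to the compiler today is to encode it in the type, so generic specialization fixes the trip count (a sketch; `Dimension` and `D3` are invented names):

```swift
// Encoding a vector's size in the type system: after generic
// specialization, D.count is a known constant, so the loop below has a
// fixed trip count the optimizer can unroll (and potentially vectorize).
protocol Dimension { static var count: Int { get } }
enum D3: Dimension { static var count: Int { 3 } }

struct Vector<D: Dimension> {
    var storage: [Double]

    init(_ values: [Double]) {
        precondition(values.count == D.count, "wrong element count")
        storage = values
    }

    static func + (a: Vector, b: Vector) -> Vector {
        var result = a
        for i in 0..<D.count { result.storage[i] += b.storage[i] }
        return result
    }
}
```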
From the "Supported Types" section:
[...] The current scope of the proposal includes:
[...]
• Array and Dictionary literals consisting of literal values of above types.
• Tuple literals consisting of the above list items.
This list excludes tuple literals inside array/dictionary literals, and nested collections. Is that simply a miswording, an oversight, or an intentional limitation? If the latter, could it be explained?
Additionally:

> Enum cases with no associated values
(Emphasis added.) Why is const nesting not permitted where the associated value types are on the Supported Types list?
As a newcomer to Swift, but with two decades of programming in other languages, I was a bit blown away by the abundance of keywords, especially those that, in my opinion, stand in the way more than they help. The "guard" version of "if" is one example. I think that the proposed @const or its variants will have the same impact on the learnability, and consequently the popularity, of Swift.
More keywords - less elegant language.
Sounds like you have lots of experience. What specifically do you suggest instead?
I only saw mention that this feature could potentially be useful. If the purpose is to generate more efficient code, then we should not ask the user to help with the optimization. Instead, we could add a compiler directive/function that determines whether the constant is initialized at compile time, along with one that retrieves the initialized value; then we can achieve the same (even better) result without complicating the language.
Something like:

```swift
func Clamp(val, min, max) -> …
{
    if CompileTimeDefined(min) {
        let Min = GetCompileTimeValue(min)
        …
    }
}
```

So only a small number of expert users would have to learn about CompileTimeDefined/GetCompileTimeValue, while the rest of us mortals would not care.
Slight miscommunication @Panajev. In this example, `i` is the imaginary unit. Presumably all the constants from the Numerics and Complex packages would be marked as `@const` if this proposal is accepted.
Thanks for the clarification. I was thinking it would have to be, as the rules around build-time evaluation are necessarily infectious :).
WWDC didn't reveal anything which obviously depends on this, so this proposal still seems like a solution in search of a problem. Since my view on this is that I think the motivating use-cases are extremely uncompelling, I thought it might be worth writing up why they don't excite me.
Enforcement of Compile-Time Attribute Parameters
I don't understand why requiring the upper and lower bound of a clamp to be a compile-time value would be a desirable thing. It certainly isn't addressing any problem I've ever had. If the compiler were able to optimize away the runtime storage of the bounds, then that'd be exciting, but implementing that isn't actually part of the proposal.
Similarly, accidentally inconsistent runtime values for a serialization key just isn't a bug I've ever seen happen, and so a feature to prevent that is not very useful.
Enforcement of Non-Failable Initializers
Compile-time validation of string arguments would be wonderful, but that isn't part of this proposal. Without that, this is just removing a single character from the call site. I'm not convinced that removing that character is even a good thing to begin with.
Facilitate Compile-time Extraction of Values
The specific example of using this for Package.swift sounds like a hilariously overcomplicated way to turn a Turing-complete config file into a declarative config file. I can see some advantages to it compared to just using a normal config file format (e.g. Xcode doesn't have to implement autocomplete and syntax highlighting for that format), but it also preserves the giant downside of Swift being very slow to compile. Skipping codegen and executing the package doesn't speed up the part of package resolution that causes performance problems.
The broader idea of having some sort of API to inspect the AST at build time to generate code or something is exciting, but `@const` seems like a very small piece of a very large project that wouldn't need `@const` to be useful.
Guaranteed Optimization Hints
The obvious question here is if the hints are actually useful, and it's not obvious to me how they would be.
Is the assumption that `@const` will result in more inlining? Or that the compiler will specialize the function for each value of the `@const` parameter? The latter is something that we've done in C++ plenty of times (shifting the parameter from a runtime argument to a template argument), but I'd have a lot of questions about how exactly that'd work, and I'm not sure that something like `@const` is the right way to do it. A more explicit `@specialize(where: value == 1 || value == 4 || value == 8) func foo(_ value: Int)` might be better for that sort of thing.
Given that implicit conversion from `@const T` to `T` is very natural, I really think we can consider adopting `@const` into the type system directly. I agree that this adds some complexity, but it will enable far more power.
Consider the example given in the proposal:
```swift
typealias CI = @const Int
let x: CI?
```
What is the type of `x`? It appears to be `Optional<@const Int>`, which is not a meaningful or useful type; the programmer most likely intended to have a `@const Optional`.
How do we determine the type of `x` here?
We can force generic types to add an explicit declaration (e.g. by attribute) if they accept `@const` type arguments. In the `Optional` case, it should choose not to do so, because that makes no sense.
Then the case above will emit a compiler warning saying that `Optional` doesn't allow `Wrapped` to be explicitly `@const`, so the type of `x` is still `Optional<Int>`.
But wait: how do we get a `@const Optional`?
Generic types can provide a build-time evaluated initializer that takes an explicitly `@const` value and returns an explicitly `@const T`. Build-time evaluable overrides will always be preferred.
That is, types of the following declarations will be resolved as:

```swift
let a = 1              // a: @const Int (Swift 6)
let b: Int = 1         // b: Int
let c = Optional(x)    // c: @const Optional<Int>
let d: Int? = a        // d: Optional<Int>
let e: @const Int? = x // e: @const Optional<Int>
let x: @const _ = 1    // The exact alternative to `@const let x = 1`
```
Will it introduce source breakage?
Explicit type annotations are still taken as-is, and `@const` needs to be explicitly marked. It has the lowest combination priority, so `@const _` is valid.
Since `@const` values can fit everywhere non-`@const`s are accepted, there shouldn't be any source breakage. Not implicitly inferring `@const` in Swift 5 is simply for preserving the behavior of existing code.
Implicit `@const` will greatly accelerate the adoption of build-time evaluation, as we can have `@const` types in many more places without explicit annotation.
Proposed `@const` conversion rules:
- For every type `T`, `@const T` implicitly converts to `T`.
- For value type `T` with stored property `prop: U`, `prop` on explicit `@const T` has type `@const U`.
- For reference type `T` with `let` property `prop: U`, `prop` on explicit `@const T` has type `@const U`.
- For every type `T` with computed property `prop: U` that has a build-time evaluable getter, `prop` on explicit `@const T` has type `@const U`; otherwise it has type `U`.
One more thing…
If `@const` is emitted into the type interface, we should consider whether it should be named `const` as a modifier instead of an attribute.