[Pitch #2] Build-Time Constant Values

Actually, I find it very useful to be able to express this when defining a protocol for the API of a remote service or distributed actor. It would let us know if we can cache and store the value instead of repeatedly making an expensive remote call.

3 Likes

For a value that is going to be stored in DATA_CONST, wouldn't it be far more efficient to treat it as a distinct type with its own simple flat layout (instead of having pointers to storage buffers, etc)? That is what I was trying to get at in my other post.

Thank you for the explanation Ben. Now I understand why I should care, I actually do care and even have a use case for it.

If I understand it correctly, all of these values are constant at runtime but can be dynamically generated (now, or after later proposals) at build time. Given that, I would prefer a name like buildtime, which makes it clearer when this stuff gets evaluated.

Though once it clicks, const indicates more of its effects on performance and runtime behaviour. Having to think about, and being reminded of, that might be better once a user gets over that initial step.

So both kinds of names could work, I guess.

For those constantly switching between Swift and C++ it could be more annoying. I don't think that's a sufficient argument against const, though.

#my2cents

Yes, I can see that, but when reading source code that's not what I care about. Other language constructs can also get the same treatment, no? (Global string lets, for example.) But that's not what we care about when reading the source. IMO the semantics of this, and its naming, should reflect something closer to the programmer, and for the programmer this is about 'compile time'.

3 Likes

This list notably excludes tuples of the integer, floating point, string, & enum options. Is that an intentional omission or an oversight? Would it be meaningfully harder to add the tuple case? SwiftNIO has at least one major use-case where having tuples would be extremely valuable.

3 Likes

I like this proposal!

The only comment I have is that I like language features that encourage "fast by default" writing. In C++, it's annoying to see constexpr static written everywhere.

I understand that in C++ and Swift the compiler is good at optimizing and you don't need to use @const everywhere, and that's fair. In the previous proposal, one reason I liked const instead of let is that it allows me to use "build-time constant values" as the default.

When it comes to protocols and libraries, overusing @const is certainly an anti-pattern and should be avoided. Again, most of my ire comes from seeing a bunch of ugly constexpr static and preferring something like const in Rust.

3 Likes

This was an oversight in the proposal. We do support tuple literals consisting of other supported literals (int, float, bool, string). I will amend the proposal, thank you.

Boolean literals are also supported and should be covered in this proposal; I will add them, thank you.

8 Likes

For now, the only way to express that something is compile-time is the # sign. Why not continue using it for this purpose?
If we have #if, we could add #let, for example.

I also see a problem with the further development of this feature. If const func (or #func) is intended to be usable at compile time, we will start providing two versions of the same method, one for compile-time and one for runtime execution, just like C++ developers do with const methods.

So, maybe available(compiletime) is longer, but it is better conceptually.

7 Likes

So just as an update, some things that may not have been obvious to readers of the pitch but that I found when working on the NIO conversion:

  1. Currently even simple expressions don't work. For example, 0 | 1 is not acceptable. This is not the fault of the pitch (it's very clear about this), but it may require giving up some nicer syntax if you were thinking of doing this with, say, bitfields.
  2. As a corollary, complex expressions obviously don't work. This means that OptionSet is not capable of being expressed using @const. It's not the end of the world: you can just store the rawValue and load through it. But given that you also can't use bitwise ops you end up losing some expressive power.
  3. You cannot use a @const static let as part of the initialization expression of a @const static let: that is, this feature does not compose. Again, the pitch doesn't promise that it would, but it further inhibits your ability to be expressive at the initialization site. Putting this very clearly: this feature as-currently-implemented only supports literals, nothing else.
  4. For large const values the type checker becomes a problem. In the case of the NIO example linked above, this is a 5000-odd line Array containing (UInt8, UInt8, UInt8), expressed entirely in literals. On my Intel MacBook Pro this expression takes 25s to type-check. The improved dirty memory story is very hard to trade off for the explosion in compile time. This is not the fault of this pitch, naturally: it doesn't touch the type checker at all. But the places where this pitch might bring the most value are the places that are most likely to fall into the type checker performance pit.
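Point 2's rawValue workaround can be sketched roughly like this. FileAccess and the property names are invented for illustration, and @const is the pitched attribute, which does not compile in shipping Swift, so treat this as pseudocode for the pitched behaviour:

```swift
struct FileAccess: OptionSet {
    let rawValue: UInt8
    static let read  = FileAccess(rawValue: 1 << 0)
    static let write = FileAccess(rawValue: 1 << 1)
}

enum Defaults {
    // Not expressible under the pitch as implemented: neither `1 << 0`
    // nor the set literal `[.read, .write]` is a plain literal.
    // @const static let access: FileAccess = [.read, .write]   // rejected

    // Workaround: store the pre-computed raw value as an integer literal...
    @const static let accessRawValue: UInt8 = 3

    // ...and load through it at the use site.
    static var access: FileAccess { FileAccess(rawValue: accessRawValue) }
}
```

The cost, as noted above, is that the bitwise composition now happens in your head (or in a generator script) rather than in the source.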

I remain strongly in favour of this pitch: it's great, and it should land. But I thought it might be useful to provide this feedback to help shade in exactly how this is going to work in some of these practical cases.

13 Likes

I remain unconvinced that simple lookup tables such as the example in that NIO patch require const. In fact - I'd go further - if we continue to refuse to address this in the optimiser and require const for static data, it would be actively harmful and a serious expressiveness regression even from C.

I keep seeing it thrown around and taken for granted that const will solve this issue. The fact that these tables are not currently statically initialised is a compiler deficiency, not a language deficiency (see here).

And yeah, the compiler has some sort of super-linear behaviour for array literals. Compiling an IDNA mapping table with ~9000 entries takes >75GB of RAM and has to be killed, but splitting it in to 90 sub-arrays of 100 elements each takes 100MB of RAM and completes in an instant.

Again, that's another compiler deficiency, not a language deficiency. I find it highly, highly concerning that (AFAIK, across multiple threads on this forum where others and I have raised the issue) this distinction is not being made clear. It causes me to question whether the compiler team understands the issue we're all having here.

I don't think this is what NIO needs to fix its lookup table problem (or what WebURL needs for its lookup table problem, or what the standard library needs for its lookup table problem). I hope the compiler team, and the core team who evaluate this proposal, are fully aware of that.

8 Likes

Because the primitive types are implemented in the standard library, I believe even implementing the compiler change will require a language change that enables a way for standard library authors to indicate “it’s ok not to dynamically run this type’s initializer for values specified in source”.

Posting the following edits/additions to the pitch text here for the sake of clarity:


Forward-looking design aspects

Propagation Rules

Though this proposal does not itself introduce propagation rules of @const-ness, their future existence is worth discussing in this design. Consider the example:

@const let i = 1
let j = i

Our intent is to allow the use of i where @const values are expected in the future, for example @const let k = i, or f(i) where f is func f(@const _: Int). It is therefore important to consider whether @const is propagated to values like j in the above example, which determines whether statements like f(j) and @const let k = j are valid code. While it is desirable to allow such uses of the value within the same compilation unit, if j is public, automatically inferring it to be @const is problematic at the module boundary: it creates a contract with the module's clients that the programmer may not have intended. Therefore, public properties must explicitly be marked @const in order to be accessible as such outside the defining module. This is similar in nature to Sendable inference: internal or private entities can automatically be inferred by the compiler as Sendable, while public types must explicitly opt in.
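A sketch of how these rules would play out, assuming the pitched @const attribute and a hypothetical @const function parameter (none of this compiles today; it illustrates the intended future behaviour described above):

```swift
@const let i = 1
let j = i                     // not @const today; future propagation may
                              // infer it within the same compilation unit

func f(@const _ x: Int) {}    // hypothetical @const parameter, per the pitch

f(i)                          // OK in the future: i is declared @const
@const let k = i              // OK in the future

// f(j)                       // valid only if @const propagates to j
// public let p = i           // p would need an explicit @const annotation
                              // to be visible as @const outside the module
```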

Memory placement

The effect on runtime placement of @const values is an implementation detail that this proposal does not cover, beyond noting that today this attribute has no effect on the memory layout of such values at runtime. It is, however, a highly desirable future direction for the implementation of this feature to allow the use of read-only memory for @const values. With this in mind, it is important that the semantics of this attribute permit such an implementation in the future. For example, a global @const let placed into read-only memory removes the need for synchronization on access to such data. Moreover, using read-only memory reduces the memory pressure that comes from having to maintain all mutable state in memory at a given program point: read-only data can be evicted on demand and read back later. These are desirable traits for optimizing existing programs, and they become increasingly important for enabling low-level systems programs to be written in Swift.

In order to allow such an implementation in the future, this proposal makes the value of public @const values/properties a part of a module's ABI. That is, for a resilient library that vends @const let x = 11, changing the value of x is considered an ABI break. This treatment allows public @const data to exist in a single read-only location shared by all library clients, without each client having to copy the value or be concerned with possible inconsistency in behavior across library versions.
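Concretely, the ABI rule means something like the following (the module name and versioning are invented for illustration; @const is the pitched attribute and does not compile today):

```swift
// Module "Numbers", version 1, built with library evolution enabled:
@const public let x = 11
// Because the *value* is part of the ABI, clients may bake 11 directly
// into their binaries, or share a single read-only location, instead of
// calling an accessor at runtime.

// Module "Numbers", version 2:
@const public let x = 12
// ABI break: previously compiled clients still observe 11, so behavior
// would silently diverge across library versions.
```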

Effect on ABI stability and API resilience

The new function parameter attribute is a part of name mangling.
NEW: The value of public @const properties is a part of a module's ABI. See discussion on Memory placement for details.

Alternatives Considered

Placing @const on the declaration type

One alternative to declaring compile-time known values, as proposed here, with the declaration attribute:

@const let x = 11

is to instead shift the annotation to the declared property's type:

let x: @const Int = 11

This shifts the information conveyed to the compiler about this declaration so that it is carried by the declaration's type. Semantically, this departs from, and widely broadens the scope of, what we intend to capture: the knowability of the declared value. Encoding the compile-time property into the type system would force us to reckon with a great deal of complexity and unintended consequences. Consider the following example:

typealias CI = @const Int
let x: CI?

What is the type of x? It appears to be Optional<@const Int>, which is not a meaningful or useful type; the programmer most likely intended to have a @const Optional<Int>. And although implicitly-unwrapped optional syntax today conveys an additional bit of information about the declared value using a syntactic indicator on the declared type, without affecting the declaration's type, the historical context of that feature makes it a poor example to justify requiring consistency with it.

3 Likes

It is not the intention to imply this proposal introduces any new capability to place data in the read-only section of a binary. That particular topic came up in the thread mainly to provide background on the notion that "const" is a term of art closely related to this feature.

What's being proposed isn't about optimizing performance, so much as it is guaranteeing that the values are statically known for other semantic reasons. The motivations outlined in the proposal don't even mention memory placement (Artem is just adding something clarifying this) but rather focus on things like the URL.init use case.

There is a distinction here between:

  • optimizations that may happen, maybe even should happen (i.e. literal values should be put in read-only sections whenever they can be); versus
  • language features that allow someone to enforce behavior at compile time (i.e. that you cannot call a non-failable URL.init with any kind of dynamically-generated content)
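The second bullet is the kind of guarantee sketched below. This is a hypothetical overload; the pitch does not define this initializer, and @const does not compile in shipping Swift, so this is only an illustration of the enforcement idea:

```swift
import Foundation

extension URL {
    // Hypothetical non-failable initializer: because the argument must be
    // @const, the compiler could verify at build time that the string
    // parses as a URL, so no runtime failure case would be needed.
    init(@const validated string: String) {
        self.init(string: string)!   // hypothetically checked at compile time
    }
}

let home = URL(validated: "https://swift.org")   // accepted: literal argument
// let bad = URL(validated: readLine()!)         // rejected: not @const
```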

Now, between these two bullets lies a space where it might be reasonable to say that once @const is available, follow-on work could be done so that the placement of values marked with it into particular sections of the binary could be guaranteed rather than hoped for. However, even if that was guaranteed, it would still be reasonable to want the compiler to also place non-@const values into those same sections too when possible.

What @const does allow for is specifying in code that a value must be constant, and therefore open to this optimization, something optimizations cannot do on their own even when we have them. This achieves things like ensuring that later refactors do not accidentally break const-ness (e.g. by replacing part of a constant expression with a runtime-calculated value without realizing the implications of doing that), or allowing frameworks to publish @const values in their .swiftinterface files. These will become more important over time, as the natural next steps @lukasa points out, such as simple expression composition or propagation, arrive.

7 Likes

I should add that I did not expect this proposal to "fix" our problem: our problem is very minimal. As our lookup table is purely internal, the cost of the dynamic initialization is dirty memory and a one-off initial payment to decode the base64. That's annoying, but it's not the end of the world. Once initialized the existing version and the @const-ified version generate identical code for lookups.

I did this only because this data is clearly @const, it's the kind of use-case that @const would want to be useful for, and so I could better understand the way the pitch felt to use and provide feedback from there. This is not a problem that is urgent for us to fix, and I have no particular opinion about the right way to fix it.

2 Likes

Right; it's possible to work around it. But even the idea of base64-encoding your static data to work around compile-time problems is quite novel and we really shouldn't expect everybody to do that.

I agree that @const is orthogonal to our current issues with static data. That's the point I was also making.

The more interesting question around this proposal IMO is if we're heading towards a C++-style model, because I think it is generally accepted that constexpr is a bit of a disaster.

Firstly, it's super-powerful. Almost everything can be constexpr, and developers are constantly trying to do more with it. One guy wrote a compile-time CSV parser, and there are far more complex things than that. C++20 even has support for constexpr allocations. Pretty much the entire C++ standard library is constexpr; I remember hearing once that the only thing that cannot be constexpr is getting the current time.

But for library authors, that's a problem. Because people are using it everywhere, it's difficult to predict when your users might want compile-time evaluation. And when you do add it, all your utility functions and so on, all the way down, have to be marked constexpr too.

Remember the whole "what colour is your function?" problem with async functions? This adds yet another colour of function. If your code is constexpr, it can be called either at compile time or at run time, but if it isn't, compile-time evaluation is verboten.

The result is that constexpr is basically the default. Like noexcept. It has long been a meme.

(It actually gets worse - some APIs are only constexpr for certain inputs, so the entire value of the attribute is lost and you're basically back to relying on documentation anyway).

Now, this proposal specifically has been stripped down so that it doesn't even include compile-time functions, but it does introduce the concept that compile-time evaluability is part of a value's type. So I think that even if we accept this minimal proposal, we will have effectively signed up to implementing constexpr as C++ has.

Maybe there is no alternative to that; I don't know. But I'd like to see some more discussion about it from the proposal authors: where is compile-time evaluation in Swift even headed? And will this proposal help or hinder us on the way there?

2 Likes

we will have effectively signed up to implementing constexpr as C++ has.

Other languages seem to avoid those problems with compile-time evaluation. I doubt C++ is a good reference for pretty much any new feature the Swift team is considering adding.

4 Likes

Zig is really interesting, for sure, and just to be clear, what I'm saying is that I'd love to see a similarly ambitious plan for compile-time evaluation in Swift. Whenever we're trying to convince users of dynamic languages (Obj-C/Javascript/Python/etc) to try Swift, one of the main points is that it allows for better compile-time processing, helping them discover bugs earlier and write more robust code. It's clear that developers really value compile-time evaluation, so IMO it's worth investing and doing something bold (and I hope the core team agree).

But this proposal feels like proposing Sendable without any concept of concurrency, actors, or async functions. It's so stripped down, and there's no big-picture strategy document, that it's hard to judge whether even this might be constraining evolution too tightly. Is it even desirable to introduce @const variables as a separate "flavour" of variable? Perhaps, but also maybe not...? :man_shrugging:

The worry is that if we focus on looking down instead of looking ahead, and just make tiny obvious steps forward, one day we'll look back and realise we have all the same problems as constexpr.

7 Likes

Thank you for the clarifications. Now I see the scope is far more limited than I thought. But still it is a step in a direction that is very important to get right.

So, I echo Karl's question/concern:

2 Likes

I think that C++ is not comparable to Swift in this regard. There's massive survivorship bias in C++ regarding how much of constexpr you see because C++ libraries are simply not very good at doing anything that needs to interact with an operating system, and it turns out that when you don't interact with an operating system, almost everything looks pure.

When you start looking at the more integrated problems that Swift libraries attempt to solve, compile-time evaluation is applicable to a much smaller proportion of use cases. For instance, there's barely anything that can leverage compile-time evaluation in a UI component or in a networking library. I think you're right that you can implement a CSV parser that runs entirely at compile time, but I'm not really coming up with anything that would point to that being the best way to deal with a problem in the Swift ecosystem. If you have to consume CSV input at build time, I think the way you'd go about it in Swift is by creating a build tool that processes your CSV and generates something else. IMO, Swift's focus on library evolution and binary compatibility are good guards against constexpr overreach.

Lastly, although it's true that it creates function colors, the fundamental problem with async function colors is that old code (synchronous) can't call new code (asynchronous). This is the one direction that really matters, because you can end up in a situation where you're stuck having to transition a ton of code to a new paradigm. The fact that new code (constexpr) can't call old code (not constexpr) is not nearly as much of a problem, because new code is effort that hasn't been spent yet.

2 Likes

The example given in the proposal is parsing a URL. I would argue that is even more complex than parsing a CSV file.

3 Likes