Retain overflow checks when compiled with `-Ounchecked`

The -Ounchecked compiler mode treats runtime overflow as if it were a precondition that well-formed code would always avoid. That is almost universally not the case, and the mode should not disable overflow checks.

A separate flag that disables overflow checks, and only overflow checks, could be added in its place, so that the current behavior could be trivially replicated by combining the two flags.

Motivation

Swift prioritizes safety, in the sense of avoiding undefined behavior, whenever possible. This is implemented through a hierarchy of mistakes and errors (not to be confused with thrown errors, which are actually business as usual):

  1. Compile-time errors (checked during compilation)
  2. Assertion failures (checked during debug runs)
  3. Precondition failures (checked during standard -O runs)
  4. Fatal errors (checked during all runs)

Of these, the first three are only encountered in code that is incorrect. Were it not for technical limitations and other practical issues, all of them would be compile-time checked. To quote Swift’s documentation:

Failure to satisfy that assumption is a serious programming error.

If code is written correctly, all of the first three checks are redundant. Anyone wishing to maximize runtime efficiency could therefore consider using -Ounchecked.

In practice, few use that compiler mode today, because one exception to that model causes -Ounchecked code to risk undefined behavior even when there are no programming errors at all: integer overflow checking is disabled.

Arithmetic operators in Swift do not have preconditions stating that the result must not overflow, and few callers explicitly check for overflow before using them. Furthermore, architecture-dependent types like Int are practically impossible to use without theoretically risking overflow.

Overflow is an issue that cannot be ruled out, but is also not worth recovering from or working around[1]. This makes its position in the error hierarchy clear: #4, fatal error.
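
To make the status quo concrete, here is a minimal sketch (the operators shown are standard; the specific values assume a 64-bit platform):

```swift
func demo(_ x: Int) {
    // The standard operator carries no documented precondition about overflow.
    // In -Onone and -O builds the next line traps at runtime when x == Int.max;
    // in -Ounchecked builds today, that overflow is undefined behaviour instead.
    // print(x + 1)

    // Wrapping arithmetic, by contrast, is explicitly defined to overflow.
    print(x &+ 1)   // for x == Int.max this prints Int.min on a 64-bit platform
}

demo(Int.max)
```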

Proposed Solution

Overflow checking should be treated like a fatal error, not a precondition. This means the check is kept regardless of optimization mode.

Precondition checks, such as accessing an Array with an invalid index, would continue to be eliminated.
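
A hedged sketch of what that split would look like in practice (the function is purely illustrative):

```swift
func element(after i: Int, in values: [Int]) -> Int {
    // Bounds checking is a precondition check: verified in -Onone and -O,
    // eliminated in -Ounchecked. This proposal leaves that behaviour alone.
    let v = values[i]

    // Overflow checking is what the proposal reclassifies as a fatal error:
    // this addition would trap on overflow in every mode, -Ounchecked included.
    return v + 1
}
```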

Effect on ABI stability

None, as code that is already compiled would be unaffected.

Effect on API resilience

Minimal: Any documentation that states overflow checks are not performed in -Ounchecked builds would need to be updated, but no interface would change, and hardly any real-world usage of these operators would be affected.

Future Directions

In the interest of preserving the current meaning, undefined behavior and all, for those who prefer it, it may be worth adding an additional compiler flag that toggles overflow checks and only overflow checks. This could be used in combination with any optimization mode, with the caveat that the resulting binary cannot be considered an accurate representation of the original Swift code.


  1. If a programmer wanted to handle overflow, they'd be using the methods that actually report it. ↩︎

2 Likes
Note on Preconditions

Violating a precondition is incorrect regardless of whether a declaration is prefixed with unsafe: it is the responsibility of the caller to rule that out, even if it would be caught in release builds.

unsafe-prefixed declarations have preconditions that are not checked in release builds. This may mean they are checked in debug builds, or it may mean that no check exists.

As a result, they shift even more responsibility to the caller. They aren’t actually meant to be used differently, though: callers are always required to rule out precondition violations before making the call.

Most unsafe alternatives are added as a performance optimization for release builds, particularly when checking a precondition imposes significant runtime overhead. They’re especially useful for internal code, which often wants to avoid repeated precondition checks.
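
The standard library’s Optional.unsafelyUnwrapped follows this pattern, and a library can do the same. A minimal sketch, with hypothetical names (score(at:) and uncheckedScore(at:) are not existing API):

```swift
struct Scores {
    private var storage: [Int] = []

    // Safe entry point: the bounds precondition is checked in -Onone and -O.
    func score(at index: Int) -> Int {
        precondition(storage.indices.contains(index), "index out of range")
        return storage[index]
    }

    // "Unsafe" alternative: the same precondition still applies to callers,
    // but it is only asserted in debug builds; release builds pay nothing.
    func uncheckedScore(at index: Int) -> Int {
        assert(storage.indices.contains(index), "index out of range")
        return storage.withUnsafeBufferPointer { $0[index] }
    }
}
```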

Regardless of whether a declaration is marked unsafe, correct usage remains completely safe even in -Ounchecked builds. If no mistakes have been made, the only barrier to using or even distributing such binaries is the lack of overflow checking this pitch seeks to fix.

I'm not sure I agree with this; IMO, -Ounchecked is a performance tuning tool only. I would never dream of shipping binaries compiled with -Ounchecked.

For example, while optimising WebURL, I found that the performance difference between -O and -Ounchecked builds was around 25-30%. I profiled the unchecked binary using Instruments and found the areas with the greatest difference, then I went and examined those areas in more detail - finding out which runtime checks were causing the biggest performance issues, and to what extent they might be safely eliminated. Not all of them can be, and I think safety is more important than performance (especially with generics, where you don't know precisely what the implementation is going to do).

But anyway, after examining those cases carefully, the difference now is about 5%, I think. Like other developer tools, it provides useful insights and sometimes there are things you can do with that information.

But that's about the only legitimate use I can think of; -Ounchecked is just one of the tools that developers have to understand the performance of their code, but you shouldn't rely on it in production or expect users to compile with -Ounchecked. That would be an immediate red flag, and I would straight refuse to use any library which even suggested I should do that. It's such poor judgement that I probably wouldn't use anything else from that developer, either.

So no, I don't think it's worth making it "safer" by adding more checks back. It's a performance-tuning tool, and not suitable for production use IMO.

12 Likes

If something doesn’t work in -Ounchecked, there is a serious problem with the code. There’s no reason to prohibit its use, just as there is no reason to prohibit the use of debug mode.

To me, saying that -Ounchecked should be avoided is as reasonable as saying -O should be avoided: it tells me that the developer is not convinced they haven’t done something extremely wrong.

If you don’t believe me, could you think of something besides overflow that is checked in -O, isn’t checked in -Ounchecked, and shouldn’t have been caught before it was ever run outside -Onone?

I strongly disagree; software is complex. It might be worked on by multiple people over time, and it's extremely difficult to ensure that everything is safe over the years and across refactoring.

That's why things like retain/release bugs exist with manual reference counting, and why memory safety issues exist in even the largest, most important C/C++ projects that millions of mission-critical projects depend upon, created by enormous teams of expert developers. It is not feasible to ensure that everybody understands absolutely everything, or that there are no bugs, and the consequences can be severe. Bugs happen, and IMO, part of being an experienced developer and creating a reliable library is having the humility and foresight to try and guard against the worst consequences (undefined behaviour) if they do occur.

And with generics, you literally cannot have a complete understanding of what is executed. Somebody may feed a buggy Collection to your algorithm, and it may have different indexes each time you traverse it, or it may produce a different number of elements each time. It isn't about the algorithm developer not being convinced they haven't done something wrong; it's that they can't entirely rule out bugs, because they allow other developers to inject code into the algorithm.
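
As a sketch of the kind of misbehaving conformance this describes (FlakyCollection is hypothetical, not taken from any real codebase): it satisfies every compiler-checked requirement of Collection while violating the semantic ones.

```swift
// Compiles as a conforming Collection, yet its semantic guarantees are broken:
// endIndex changes on every access, so a generic algorithm that measures the
// collection and then traverses it can walk past the end it just computed.
struct FlakyCollection: Collection {
    var startIndex: Int { 0 }
    var endIndex: Int { Int.random(in: 0...10) }
    func index(after i: Int) -> Int { i + 1 }
    subscript(position: Int) -> Int { position }
}
```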

10 Likes

I think there’s a widespread and (apart from overflow) mistaken assumption that -Ounchecked is like the fast-math flag common in compilers for C and other languages: a mode that breaks the normal rules and sacrifices correctness to enable optimizations that aren’t actually allowed. That is not what -Ounchecked does. With the sole exception of overflow, it bends no rules, breaks no standards, and cuts no corners. All correct Swift code, no matter how low-level or esoteric, will always work the same way it usually does. Incorrect Swift code compiles only because the compiler is unable to stop it from doing so.

As you said, it is rather naive to actually trust all the code in your program. But the vast majority of code in the vast majority of languages, including C/C++, never ships with the checks that -Ounchecked removes. And there is a reason that Swift Package Manager prohibits targets used by other packages from changing build flags.

Most importantly, Swift mandates that the caller avoids violating preconditions. Any checks made by said caller are not removed, and in release mode render the callee’s checks redundant. Regardless of compilation mode, it is unacceptable to ship code that outright ignores preconditions.

That has absolutely nothing to do with -Ounchecked. If you can't trust that protocol requirements are met, you can’t actually use Swift to do anything in the first place. Preconditions wouldn’t even stop that scenario: the “precondition” was valid conformance to Collection, and the compiler actually checked that one for you.

In fact, because -Ounchecked is so widely ignored, a lot of Swift programmers seem to ignore preconditions entirely in favor of assertions and fatal errors. Neither is different between -O and -Ounchecked, rendering the entire mode completely pointless even as people fill libraries with features that are not actually a defined part of the language.

Swift is deliberately designed to improve on C/C++ by introducing those checks and hence eliminating lots of instances of undefined behaviour. If you don't want the checks, use one of those other languages.

And I don't mean that to dismiss what you say; I mean that those checks are an intrinsic part of Swift's design. If you disable them, it's not clear that you're even writing "Swift" any more.

The compiler only checks that functions exist with their expected signatures; it cannot check that they meet the semantic conditions of the protocol. And yes, it does relate to -Ounchecked, because it means your internal reasoning may be violated at any point due to code you (as a library author) cannot see.

Even in those cases, it is unacceptable to violate memory safety. Yes, even in the face of bugs, and yes, it's hard. That's why generic code (and again, all code, actually) should not be shipping with -Ounchecked.

But anyway, I think I've made my point. I'll leave some room for others to also share their opinions 🙂

3 Likes

It’s -O, not -Ounchecked, that disables checks on internal reasoning. Are you arguing that everyone should only use -Onone?

Pointers by and large don’t have any checks. We assume their use does not violate memory safety, because we assume that all such usage is correct.
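
For example (a minimal sketch; the buffer-pointer APIs are standard, the data is illustrative):

```swift
let numbers = [10, 20, 30]
numbers.withUnsafeBufferPointer { buffer in
    let base = buffer.baseAddress!
    print(base.pointee)         // 10
    print((base + 2).pointee)   // 30
    // (base + 3).pointee would read past the end. No check exists in any
    // compilation mode; correctness rests entirely on the caller.
}
```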

FWIW, I have a hard time digesting this pitch. It is written as if the current behaviour is obviously incorrect, so it's pretty hard to differentiate arguments from opinions.

6 Likes

Could you elaborate on what needs to be made more clear?

A few things I can point my finger to, in no particular order:

  • I don't think it's ever made explicit that Swift prioritizes UB safety* (over what?). A few of the existing UBs are also detectable at the cost of performance, so it doesn't seem like Swift's singular goal.

  • This makes all arguments (if any) unilateral. There's no tradeoff to consider if there's no downside to it. Surely you don't mean that, and you probably considered Hyrum's law or performance impact at the very least.

    • On the same note, it's not clear whether the performance impact is low enough that we can enforce this even on -Ounchecked. This is something skeptics would most likely look for.
  • We can remove many hyperbolic/opinionated words: incorrectly, demonstrably, glaring, I doubt, obvious, etc.

Overall, this reads to me like a request for a bug fix, but on things that could easily get an “it works as intended” reply. It's not even clear to me from the reading that this is causing people problems (and if not, why are we talking about it).

* On an unrelated note, the term safety in Swift has always been about memory safety, so it's also quite hard to search for that.

6 Likes

That’s a fair criticism. I’ve trimmed that language from the pitch.

Could you give an example?

That’s what unsafe-prefixed alternatives are for. If checking preconditions is a significant performance burden in practice, you can add an unsafe version that uses assertions instead. That effectively adopts -Ounchecked behavior even in -O mode.

Enforce what? Skipping overflow checks? I’m proposing a separate flag for them if there’s demand, which would also allow them to be disabled while retaining precondition checks if desired for some reason.

It’s not a bug fix, since the current behavior matches what is documented. It’s a request for a change.

Safety usually relates to memory safety, since that’s the most common source of undefined behavior, but it specifically refers to any invalid state.[1] When the Swift Standard Library explicitly says that something is a “serious programming error”, that’s a rather clear indication it’s considered undefined behavior regardless of compilation mode.

Swift mandates that all preconditions are checked unless the unsafe prefix is used, and that makes debugging much easier and removes guesswork about where invalid state may leak in. However, since code is supposed to avoid undefined behavior even in -Ounchecked mode, failing a precondition still isn’t considered “correct”. That’s why the compiler refuses to acknowledge preconditionFailure as exiting scope, regardless of compilation mode.


  1. This definition can be found in a WWDC 2020 talk, among other places. ↩︎

This pitch states as fact a great many perspectives that are, well, not recognizably true. For instance:

“If code is written correctly,” a great many things about Swift’s design choices are irrelevant. Thing is, to a first approximation, code is written incorrectly.

As @Karl points out, users—or if we want to be precise, users other than those who are infallible beings—actually shouldn’t consider shipping their product compiled with -Ounchecked “to maximize runtime efficiency.”

If I recall, it has actually been suggested by some working on the compiler that -Ounchecked should go away altogether. However, the use case that @Karl describes (as a development tool) is quite persuasive and it would be quite a loss not to have that.

13 Likes

I don’t see why integer overflow checks shouldn’t be considered a type of precondition check. A precondition check tests whether the program is in a valid state and terminates the program if it isn't (unless you're using -Ounchecked). I don't see how integer overflow checks are any different.

The main use of -Ounchecked AFAIK is to determine the performance cost of precondition checks in code. If we started preserving integer overflow checks in -Ounchecked mode, then we wouldn't have a way to determine the performance cost of those checks.

If you want to leave in checks, I would avoid using -Ounchecked — "unchecked" is literally in its name.


I think it's important to note that the documentation says it's only a serious programmer error in -Ounchecked mode. In -Onone builds and -O builds, failure to satisfy a precondition is not a serious error (it is a runtime error, but one with very well-defined behavior as opposed to undefined behavior).


There are plenty of algorithms where overflow can be ruled out. For example, plenty of Collection algorithms avoid overflows even when working with types like Array, which uses the standard +/- operators for its index(after:) and index(before:) methods.
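
A sketch of why such traversals are overflow-free by construction (nothing here is optimized or novel):

```swift
// For a well-formed collection this cannot overflow: Array.index(after:) is
// effectively i + 1, every valid i is less than endIndex, and count is at
// most Int.max, which also bounds the running total.
func countElements<C: Collection>(of c: C) -> Int {
    var n = 0
    var i = c.startIndex
    while i != c.endIndex {
        n += 1
        i = c.index(after: i)
    }
    return n
}
```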

I think relying on the default crash-on-overflow behavior is undesirable for most programs, and that recovering from the error or working around it is the best thing to do in most cases. If a programmer expects the possibility of overflow at a certain point, it's likely that they would try to detect it themselves and give a detailed error to the user instead of relying on the default Swift behavior.
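
Something along these lines (a sketch; multipliedReportingOverflow(by:) is standard API, the error type is hypothetical):

```swift
enum ScalingError: Error {
    case resultTooLarge
}

// Detect the overflow explicitly and report a meaningful error to the caller,
// rather than relying on the default trap.
func scaled(_ amount: Int, by factor: Int) throws -> Int {
    let (value, overflow) = amount.multipliedReportingOverflow(by: factor)
    guard !overflow else { throw ScalingError.resultTooLarge }
    return value
}
```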


What does this mean? Can you elaborate on this?


IMO this is a documentation issue, not an issue with -Ounchecked. A note describing this behavior should be added to the documentation for each of the integer operators that are affected.

This is why -O is known as “release mode”.

Let me give an example of when I feel -Ounchecked might be useful in production: let’s say you wanted to write extremely performance-sensitive code. Maybe for an embedded system, maybe for an operating system kernel, something like that. Everything is tested thoroughly, even audited to ensure correctness. At that point, without any changes to what is otherwise normal Swift code, you could simply remove the checks that aren’t needed at runtime.

Right now, a lot of people seem to do this by instead avoiding precondition checks in nominally safe declarations, even in -O mode. I feel this is considerably more harmful.

This is why I view integer overflow as a fatal error, not a precondition: think of it like an implicit stub for “what to do if an overflow occurs”. As with any stub you haven’t implemented, you use fatalError.
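
Spelled out, that mental model looks something like this sketch (not actual standard library code):

```swift
// "Overflow handling" as an unimplemented stub: the only sensible body for
// the stub is fatalError, which is the behaviour this pitch argues the
// built-in check should mirror in every compilation mode.
func checkedAdd(_ a: Int, _ b: Int) -> Int {
    let (sum, overflow) = a.addingReportingOverflow(b)
    guard !overflow else {
        fatalError("arithmetic overflow adding \(a) and \(b)")
    }
    return sum
}
```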

I consider it almost impossible to rule out overflow when incrementing an Int, as the native bit width of the architecture is not set when code is being written. It could resolve to Int8 for all you know.

If you prefer to avoid overflow checks in a function, you should simply document and check a precondition for acceptable input, then use wrapping operators.
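
For instance (a sketch with a hypothetical function; &+ is the standard wrapping addition operator):

```swift
/// Averages two small non-negative values.
/// - Precondition: `a` and `b` are both in `0...0xFF`.
func average(_ a: Int, _ b: Int) -> Int {
    precondition((0...0xFF).contains(a) && (0...0xFF).contains(b),
                 "inputs must be in 0...0xFF")
    // Within the documented range the sum is at most 0x1FE, so the wrapping
    // addition cannot actually wrap, and no overflow check is emitted here.
    return (a &+ b) / 2
}
```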

preconditionFailure(_:file:line:) does not return Never, even though it could be conditionally compiled to do so. This is because code is expected to work even in -Ounchecked. If you actually expect to fail a precondition, it isn't a precondition: you should be using fatalError.

On what platforms is Swift's Int equivalent to Int8 / Int16?

None. Today. That's Jeremy's point. Swift for Arduino might need that. (I don't know the state of embedded platforms, but the idea is the same.)

2 Likes

I see. I confess to YAGNI, so that's invisible to me as of today. The quoted statement was written as if we had this already. As for what happens tomorrow, or a year from now... to me that's similar to the phrase: "It could result in the world's end, for all you know".

This claim doesn’t make sense to me.

If you’ve audited your own code and tested thoroughly to ensure that precondition checks aren’t needed, why wouldn’t you then remove those specific checks in all compiler modes? Sounds to me like “a lot of people” are doing exactly the right thing.

Certainly, you shouldn’t strip preconditions out of inlined standard library or third-party code that you didn’t audit by using -Ounchecked in production.

2 Likes

Because from that point on working with that codebase would become a nightmare. You are still going to want debug mode and release mode in the future.

This is similar to asking why people compile in release mode instead of removing all assertions.

There are two possibilities here: you either have the source code or you aren’t compiling it (with -Ounchecked or otherwise). There is literally no overlap there.