Static let vs static computed property, optimization differences

As an example context, assume we are writing performance-critical code that is too slow to even test in debug mode, i.e. we must always compile with -O, even while testing and debugging. Furthermore, we want control over which checks are performed, since the checks themselves affect performance. We could achieve this with conditional compilation flags and preconditions (here with a hypothetical custom flag, set via -D SOME_DEBUG_FLAG):

    #if SOME_DEBUG_FLAG
    precondition(0 < x && x < w)
    #endif


But it would be more convenient if we could set these flags directly in the relevant source code rather than as compiler flags, like this:

struct SomeThing {
    static let someDebugFlag = true
    static let someOtherDebugFlag = false

    func foo() {
        iff(Self.someDebugFlag, precondition: 0 < x && x < w)
        // … (x and w defined elsewhere)
    }
}

The idea is that the entire iff(_:precondition:) call will be optimized away when Self.someDebugFlag is false, since its value is knowable at compile time.
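Incidentally, even before the optimizer gets involved, @autoclosure combined with the short-circuiting of && guarantees that the precondition expression is never evaluated when the condition is false. A minimal sketch (the names here, like expensiveCheck, are made up for illustration):

```swift
var evaluations = 0

func expensiveCheck() -> Bool {
    evaluations += 1  // count how often the check actually runs
    return true
}

func iff(_ condition: @autoclosure () -> Bool,
         precondition check: @autoclosure () -> Bool) {
    // `==` binds tighter than `&&`, so this is condition() && (check() == false)
    if condition() && check() == false {
        print("conditional precondition failure")
    }
}

iff(false, precondition: expensiveCheck())
print(evaluations)  // 0: the condition short-circuited, the check never ran
iff(true, precondition: expensiveCheck())
print(evaluations)  // 1: the check ran exactly once
```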

Now, this turns out to work, but only if we use a static computed property instead of a static let constant for someDebugFlag, which is strange and unintuitive (it would be less surprising if it were the other way around). Here is a demonstration program:

import func QuartzCore.CACurrentMediaTime
import func Darwin.exit

func iff(_ condition: @autoclosure () -> Bool,
         precondition: @autoclosure () -> Bool,
         file: StaticString = #file,
         line: UInt = #line)
{
    // Note: `==` binds tighter than `&&`, so this reads as
    // condition() && (precondition() == false).
    if condition() && precondition() == false {
        print("CONDITIONAL PRECONDITION FAILURE\n\(file):\(line)")
        exit(1)
    }
}

struct SomeThing {
    // We will change the following line and observe the results. See discussion below.
    static let someDebugFlag = true

    func foo() {
        var v = 1
        for _ in 0 ..< 7 {
            let t0 = CACurrentMediaTime()
            for _ in 0 ..< 10_000_000 {
                iff(Self.someDebugFlag, precondition: v/3 != v)
                v &+= 1234567
            }
            let t1 = CACurrentMediaTime()
            print(t1 - t0, "seconds (checksum:\(v))")
        }
    }
}

func test() {
    let x = SomeThing()
    x.foo()
}
test()

Compiling (with -O) and running this program as is will look like this on my machine:

$ swiftc --version
Apple Swift version 5.1.3 (swiftlang-1100.0.282.1 clang-1100.0.33.15)
Target: x86_64-apple-darwin19.2.0
$ swiftc -O test.swift && ./test
0.009812853997573256 seconds (checksum:12345670000001)
0.00792994606308639 seconds (checksum:24691340000001)
0.00792142900172621 seconds (checksum:37037010000001)
0.007876771036535501 seconds (checksum:49382680000001)
0.00791895401198417 seconds (checksum:61728350000001)
0.00868810701649636 seconds (checksum:74074020000001)
0.008571372018195689 seconds (checksum:86419690000001)

And setting the flag to false:

    static let someDebugFlag = false

we get this:

$ swiftc -O test.swift && ./test
0.00026031804736703634 seconds (checksum:12345670000001)
0.00015434296801686287 seconds (checksum:24691340000001)
0.00015423703007400036 seconds (checksum:37037010000001)
0.00015423796139657497 seconds (checksum:49382680000001)
0.00015423900913447142 seconds (checksum:61728350000001)
0.0001542380778118968 seconds (checksum:74074020000001)
0.0001542380778118968 seconds (checksum:86419690000001)

Now, if we change the static let constant into a static computed property:

    static var someDebugFlag: Bool { return true }

We get this:

$ swiftc -O test.swift && ./test
0.009706033044494689 seconds (checksum:12345670000001)
0.009436139022000134 seconds (checksum:24691340000001)
0.009441407979466021 seconds (checksum:37037010000001)
0.010079252067953348 seconds (checksum:49382680000001)
0.010098619968630373 seconds (checksum:61728350000001)
0.009777670027688146 seconds (checksum:74074020000001)
0.00981698592659086 seconds (checksum:86419690000001)

And if we set the flag to false:

    static var someDebugFlag: Bool { return false }

we get this:

$ swiftc -O test.swift && ./test
9.892042726278305e-06 seconds (checksum:12345670000001)
1.330627128481865e-07 seconds (checksum:24691340000001)
3.597233444452286e-08 seconds (checksum:37037010000001)
3.1082890927791595e-08 seconds (checksum:49382680000001)
3.096647560596466e-08 seconds (checksum:61728350000001)
3.096647560596466e-08 seconds (checksum:74074020000001)
3.1082890927791595e-08 seconds (checksum:86419690000001)

That is much quicker, because the entire iff call is optimized away, which in turn allows the whole inner loop to be optimized away, reducing it to essentially a single multiplication and addition per outer iteration.
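The checksum is a handy sanity check here: the inner loop only performs wrapping additions of a constant, so it is equivalent to a single wrapping multiply-add, which is exactly the closed form the optimizer can reduce it to:

```swift
// The first outer iteration starts from v = 1 and performs
// `v &+= 1234567` ten million times, which collapses to:
var v = 1
v &+= 10_000_000 &* 1_234_567
print(v)  // 12345670000001 — matches the first checksum printed above
```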

To summarize:

| Case | Declaration                                     | Approx. time |
|------|-------------------------------------------------|--------------|
| A    | static let someDebugFlag = true                 | 0.00800000 s |
| B    | static let someDebugFlag = false                | 0.00015000 s |
| C    | static var someDebugFlag: Bool { return true }  | 0.00900000 s |
| D    | static var someDebugFlag: Bool { return false } | 0.00000003 s |

Can anyone explain why B doesn't result in the entire iff call being optimized away, as is the case for D (using a static computed property instead)?

Should we expect any significant difference at all between A and C, and between B and D?

(Note the slight (but consistent) difference in time between A and C.)


Interesting. Have you tried using a development and/or trunk snapshot to see if you get similar results?

Nope, only Swift version 5.1.3, and I'm currently on a slow connection, so I won't be updating or downloading any snapshots anytime soon.

Okay. For B, I get this on master (with -O):

9.977957233786583e-06 seconds (checksum:12345670000001)
7.008202373981476e-08 seconds (checksum:24691340000001)
2.9103830456733704e-08 seconds (checksum:37037010000001)
2.60770320892334e-08 seconds (checksum:49382680000001)
2.7008354663848877e-08 seconds (checksum:61728350000001)
2.60770320892334e-08 seconds (checksum:74074020000001)
2.514570951461792e-08 seconds (checksum:86419690000001)

Ah, looks like it has been fixed on master then!