10 years on, what would you change about Swift?

Wow, that's good news! Lots of posts on the web had convinced me otherwise.

Yes, I agree with you: this explicitness is what I don't like. In my opinion, it makes code less readable and breaks abstractions that Swift introduced and promoted earlier. Again, this is not an attempt to force anyone to think this way; it's my own subjective impression and the reason for my discouragement.

FWIW, I feel that this talk (particularly the "start with a protocol" clause) got heavily misrepresented in retellings to mean that, for every single feature or entity, there should be a protocol. This leads to very boilerplate-heavy codebases where the protocols are nothing more than header files, which is ironic because Swift and other languages of its cohort count the absence of header files among their selling points. The compiler will also just strip all of that indirection if it can determine that a protocol only has a single conforming type.

In practice, if you develop a binary (as opposed to a library), especially the GUI layer, you hardly ever have to reach for protocols, because you rarely have a case for a truly generic function or type over an unknown number of potential conforming types.
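To make the "protocols as header files" point concrete, here's a minimal sketch (the service names are hypothetical, purely for illustration): a protocol with exactly one conforming type adds ceremony without enabling any real abstraction, and whole-module optimization can often devirtualize such calls anyway.

```swift
// The "header file" anti-pattern: a protocol with exactly one
// conforming type, mirroring that type's interface one-to-one.
protocol UserServiceProtocol {
    func fetchUser(id: Int) -> String
}

struct UserService: UserServiceProtocol {
    func fetchUser(id: Int) -> String { "user-\(id)" }
}

// Call sites pay (theoretical) existential indirection; with
// whole-module optimization the compiler can devirtualize this
// when it proves only one type conforms.
func greet(_ service: any UserServiceProtocol) -> String {
    "Hello, \(service.fetchUser(id: 1))"
}

print(greet(UserService()))  // Hello, user-1
```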


Interesting, I am using a development toolchain and it does error if you omit some or any. Is this planned to be changed back in that case?

There will sadly always be people who don't understand what they're doing and add unnecessary abstraction thinking their code is somehow better for blindly implementing some pattern with a fancy acronym :confused:


That's [my previous statement] not true in the case where a protocol has an associated type; for those, it has always been required to specify any since Swift allowed such existentials at all. Or maybe you turned on the ExistentialAny upcoming feature (but I think you'd know that :slight_smile:).
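A minimal sketch of the situation described above (type names are made up): using a protocol with an associated type as a value's type takes the existential spelling, written with `any`.

```swift
protocol Container {
    associatedtype Element
    var count: Int { get }
}

struct Box: Container {
    typealias Element = Int
    var count: Int { 1 }
}

// The existential spelling; with the ExistentialAny upcoming
// feature enabled, omitting `any` here is rejected.
let c: any Container = Box()
print(c.count)  // 1
```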

Oh sorry, I think I was misremembering a discussion from a few years ago [Discussion] Eliding `some` in Swift 6 and for some reason thought that actually happened :slight_smile:

I would be interested in discussing it, but I don't think this thread is the right place. If you create a separate topic in the Using Swift space, I'm +1 on jumping into the discussion.


Totally agree about another place. Thank you again for pointing out that Swift 6 is not going to require explicit any; that's the best option for my case. It hasn't changed my opinion about new versions of Swift, though :sweat_smile: but anyway, it's useful information :+1:

Sorry for flooding here.

I'd like to hear your (hopefully unbiased!) thoughts on this idea. The pro: much greater flexibility in making ABI-compatible changes. The cons are easy to identify too (e.g. 10x distribution size, the need for some "high-level bytecode" or for a "trim" step, increased compilation and/or distribution time, etc.): basically, more pressure on local tooling and remote distribution infrastructure, and less pressure on the language itself. I can't judge which one is better (for some early version of Swift or another language), and maybe there's a parallel universe somewhere where this version was chosen and I am writing a post suggesting an alternative: throwing it away and introducing "unknown default" and "frozen" instead.

By "one runtime only on the target device", do you mean that ABI is tied to OS version, and any time the ABI changes, you rev it with the OS? And if so, and the ABI changes, does that mean that customers need to go and redownload every single app they own (after the developer has gotten a chance to recompile it for the new ABI and upload to the App Store)?

Alternatively: if the OS is expected to support multiple ABI versions, does that mean that it ships with 10x versions of every single system framework that uses Swift, so customers can download apps with different ABIs?


I'd say there is a conflict of visions (or values) at work here, and I have no idea for a better way to resolve it. Maybe someday there will be a layering of features, so that at the bottom there is simplicity, and at the top a huge menu of choices to nicely fit every situation. Both @Slava_Pestov and @Douglas_Gregor have done this by rewriting major portions of the implementation, and I'm sure others have, too. So sometimes things do get simpler at some levels. Sigh, every improvement takes time and money, exacting an opportunity cost.
Crazy idea: what would happen if the users of Swift funded the development of Swift (at least partly)? But Swift needs to serve Apple's needs. And academic languages must serve the needs of granting agencies. Beats me, Batman!

For example: language changes are coordinated with OS version changes, the OS consults with the App Store and updates the apps during the update procedure, and bytecode is used on the App Store to produce the new binary (which is cached so this is done just once).

Ohh I want Closure labels back too!


The ideal happy path here sounds nice, but I think that in practice, this would be a monumental cost to levy on customers.

  1. Updating apps during an OS update means updates (ideally) require an internet connection
    • OS updates can already be quite lengthy at times, and on phones, leave you potentially unreachable in case of emergency. Lengthening the OS update to redownload all available apps doesn't improve the user experience here
    • If you're offline during the update, the update still has to go through, leaving you to redownload the apps later, which is also a frustrating experience
  2. Most proposed ABI changes are unlikely to benefit from an intermediate bitcode/bytecode representation, and many changes will require developers to recompile (yet another version of) their app
    • Bitcode, when it was supported, was somewhat more flexible than compiled binary, but it wasn't magic: it embedded some metadata into mostly-compiled code that allowed for some amount of optimization/adjustment for processors, but it was still architecture-specific and too low-level to allow "recompilation" in a general sense. IIRC, it was used to smooth out the transition for Apple Watch from arm32 to arm64_32, but that was still a relatively constrained migration
    • Bytecode also doesn't help with transitive dependencies — e.g., you have a 3rd-party SDK that ships as a pre-compiled dynamic framework which can't be recompiled on the fly

This means that if you're offline during the OS update, or attempt to update before a developer has gotten a chance to update their app, you could be stuck with an effectively-bricked device missing apps you crucially need. (Whether for work, medical devices, important social connections, etc.)

As a customer, would you take that risk for absolutely no discernible benefit on your end? As a developer, would the promise of new ABI versions outweigh having to support many more OS versions back (each with its own incompatible ABI, presumably without the "shiny new bits")?

I personally don't think I'd make this tradeoff.


Thanks Itai!

Let me run a modified version by you, it addresses most of the raised issues:

  • The bytecode chosen is more "high level" than what we have/had before: high-level enough to survive the ABI changes we want to be flexible about.

  • The bytecode "compiler" is shipped with the OS. Presumably it is much lighter than Xcode (it's more akin to Java's JIT compiler).

  • Every executable on the device (including libraries) is shipped with the corresponding bytecode.

  • After a major OS update, the compiled binary is no longer compatible with the new language version. Thus, the first time you try running the app, the old binary is deleted and a new binary is recompiled from the bytecode. This happens on the device and doesn't require an internet connection; the first run after an update will be slower.

I think this would make the user experience significantly more palatable, though there are still some considerations:

  • This hinges on coming up with some sufficiently-high-level representation that gives more ABI flexibility, the specifics of which would determine how much flexibility is possible. On a spectrum from source code → SIL → LLVM IR → binary, you'd probably want to land somewhere between source code and SIL, but how close to either has some pros and cons:
    1. SIL has the benefit of being a target-independent representation that exists today, though may still be too low-level to represent some of the ABI changes I've seen pitched over the years, which may transform how code is generated quite significantly. Java and the JVM, for example, have the benefit of "merely" needing to adapt code to specific architectures, but also don't typically deal with utter source- and ABI-breaking changes (examples of pitches of which are easily found here on the forums)
    2. Source code has the benefit of being maximally-flexible and accommodating of significant ABI changes, but many folks would probably be pretty uncomfortable shipping their raw source code around
    3. Something intermediate would require designing and coming up with something brand new, and inserting it into the compiler pipeline
  • There are significant resource implications, from a power-consumption and environmental perspective, in having every single device out there recompile the same apps over and over again: right now, my company's app is compiled once and downloaded by hundreds of thousands of devices; I can't imagine the resource costs of instead having each and every one of those devices compile the app locally
  • This still doesn't resolve the issue of precompiled 3rd-party frameworks, which are a real-world concern. Having to deal with these frameworks when you have no other choice can already be a massive pain, but having to ship one-per-ABI because they can't be recompiled locally sounds like an incoming migraine. (Said as someone whose team has to suffer poorly-written, precompiled 3rd-party frameworks with no alternative)
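As a reference point for the source → SIL → LLVM IR → binary spectrum above, the intermediate stages can be dumped with the compiler driver. A sketch (the file name is hypothetical; this assumes a local Swift toolchain):

```shell
# Dump each stage of the compilation pipeline for a single file.
swiftc -emit-sil main.swift -o main.sil      # Swift Intermediate Language
swiftc -emit-ir main.swift -o main.ll        # LLVM IR
swiftc -emit-object main.swift -o main.o     # architecture-specific object code
```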

In all, I think this is an interesting exercise, but at the end of the day, ABI stability wasn't chosen arbitrarily. Yes, it absolutely does at times feel frustratingly self-limiting, but the alternatives are typically far worse. There's a reason that the vast majority of languages either commit to ABI stability, or ignore it altogether (with no real in-between).


I always leave these threads open a little longer than I probably should, but I think this one has run its course.