A case for postponing ABI stability

It would mean that Apple (and others who distribute compiled frameworks) would have to maintain several code bases for the same framework, since they would need to preserve backward compatibility and hence couldn't use new language features, etc. IMHO it's not so much the technical constraint of having multiple binaries within the framework bundle as it is maintaining the code in a way that compiles under every Swift version you'd like to support.

I haven't spent much time thinking about the implications, but wouldn't it be feasible to freeze old frameworks completely when a new Swift version is added?
After all, bundled code isn't updated at all, so I see no downside here — actually, it only adds flexibility that could be utilised when critical bugs are discovered.

I wouldn't expect that I can mix language and framework versions freely.

Rather than Apple having to commit in perpetuity to shipping all relevant versions of the frameworks, one could imagine more of an app-thinning/install-time optimization: “thinned” versions of apps would be built and signed without the shared system frameworks, but with dependency information recorded. At install time, the app would be installed, and the working set of required shared frameworks on the device would be updated with any needed dependent frameworks. Thus only the set of frameworks required by installed apps would be present on the device.

This would require more App Store, install-time, and perhaps dynamic-linking infrastructure, but it would seem to solve the problem in a way that wouldn’t require ongoing development resources to be applied to old versions.
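The dependency bookkeeping behind this idea can be sketched in a few lines. Everything here is hypothetical and purely illustrative — `AppManifest` and the framework names are made up — the point is just the install-time set arithmetic:

```swift
// Hypothetical model of the install-time idea above: each thinned app
// ships a manifest of the shared frameworks it needs, and the installer
// keeps the device's framework set equal to the union of those manifests.

struct AppManifest {
    let bundleID: String
    let requiredFrameworks: Set<String>   // e.g. "Foundation-swift3.0"
}

/// Frameworks that must be present on device for the given installed apps.
func workingSet(for installed: [AppManifest]) -> Set<String> {
    installed.reduce(into: Set<String>()) { $0.formUnion($1.requiredFrameworks) }
}

/// Frameworks the installer would have to fetch before installing `app`.
func missingFrameworks(for app: AppManifest, onDevice: Set<String>) -> Set<String> {
    app.requiredFrameworks.subtracting(onDevice)
}
```

Uninstalling an app would be the mirror image: recompute the working set without its manifest and garbage-collect anything no longer referenced.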

I'm thinking in the same direction… but although this isn't trivial, it still sounds too simple to be the solution to the dreaded problem of an unstable ABI, so I can't shake the feeling I'm missing something important ;-):
What is the main motivation for ABI stability?
Sure, it would reduce app size, and this could be achieved with this approach as well — but nowadays many apps are so bloated that 10 MB makes no real difference.
Declaring a stable ABI is also a statement of maturity (I don't care much about that, but others might feel differently), and there might be other goals I'm not even aware of which cannot be reached with "tricks".

Maybe so - but IBM solved this very problem, along with release-to-release binary compatibility for C++ and a number of other languages, twenty years ago with the System Object Model (SOM).

I'm not arguing for its adoption per se - but good ideas are always worth stealing and there was some solid engineering in there.

···

Sent from the road

On Jan 27, 2017, at 09:19, Tino Heth via swift-evolution <swift-evolution@swift.org> wrote:

I wouldn't expect that I can mix language and framework versions freely.

You can't simply freeze an old version of a framework. Many app frameworks have interfaces with other OS components. Freezing the app side of the interface also constrains the other side, effectively introducing another piece of ABI that must be kept stable.

For example, CoreGraphics needs to talk to the window server. If you try to freeze some version of CoreGraphics then you require the window server to implement that particular interface forever. Now the window server's private interface has become an ABI that must preserve binary compatibility. The same is true for the spellcheck server, the pasteboard server, the address book database server, and a great many others.

···

On Jan 27, 2017, at 9:19 AM, Tino Heth <2th@gmx.de> wrote:

It would mean for Apple (and others who'd distribute compiled frameworks) to maintain several code bases of the same framework given that they would need to maintain backward compatibility and hence wouldn't be able to use new language features, etc. It's IMHO not that much about the technical constraint of having multiple binaries within the framework bundle as much as maintaining the code in a way that would compile under all Swift versions you'd like to support.

I haven't spend much time thinking about the implications, but wouldn't it be feasible to freeze old frameworks completely when a new Swift version is added?
After all, bundled code isn't updated at all, so I see no downside here — actually, it only adds flexibility that could be utilised when critical bugs are discovered.

--
Greg Parker <gparker@apple.com>, Runtime Wrangler

Greg’s words from upthread are instructive. The interface between app bundles and the rest of the system is currently limited to the C/Obj-C ABI. While the idea above could potentially allow you to factor common libraries out of the apps we have today, those libraries, and the apps themselves, would still be limited to the C/Obj-C ABI to speak to the rest of the system. To have any prayer of letting apps call into more modern, non-Obj-C Swift frameworks in the rest of the system, you need a stable ABI for Swift. So the idea might work for the apps of today, but it doesn’t do much to enable the apps (and system frameworks) of tomorrow.

···

On Jan 27, 2017, at 11:27 AM, Tino Heth <2th@gmx.de> wrote:

Rather than Apple having to commit in perpetuity to shipping all relevant versions of the frameworks, one could imagine more of an app-thinning/install-time optimization: “thinned” versions of apps would be built and signed without the shared system frameworks, but with dependency information recorded. At install time, the app would be installed, and the working set of required shared frameworks on the device would be updated with any needed dependent frameworks. Thus only the set of frameworks required by installed apps would be present on the device.

This would require more App Store, install-time, and perhaps dynamic-linking infrastructure, but it would seem to solve the problem in a way that wouldn’t require ongoing development resources to be applied to old versions.

I'm thinking in the same direction… but although this isn't trivial, it still sounds too simple to be the solution to the dreaded problem of an unstable ABI, so I can't shake the feeling I'm missing something important ;-):
What is the main motivation for ABI stability?
Sure, it would reduce app size, and this could be achieved with this approach as well — but nowadays many apps are so bloated that 10 MB makes no real difference.
Declaring a stable ABI is also a statement of maturity (I don't care much about that, but others might feel differently), and there might be other goals I'm not even aware of which cannot be reached with "tricks".

On Jan 27, 2017, at 2:21 PM, Greg Parker via swift-evolution <swift-evolution@swift.org> wrote:

You can't simply freeze an old version of a framework. Many app frameworks have interfaces with other OS components. Freezing the app side of the interface also constrains the other side, effectively introducing another piece of ABI that must be kept stable.

For example, CoreGraphics needs to talk to the window server. If you try to freeze some version of CoreGraphics then you require the window server to implement that particular interface forever. Now the window server's private interface has become an ABI that must preserve binary compatibility. The same is true for the spellcheck server, the pasteboard server, the address book database server, and a great many others.

These are the benefits of ABI stability that I can see [**]:

1. Not shipping the stdlib with your app reduces its download size.
2. 3rd party developers could someday ship binary packages/frameworks.
3. Apple could choose to ship OS frameworks with “Swift only” enhancements (e.g. a method that takes an "Int?” argument).

That said, I think that these wins are often overstated and poorly understood.

On #1, as you say, there are lots of ways to reduce the impact of the Swift stdlib size in general, and a lot of the work on ABI stability pushes towards this. As it is today, the overhead is only a few MB anyway (not the tens of MB that people assume from looking at the size with bitcode included) because of how “app thinning” works. Further, if some version of iOS “N” included the stdlib, your app would still need to include the stdlib if it needs to deploy backwards to iOS “N-1”. There are other things that could be done that would have a huge benefit, like merging all the overlay dylibs and the Swift stdlib into one large dylib. This would improve app launch time independently of ABI stability.

On #2, as Michael’s document indicates, this requires stabilizing the ABI *and* the module file format. The latter clearly isn’t in scope for Swift 4, so it isn’t really a motivator here. Also, adding a binary-only dependency to your app is always a risky thing.

On #3, it isn’t a truly great solution, but the existing overlay mechanism has been proven as a way to ship substantial Swift-only enhancements to the OS. In Swift 3, Foundation included major enhancements, including entirely new types like Data. It has the additional advantage of allowing backward deployment of this functionality.
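As a sketch of the kind of “Swift only” enhancement meant in #3 — a method taking or returning `Int?` layered over a sentinel-based interface — consider the following. This is not an actual OS API; `LegacyList` is a made-up stand-in for a sentinel-returning Obj-C class, and the extension plays the role of an overlay:

```swift
// A stand-in for an Obj-C class whose lookup method signals "not found"
// with a -1 sentinel (in the style of NSNotFound).
final class LegacyList {
    private let items: [String]
    init(_ items: [String]) { self.items = items }

    // Obj-C-style interface: sentinel on failure.
    func indexOfItem(_ item: String) -> Int {
        items.firstIndex(of: item) ?? -1
    }
}

// The overlay re-exposes the same operation in idiomatic Swift,
// turning the sentinel into an optional.
extension LegacyList {
    func index(of item: String) -> Int? {
        let i = indexOfItem(item)
        return i == -1 ? nil : i
    }
}
```

Because the extension is plain Swift shipped alongside the framework, it can also deploy backwards, which is exactly the advantage noted above.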

In short, in the Swift 4 / iOS "N" timeframe, the only benefit of ABI stability would be that apps don’t need to include the standard library if they deploy to iOS "N or later”. Independent of whether ABI stability is achievable, I think it is important to do things like merge the dylibs together, since backward deployment will continue to be important for many apps.

-Chris

[**] Just a note, but I don’t think that any of these apply to Linux or other non-Apple platforms supported by Swift.

···

On Jan 27, 2017, at 11:27 AM, Tino Heth via swift-evolution <swift-evolution@swift.org> wrote:

Rather than Apple having to commit in perpetuity to shipping all relevant versions of the frameworks, one could imagine more of an app-thinning/install-time optimization: “thinned” versions of apps would be built and signed without the shared system frameworks, but with dependency information recorded. At install time, the app would be installed, and the working set of required shared frameworks on the device would be updated with any needed dependent frameworks. Thus only the set of frameworks required by installed apps would be present on the device.

This would require more App Store, install-time, and perhaps dynamic-linking infrastructure, but it would seem to solve the problem in a way that wouldn’t require ongoing development resources to be applied to old versions.

I'm thinking in the same direction… but although this isn't trivial, it still sounds too simple to be the solution to the dreaded problem of an unstable ABI, so I can't shake the feeling I'm missing something important ;-):
What is the main motivation for ABI stability?
Sure, it would reduce app size, and this could be achieved with this approach as well — but nowadays many apps are so bloated that 10 MB makes no real difference.

Maybe so - but IBM solved this very problem, along with release-to-release binary compatibility for C++ and a number of other languages, twenty years ago with the System Object Model (SOM).

As one of the developers of the ill-fated WWDC OpenDoc demos that failed spectacularly because of bugs in early versions of SOM, I just want to caution everyone not to underestimate the complexity of these solutions and the level of effort required to bring them up to production quality.

That being said, it is probably too constraining to assume that one ABI should last more than a few years for a language like Swift that is destined for “world domination” and will eventually be used for platforms spanning from watches (and smaller) to data centers and solutions from scripting to real-time systems. This is a great time for language design — the most prolific period of language development I have ever seen. There is no reason to expect the pace of language innovation to slow down and new ideas will probably force Swift to stand still or break ABI compatibility if a system is not developed to support ABI migration.

···

On Jan 27, 2017, at 3:08 PM, Freak Show via swift-evolution <swift-evolution@swift.org> wrote:

I'm not arguing for its adoption per se - but good ideas are always worth stealing and there was some solid engineering in there.

Sent from the road

On Jan 27, 2017, at 09:19, Tino Heth via swift-evolution <swift-evolution@swift.org> wrote:

I wouldn't expect that I can mix language and framework versions freely.

_______________________________________________
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution

Maybe so - but IBM solved this very problem, along with release-to-release binary compatibility for C++ and a number of other languages, twenty years ago with the System Object Model (SOM).

Yeah, and Microsoft’s COM is a reasonable approach (and Apple ships a version of it used for plugin loading).

Unfortunately you end up with IFrob, IFrob2, IFrob3, IFrob4, IFrobEx, IFrobEx2. You also introduce a hard boundary that makes passing “native” types across it impossible: everything must fit inside the set of types described by COM. For Swift, any such scheme boils down to “use a lowest-common-denominator interface and give up all of Swift’s advanced type system features”.
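The versioning treadmill being described can be sketched in Swift protocol terms. The `IFrob` names follow the joke above and everything here is hypothetical; the point is that once an interface is frozen, every revision becomes a new interface, and callers must probe for the newest one a component supports (much as COM clients do with QueryInterface):

```swift
// Each frozen revision of the interface is a new protocol, forever.
protocol IFrob { func frob() -> String }
protocol IFrob2: IFrob { func frobFaster() -> String }
// ...and then IFrob3, IFrobEx, IFrobEx2, and so on.

// Clients downcast to find the best interface the component implements.
func bestFrob(_ component: Any) -> String {
    if let f2 = component as? IFrob2 { return f2.frobFaster() }
    if let f1 = component as? IFrob { return f1.frob() }
    return "unsupported"
}

struct ModernComponent: IFrob2 {
    func frob() -> String { "frob (v1 path)" }
    func frobFaster() -> String { "frob (v2 path)" }
}
```

Note that only `String` crosses the boundary here; a richer Swift type (a generic, an enum with payloads) would have to be flattened into whatever the frozen interface model can express, which is the lowest-common-denominator problem above.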

Believe me, there are parts of the Simulator stack where I would like to use Swift but without ABI stability it just isn’t possible. If there were a plausible alternative I’d happily take it. There isn’t.

Russ

···

On Jan 27, 2017, at 2:08 PM, Freak Show via swift-evolution <swift-evolution@swift.org> wrote:

I'm not arguing for its adoption per se - but good ideas are always worth stealing and there was some solid engineering in there.

Sent from the road

On Jan 27, 2017, at 09:19, Tino Heth via swift-evolution <swift-evolution@swift.org> wrote:

I wouldn't expect that I can mix language and framework versions freely.


I have to assume that was sarcasm.

The Release to Release Binary Compatibility with SOM (http://hobbes.nmsu.edu/h-viewer.php?dir=/pub/os2/doc&file=R2R_SOM.zip) paper includes the following footnote:

"We exclude Microsoft’s COM [14] because it is an interface model, not an object model, and its ABI forbids subclassing between library and application. If our analysis technique is applied to COM, one sees that it supports only Transformations 0 to 4, which places it in the category of procedural programming rather than object-oriented programming."

···

On Jan 28, 2017, at 23:18, Russ Bishop <xenadu@gmail.com> wrote:

Yeah and Microsoft’s COM is a reasonable approach