Is there a reason we can't install multiple toolchains on windows simultaneously?

I'm trying to find the needle in the haystack that is a toolchain version that will successfully compile our project at the moment, and the process is made a bit slower by the fact that I have to uninstall one toolchain before installing the next. It appears that the VSCode extension supports switching toolchains. Is there a technical reason this is not supported on Windows?

mike

Yes, the reason is that there is no ABI stability for Windows yet, and the Swift runtime is currently unversioned in the DLL name. The combination of these two things means that we can only have a single runtime available globally (packaging the runtime a second time would further increase the already massive toolchain), as any mismatch is not safe.

A second issue would exist beyond that - we would need to consider adding an xcode-select-like tool to switch between versions. Building and distributing such a tool with the swift.org distribution might be a bit of a challenge (I had started work on such a tool and am happy to revive that).

In theory I suppose that it might be possible to adjust the environment at the same time, especially now that the toolchain has reduced its dependency on environment variables (Path and SDKROOT are currently the critical ones). I wonder what the impact would be on the installer to ensure that repair still works properly.
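For what it's worth, the environment adjustment could be fairly small: prepend the chosen toolchain's bin directory to Path and point SDKROOT at the matching SDK. A rough sketch in Swift of what a hypothetical switcher would compute (the side-by-side layout and function name are assumptions, not an existing swift.org tool):

```swift
import Foundation

/// Hypothetical side-by-side layout: one directory per installed toolchain.
/// Compute the environment overrides a launcher would apply before
/// spawning swift.exe; nothing here is an existing tool.
func environmentOverrides(toolchainRoot: String, sdkRoot: String,
                          currentPath: String) -> [String: String] {
    let bin = toolchainRoot + #"\usr\bin"#
    return [
        // Prepending makes the selected toolchain win executable lookup.
        "Path": bin + ";" + currentPath,
        // The compiler consults SDKROOT to locate the SDK.
        "SDKROOT": sdkRoot,
    ]
}
```

A launcher (or xcode-select-style shim) would merge these into the child process environment and leave the installer untouched.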

The distribution is built with the idea of multiple parallel installations though, it’s more about the fine details to make it practical.


As symlinks have to be “activated” anyway when compiling on Windows, couldn’t they be used for easy switching? (This is a common method to switch between Java versions on Linux.)

Yes, the symlinks need to be set up as well, but that requires a good deal of C++ code to be added to the installer (further complicating it) and would limit what the installer can do (i.e. it would no longer be able to uninstall Swift fully). These should be managed external to the installer IMO. As such, symlinks wouldn't be the solution on their own.

Furthermore, enabling symlinks is recommended, but they are only required for some use cases (e.g. building/testing Swift itself; SPM-based builds should work without symlinks enabled, it just results in a warning).

Installing via WinGet, I actually had Swift 5.10 and Swift 6 installed at the same time. I had to uninstall the former, because swift --version would pick up 5.10 instead of 6.

WinGet does install in the user folder, though, and the runtime is not available globally. That means that when using a binary, the runtime has to be copied alongside it (2 dlls for Swift 5, 14 for 6). I'm hoping that eventually, static linking will remove that necessity.

Static linking also comes with size costs. In fact, the plan is to more aggressively adopt dynamic linking (e.g. make LLVM and clang DLLs that can be shared rather than statically link). This will also hopefully enable us to do more aggressive things like LTO and eventually PGO/LTO and even post-link optimizations.

Size wouldn't be an issue if static linking is combined with tree shaking. The average program uses a tiny fraction of the standard library.

Dynamic linking also has other disadvantages besides dependency hell:

  1. It makes a program needlessly scream "I was made in Swift". Not that there's a reason to hide it, but there's no reason to leak it either.
  2. Licensing is unclear (I actually wanted to make a separate thread about this). LLVM runtime exception to the Apache license applies very clearly when statically linked. But the case is not clear for dynamic linking: Do the extra DLLs need to be bundled with a copy of the Apache license?
  3. If the product is a DLL, rather than a program, and it's placed in a folder where another program scans for DLLs to load (Simple plugin system), the dependencies would come up in the scan, and not just the intended DLL.
  4. Figuring out which DLLs you need to bundle with your program is a tedious process in and of itself.
  5. Single-executable portable programs would be right out the window. An example for such a program in wide use would be Godot engine. One executable, no installation, and it Just Works™. This would be impossible if the language itself enforces bundled DLLs.

Static linking just seems like a better approach. With enough effort (Read: Using a custom C runtime, or creating a pure Swift runtime using Swift embedded and import libs) it's even possible to remove dependency on any non-kernel DLLs.

DCE is not possible/insufficient here. The current static linking does mandate DCE, and even with that, note that the latest update to LLVM increased the binary size of clangd by ~100% (~200 MiB). Resolving this requires changing the structure of the library to allow the trimming.

Additionally, in Swift, the value witness tables (VWTs) are preserved unconditionally, and these become roots for the symbol search.

I think that anyone interested in that will be able to determine it irrespective of the linking.

IANAL - you should consult a lawyer for advice on how to handle licensing.

A static library is not possible to use as a plugin in the first place, so this seems disingenuous. The dependencies of the binary are going to be needed - and static linking is actually a problem for that scenario: you can only have a single runtime instance in the address space, or you need to ensure that you do not use the language across the module boundary. Dynamic linking avoids this (and is partially why Python is dynamically linked in LLDB).

There's a difference between being visible to forensic tools, and screaming at anyone deploying/extracting the files from a zip.

As a maintainer, you're one of the rights holders, though. Perhaps not the person who decided on the use of Apache with Exception, but still a lot closer to that person than I am. Licensing is something the licensor needs to consider when distributing a product, not just the licensee.

The worst case scenario is to enter a legal limbo like what Rust has now, where every Hello World program is required to include a copy of the Expat license crediting the Rust foundation, but doing so could violate the Rust trademark.

I'm not sure what you're trying to say here.

Suppose I have program A with a plugin system. When program A starts up, it scans C:\Program Files\A\Plugins for files ending with .dll. Any such file it finds, it performs LoadLibrary on, followed by GetProcAddress(handle, "LoadPlugin"), and runs the returned function.

With static linking, you can create a myplugin.dll file, deploy it as C:\Program Files\A\Plugins\myplugin.dll, and it will Just Work™.

With dynamic linking, you have to deploy C:\Program Files\A\Plugins\myplugin.dll, as well as C:\Program Files\A\Plugins\FoundationEssentials.dll, C:\Program Files\A\Plugins\swift_Concurrency.dll, C:\Program Files\A\Plugins\swiftCore.dll, and a dozen other DLLs. When A starts up, it will load myplugin.dll successfully, but also attempt to load swiftCore.dll, swiftCRT.dll, swiftWinSDK.dll, etc, failing in the process (since none of them export LoadPlugin).
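The scan described above, sketched in Swift (LoadPlugin and the Plugins folder are the hypothetical names from this example, not a real API; the Windows-specific calls are guarded):

```swift
import Foundation
#if os(Windows)
import WinSDK
#endif

typealias LoadPluginFn = @convention(c) () -> Void

func loadPlugins(from directory: String) {
    let entries = (try? FileManager.default.contentsOfDirectory(atPath: directory)) ?? []
    for entry in entries where entry.lowercased().hasSuffix(".dll") {
        let path = directory + #"\"# + entry
        #if os(Windows)
        // LoadLibraryW succeeds for any valid DLL - including a runtime
        // DLL such as swiftCore.dll that happens to sit in the folder...
        guard let handle = path.withCString(encodedAs: UTF16.self, { LoadLibraryW($0) }) else {
            continue
        }
        // ...but only a real plugin exports the agreed-upon entry point.
        guard let symbol = GetProcAddress(handle, "LoadPlugin") else {
            FreeLibrary(handle)  // not a plugin; unload and move on
            continue
        }
        unsafeBitCast(symbol, to: LoadPluginFn.self)()
        #endif
    }
}
```

(A host written this way can skip the runtime DLLs gracefully, but they still have to be present for myplugin.dll to load at all.)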

Also, if you have myplugin1.dll and myplugin2.dll, in the dynamic linking case they need to share dependencies like swiftWinSDK.dll, hoping they both use a compatible version. With static linking, that's not an issue - They each have their own private copy of swiftWinSDK statically linked into them. This does not cause a collision - Different DLLs are allowed to have the same code, with the same functions, and even export functions with the same name (Although they shouldn't be exporting imported libraries anyway), without any collision.

So, if anything, statically linking reduces collisions in this case. It doesn't increase them.

That's an implementation detail, which can change in the future. It's technically possible to tree shake VWTs if either the associated value type or protocol are not in use in any function that was kept.

I do understand that anything more complex, like figuring out if a used value type is ever cast to a used protocol, could be more of a problem, since such a cast can be hidden by going through UnsafeRawPointer, and/or using unsafeBitCast. But even so, there are ways to trim even while staying optimistic on usage. It would still be smaller than tossing in the entire standard library as a DLL with not a single byte removed.

The size is still quite large - this was attempted in the past, and even with some tricks to do this more aggressively with ELF, we ended up with a much higher floor than was desirable. It did help to a certain extent, but not enough to really fully offset a large number of binaries with Swift.

What about offering it as a flag, then?

"Offset a large number of binaries" assumes the binaries can share the runtime DLLs in the first place. In Windows, this doesn't seem to be the case at all. Right now what I'm seeing is a need to deliver a copy of several DLLs with each binary, which is necessarily more space than just having a large (but trimmed) binary.

I acknowledge that in some cases, it might be possible to install the runtime DLLs into a central location that several binaries can share. But this is not universal. For portable applications, or applications installed without admin privileges (as WinGet can often do), this is certainly not the case.

To accommodate for such cases, and any other cases where having the DLLs be separately bundled is not suitable, there should be a flag to allow static linking the runtime.

It cannot be a flag. It would require a completely separate build, which would actually be an even longer and larger build (due to the static linking). We have already explored trying to do LTO with that and even a partial static linking seems to be somewhat intractable. So overall, these builds are larger, slower, and missing potential optimizations.

Additionally, even if you overcome that and build them, packaging it would increase the size of the installer which becomes an issue (the bandwidth is not free).

It actually is possible - the toolchain already does this. All the various binaries are using the same runtime. You do need to deliver a large set of content outside of your executable currently, though of that ~45M, ~25M is ICU data which is statically linked into Foundation.dll. The current distribution has been tweaked to minimise the number of redistributed modules while not increasing the size of the distribution.

Again, the toolchain is an example of this. Yes, there is no ABI stability on Windows currently, which means that you need to have the same exact version of the runtime that you built against. To aid with that, we do distribute a MSM with the toolchain to allow you to re-distribute the dependencies easily. The DLLs could be installed alongside your application, avoiding the administrative install. In fact, the latest releases of the toolchain have moved away from an administrative install to a per-user install. Ideally we would recover the ability to select the installation, but that is not currently possible due to how the MSI framework works.

It is not just a matter of a flag - that already exists. It is not possible to build the runtime statically currently due to limitations in the compiler. Addressing those is a preliminary requirement to even evaluating the option. Even if the static linking was permitted, note that it would be limiting because currently the DLL storage is conflated with the access control (i.e. public is equivalent to __declspec(dllexport) when a module is built dynamically).

Again, you seem to assume "installer" which tosses portable apps right out the window. Portable apps are a good thing. It shouldn't be a case of "If you want a portable app, use a different language".

It's a very special case of one app with multiple binaries. Applications that are not toolchains usually have a single binary. While the same application can share the runtime across multiple binaries, the same does not apply between different applications.

And doing so would have a space requirement equal to or greater than that of a statically linked binary. Download requirement too, since you can't skip downloading those DLLs. If every app bundles its own copy of the DLLs, the only place you get any savings is the rare case of one app with multiple binaries. Which, again, is the exception, not the rule.

Would be interested in testing that. Although the rest of your statements make me assume that doesn't do what I expect it to do.

I wasn't expecting it to happen tomorrow. I just want to know it's on the roadmap. There are limitations in the compiler. They should be addressed.

Which is another issue I mentioned, and probably needs addressing. I tried linking with a .def file, which I assumed would discard all exports save for those specified in the .def file. But that didn't happen, so I assume something internal to the compiled code is telling the linker to force the export regardless of what the .def file says. My hope is that correcting this is on the roadmap as well.
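For reference, the kind of module-definition file I mean - a minimal sketch where LoadPlugin is the only symbol intended to be visible:

```
LIBRARY myplugin
EXPORTS
    LoadPlugin
```

My understanding (worth verifying) is that the linker merges the .def EXPORTS with any /EXPORT directives embedded in the object files, rather than treating the .def as an exclusive list - so the dllexport that Swift emits for every public symbol survives regardless.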

Sorry, that was my mistake. I should have been clearer: by installer I mean the toolchain installer. That is my primary concern - application distributions are an easier task. At least for modern applications, you need to have more than a single file for distribution. You can extract the MSM and create the zip for a non-installer distribution. But from our experience at The Browser Company, the MSM does work well enough.

Different applications can also share the same binary if the binary is in a shared location. The restriction due to the lack of ABI stability is that the version must be exactly the same. This is why the Win32 SxS is interesting - it would enable the SxS installation of multiple versions and allow sharing across all of the system.

Correct, in this case you are paying roughly the same cost. This is why the system-wide installation is desirable :slight_smile:

-static-stdlib - and you would be correct - it doesn't do what you expect.

Sure, it is on the roadmap, but there is a ton of stuff on the roadmap and not enough contributors. However, note that you would need a fully hermetically sealed library to support static linking - and you cannot use Swift in any of the public entry points. Everything exposed from the statically linked module must be a C interface or you run the risk of ABI constraint violations.

Sure, this is something that I would like to see, but this is far far down the priority list for the Windows port IMO. Others in the community who are interested in the feature are welcome to propose something through the evolution process.

I disagree with that assertion. I use my fair share of applications that are just a single executable "Put me anywhere and run me". I already gave Godot Engine as an example. You can't accuse it of not being modern.

Any guide for that? It doesn't sound much simpler than just copying the contents of /Runtimes/<version>/usr/bin to the application folder.

Tried adding it to linkerSettings: [ .unsafeFlags. Got lld-link: error: could not open 'Foundation.lib': no such file or directory. This was indeed not what I expected. But it's interesting to think that if those files existed, it would have worked.

You need admin permissions in order to be able to write to any such "shared location". One issue I initially had when making a DLL was that it wasn't working when copied outside the project folder. It took me a while to figure out that the program loading it was unable to find the runtime DLLs from Swift's local installation (even though it's technically in the PATH).

This part I don't get. Why can't the standard library be linked the same way you'd link a static library written in Swift? That is Package( products: [ .library( type: .static,. As far as I know, that sort of setup doesn't produce a DLL.
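Spelled out, that setup is something like this minimal manifest (names are placeholders):

```swift
// swift-tools-version:5.9
import PackageDescription

// Minimal sketch; "MyLib" is a placeholder name.
let package = Package(
    name: "MyLib",
    products: [
        // type: .static yields a static archive rather than a DLL.
        .library(name: "MyLib", type: .static, targets: ["MyLib"]),
    ],
    targets: [
        .target(name: "MyLib")
    ]
)
```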

I have projects where portable versions (i.e. without any installer) for Windows are a must. Not having to add a bunch of DLLs in those cases would be great. There are also smaller tools that should be easily distributable as a single exe file. So:

  • Having the option of static linking is a good thing. You don't have to use it when it doesn't fit.
  • Reducing binary sizes with LTO, tree shaking or whatever is almost a must in those cases.
  • All licenses should allow that. E.g. Swift's license already has the Runtime Library Exception, so it should be good. Those things should be checked, but I do not see a problem here.

I should've been clearer: modern Windows applications. There are enough version specific quirks with WinUI that you need to distribute at least WinUI with your application.

The MSM is installed with the toolchain (unless you choose to not install it). You can extract it using dark -x rtl.[arch]msm.

Because the code generation for the standard library requires special handling due to the special known symbols being homed there. Just because the standard library is written primarily in Swift does not mean that it is a standard Swift module.

I'm running Godot on Windows. I wouldn't be in this channel if I was a Linux user.

I suppose maybe you mean "Applications that don't abuse Vulkan/OpenGL" or "Applications that don't use Qt". Or maybe just "Applications built with MSVC and not MinGW/LLVM". That's not "modern", though, that's just toolchain-specific. MinGW gets just as many updates, and many applications are built with it, including Windows-exclusive ones.

Or maybe you just mean an application built on top of WinRT instead of Win32? But there are clean and direct ways to do that too. It doesn't have to involve extra dependencies not provided by the base OS.

I can see the MSM packages. What I don't see is a dark.exe. And the more commonly available tool, 7-Zip, is not giving useful results.

Sure, but the experience at The Browser Company has been that there are plenty of version specific issues that you really do need to package and ship a copy of WinUI to get a reasonable release.

dark is part of the WiX toolset.

I'd love to chat about this when we're all post-festive—perhaps we can improve the situation (somehow.) I don't have any specific ideas in mind, but I'll ping you in DMs when the turkey and stuffing subside.
