Rebuild times in new shell significantly slower

(cross-posted from Using Swift)

I'm seeing some behavior I don't understand in rebuilds / incremental debug builds when they're launched from different instances of the same shell (bash).

Platforms tested on: Ubuntu 20.04, Windows + WSL + Ubuntu 22.04
Swift 5.9

Here is an example on vanilla Ubuntu 20.04, with no file modifications in between builds:

dkolas@bernard:~/projects/project-name$ swift build --build-tests
Building for debugging...
Build complete! (71.57s)
dkolas@bernard:~/projects/project-name$ swift build --build-tests
Building for debugging...
Build complete! (1.69s)
dkolas@bernard:~/projects/project-name$ bash
dkolas@bernard:~/projects/project-name$ swift build --build-tests
Building for debugging...
Build complete! (72.52s)

Note that in the first and third builds, the line 'Building for debugging...' does not appear until about 70s into the process, if that means something to anyone.

I don't understand why a rebuild in a new shell, with the same user, should behave any differently than a rebuild in a shell in which I've already built.

Things that don't seem like they're the issue:

  • files / directories not in filesystem cache. Using vmtouch, it appears that the whole project directory is more or less still in memory in a new shell.
  • Dependency resolution - this generally creates terminal output, so I assume it's not happening in either case, and the same behavior can be seen when passing the --skip-update flag in the new shell

What I'd like to understand is:

  • What is the step that's being silently done (or what's being cached the first time) such that the first rebuild in a new shell takes so much longer than the second?
  • Is there a way to reuse the cache, or pass some flag to the build, such that this can be avoided?

This is important to me because VSCode on Windows + WSL appears to effectively launch a new shell every time the build is run, so I'm paying the much longer cost for 100% of incremental rebuilds. On my main workstation, modifying one test file and rebuilding takes 1s in the short case versus 30s in the long case, so it's a significant change in experience.

(see Rebuild times consistently take quite long in Build Task · Issue #625 · swift-server/vscode-swift · GitHub)


This is because SwiftPM's manifest cache is keyed on the shell environment (package manifests can and do change their results based on environment variables), so if the environment changes, manifests get re-evaluated, which is a costly process.
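One way to see this in action is to compare the environment of the current shell with that of a freshly started child bash; whatever differs is what invalidates the cache. A minimal sketch (the `/tmp` file names are just illustrative):

```shell
# Snapshot the environment of this shell and of a freshly started child bash.
env | sort > /tmp/env_parent
bash -c 'env | sort' > /tmp/env_child
# Show only what differs; these variables (SHLVL is a typical one) are what
# cause SwiftPM to re-evaluate manifests when you rebuild from the new shell.
diff /tmp/env_parent /tmp/env_child || true
```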

You should be able to observe this behavior in verbose mode; in non-verbose mode, there's only output for the build process itself and no other steps.

We are tentatively thinking about applying a filter to the shell environment that we expose to package manifests, in order to filter out variables which change often and should be irrelevant to manifests. Long term, we should offer a different mechanism for optional behaviors, so that we could potentially not forward the environment at all, which would also solve this problem.


Wow, thank you so much for your answer! I was starting to go a little nuts, deep in the bash man pages.

I'm sure this won't surprise you, but I can replicate / verify that by:

  • starting a new shell;
  • forcing the env vars to be EXACTLY what they were in the old shell;
  • rebuilding from there.

I think this may give me a viable workaround for my VSCode / WSL scenario, by figuring out what env vars are changing between build runs and forcing them to be the same before running swift build --build-tests.
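The principle behind that workaround can be sketched with `env -i`, which starts a command with only the variables explicitly listed, so nothing a new shell adds (SHLVL, per-window IDs, PATH tweaks) can leak through. The variables and paths below are illustrative; pass whatever your manifest actually needs:

```shell
# Run a command with ONLY the listed variables, regardless of what the
# current shell's environment looks like:
env -i HOME="$HOME" PATH="/usr/bin:/bin" sh -c 'env | sort'

# The real invocation would be something like (illustrative):
#   env -i HOME="$HOME" PATH="$PATH" TERM="$TERM" swift build --build-tests
```

Since the child process sees a byte-identical environment on every run, SwiftPM's manifest cache key stays stable across shells.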

Filtering the variables with an allow or disallow list seems like it might help a lot. In my test case from the command line, it was SHLVL and PATH that were changing, but those seem unlikely to be the ones affecting the situation in VSCode / WSL.

Will report back when I know which ones specifically are causing the issue in that context.

Let me know if there's anything else I can do to help!


The culprit in the VSCode / WSL context was the env var:


and sure enough, if I set it to a fixed value before build, rebuilds are fast.
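Assuming the variable in question is VSCODE_IPC_HOOK_CLI (the name a later reply in this thread gives), the workaround is to pin it to a constant before building so every build sees the same value. The socket path here is purely illustrative:

```shell
# Pin the per-window variable to a constant before building so every build
# sees the same environment. (Path is illustrative.)
export VSCODE_IPC_HOOK_CLI=/tmp/pinned-vscode-ipc.sock
echo "pinned: $VSCODE_IPC_HOOK_CLI"
# swift build --build-tests   # now reuses the warm manifest cache
```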

If you do decide to do a filter list for env vars, I would advocate for that being one of them :slight_smile:

Also, one other note... I tried doing verbose output from the package manager as I was trying to see what was different, and I never saw anything about invalidation of the manifest cache. I'm not sure if that's because I'm using a release version?

Thanks again for the help!


If that is the case, can you add VSCODE_IPC_HOOK_CLI to this list?
Is there a related SwiftPM issue I can add this to?

cc @Max_Desiatov

For posterity, this has been solved in recent development snapshots and the relevant VS Code issue on GitHub was closed: Rebuild times consistently take quite long in Build Task · Issue #625 · swift-server/vscode-swift · GitHub


Hey, I found this thread because, for what it's worth, this is causing trouble with build plugins as well.

Do a swift run in one shell window and it is as happy as a clam for as long as you want. Open a new window and it will rebuild, even though it's pointing at the same build folder. Go back to the original window and it will now rebuild, too. I have a screen capture of this happening in Terminal, but I can't upload it.

As mentioned, VSCode spins up processes at the drop of a hat, so this has been making working in VSCode with build plugins a little tricksy. Meanwhile it's totally smooth over in Xcode.

Would the current snapshots have fixed that as well?

If new windows and/or terminal sessions in VS Code set new environment variables, and those variables are in this list: swift-package-manager/Sources/Basics/EnvironmentVariables.swift at 0416a2805df975018bda111606fab1f736e8e2db · apple/swift-package-manager · GitHub, then new snapshots fix that.
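A quick way to check is to run the snippet below in two different terminal sessions and compare the output. It prints the variables this thread has seen change between sessions (SHLVL, PATH, TERM_SESSION_ID, VSCODE_IPC_HOOK_CLI); any that differ and are not on SwiftPM's block list would still cause a manifest cache miss:

```shell
# Dump the variables this thread has seen change between terminal sessions.
# Unset variables print as empty.
for v in SHLVL PATH TERM_SESSION_ID VSCODE_IPC_HOOK_CLI; do
  printf '%s=%s\n' "$v" "$(printenv "$v")"
done
```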


This is in Terminal (still 5.9).


I'm sorry, I don't know which variable might be triggering this. In these fresh windows it looks like the only thing that's different is TERM_SESSION_ID. Looks like the fix will cover this case as well.

Thank you!