Rendering in Swift

I released a new version of my renderer written in Swift and C++.

TL;DR: Moana renders 20 times faster!

This is v0.1.0 of the Gonzales renderer.

  • Thanks to Embree (and other optimizations), rendering is now 20 times faster: instead of taking 26 hours as in 2021, Gonzales finishes after 78 minutes.
  • All scenes from Bitterli’s resources can be rendered now.
  • A power-based light sampler (see the sketch after this list). This is especially important for the spaceship scene, as there are lots of tiny lights on the screens in the cockpit.
  • A Debian package for ptex.
  • Reworked dependencies on Embree, OpenImageIO and Ptex to rely on existing Debian packages.
  • Lots of new materials, like CoatedDiffuse, Conductor, Dielectric and Hair, needed to render PBRTv4 scenes.
  • Volume integration (e.g. the volumetric caustic scene from Bitterli).
  • OpenImageIO for most texturing, caching and image writing.
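
For anyone curious what the power-based light sampler amounts to: the idea is to pick a light with probability proportional to its emitted power rather than uniformly, so a few bright lights aren't drowned out by thousands of dim ones. Here is a minimal Swift sketch of that idea; the Light protocol, power() method and PowerLightSampler type are illustrative assumptions, not Gonzales' actual API:

```swift
// Hypothetical light interface; the renderer's real types will differ.
protocol Light {
    // Total emitted power of the light.
    func power() -> Float
}

// Picks lights with probability proportional to their power.
struct PowerLightSampler {
    private let lights: [Light]
    private let cdf: [Float]       // cumulative distribution over light power
    private let totalPower: Float

    init(lights: [Light]) {
        self.lights = lights
        var running: Float = 0
        var cdf: [Float] = []
        for light in lights {
            running += max(light.power(), 0)
            cdf.append(running)
        }
        self.cdf = cdf
        self.totalPower = running
    }

    // Returns a light and the probability of having chosen it,
    // given a uniform random number u in [0, 1).
    func sample(u: Float) -> (light: Light, pdf: Float)? {
        guard totalPower > 0 else { return nil }
        let target = u * totalPower
        // Binary search for the first CDF entry >= target.
        var lo = 0, hi = cdf.count - 1
        while lo < hi {
            let mid = (lo + hi) / 2
            if cdf[mid] < target { lo = mid + 1 } else { hi = mid }
        }
        let previous = lo == 0 ? 0 : cdf[lo - 1]
        let pdf = (cdf[lo] - previous) / totalPower
        return (lights[lo], pdf)
    }
}
```

In an integrator, the light-selection step would then call something like sampler.sample(u:) with a fresh random number and divide the light's contribution by the returned pdf.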

There are still lots of things to do:

  • Ray differentials.
  • Parallel parsing (shouldn’t be too difficult using Swift’s structured concurrency for PBRT’s Import statement).
  • Getting all PBRTv4 scenes to work (most of them should work, the rest should be relatively easy).
  • Bump and displacement mapping.
  • Memory usage goes up when using Embree; I have yet to investigate this. Mapping materials etc. for Embree is also done very quick and dirty.
  • A denoiser.
  • Better sampling (low-discrepancy, blue noise, progressive multi-jittered).
  • GPU rendering; this will probably be next on my plate.

You can read about it on my blog:

17 Likes

Awesome!

I just took a very brief look at the code, and it seems that when rendering, you create a DispatchGroup and submit the tile-rendering jobs to the global (concurrent) dispatch queue.
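For readers who haven't opened the code, the GCD pattern being described looks roughly like the sketch below; Tile, TileResult and renderTile are placeholders, not the project's real types:

```swift
import Dispatch

// Placeholder types standing in for the renderer's real ones.
struct Tile { let x: Int; let y: Int }
struct TileResult { let tile: Tile /* plus pixel data */ }

func renderTile(_ tile: Tile) -> TileResult {
    // ... trace rays for this tile ...
    return TileResult(tile: tile)
}

func render(tiles: [Tile]) -> [TileResult] {
    let group = DispatchGroup()
    let queue = DispatchQueue.global()                 // concurrent global queue
    let resultQueue = DispatchQueue(label: "results")  // serializes appends
    var results: [TileResult] = []

    for tile in tiles {
        queue.async(group: group) {
            let result = renderTile(tile)
            resultQueue.sync { results.append(result) }
        }
    }
    group.wait()   // block until all tile jobs have finished
    return results
}
```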

The design is quite nice and straightforward, and at first glance it seems like it might not be too disruptive to port to Swift concurrency. You could add the tile-rendering jobs to a TaskGroup and merge the results as they become available, and - perhaps more interestingly IMO - you could batch the rendering jobs using async let. I've heard that one of the benefits of Swift concurrency is that when a function uses async let, it has a known "amount" of structured concurrency (e.g. 8 structured child tasks), so the compiler can optimise how it allocates the stack each child task requires. But I think you need some really heavy workloads (like this one) to see the benefit.
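To make that concrete, a TaskGroup version of the same loop might look something like this - again with made-up placeholder types, and not claiming this is how Gonzales should structure it:

```swift
// Placeholder types, as in the GCD sketch above.
struct Tile { let x: Int; let y: Int }
struct TileResult { let tile: Tile }
func renderTile(_ tile: Tile) -> TileResult { TileResult(tile: tile) }

// Structured-concurrency sketch: one child task per tile,
// results merged in completion order.
func render(tiles: [Tile]) async -> [TileResult] {
    await withTaskGroup(of: TileResult.self) { group in
        for tile in tiles {
            group.addTask { renderTile(tile) }
        }
        var results: [TileResult] = []
        for await result in group {
            results.append(result)
        }
        return results
    }
}
```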

Have you started experimenting with anything like that? If/when you do, please do post some results and don't hesitate to ask questions. I think lots of us here would be very interested to see how Swift concurrency scales in a project like this.

7 Likes

The tile rendering predates Swift's structured concurrency. I'd love to convert it to async let, and I think it should be only a modest change. But I also have lots of other things on my plate, so let's see when I get to it. It might also be a good beginner task.
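In case someone does pick this up as a starter task, here is a rough sketch of what the async let batching could look like; the batch size of four and all names are invented for illustration:

```swift
// Placeholder types, matching the sketches above.
struct Tile { let x: Int; let y: Int }
struct TileResult { let tile: Tile }
func renderTile(_ tile: Tile) -> TileResult { TileResult(tile: tile) }

// Renders a fixed-size batch of tiles with a known amount of structured
// concurrency: exactly four child tasks, visible to the compiler.
func renderBatch(_ a: Tile, _ b: Tile, _ c: Tile, _ d: Tile) async -> [TileResult] {
    async let ra = renderTile(a)
    async let rb = renderTile(b)
    async let rc = renderTile(c)
    async let rd = renderTile(d)
    return await [ra, rb, rc, rd]
}
```

A driver would then walk the tile list in chunks of four and call renderBatch for each chunk.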

There is also the Import statement, which currently just calls include (in func import) to read scene files sequentially. Import can be used to parallelize reading of huge scene files; see Matt Pharr's blog. Currently Moana takes around 17 minutes to load; I believe this can be cut down to under 2 minutes.
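A hedged sketch of what parallel Import handling could look like with a throwing task group; parseSceneFile, SceneFragment and parseImports are invented names, and the real parser is organized differently:

```swift
// Invented placeholder: whatever one parsed file contributes to the scene.
struct SceneFragment { /* shapes, materials, ... */ }

// Invented placeholder for the existing sequential per-file parser.
func parseSceneFile(at path: String) throws -> SceneFragment {
    // ... tokenize and parse one file ...
    return SceneFragment()
}

// Import (unlike Include) is specified so that the imported file cannot
// change the graphics state seen by the importing file, which is what
// makes it safe to parse the imported files concurrently.
func parseImports(paths: [String]) async throws -> [SceneFragment] {
    try await withThrowingTaskGroup(of: SceneFragment.self) { group in
        for path in paths {
            group.addTask { try parseSceneFile(at: path) }
        }
        var fragments: [SceneFragment] = []
        for try await fragment in group {
            fragments.append(fragment)
        }
        return fragments
    }
}
```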

3 Likes