I am using MetalKit and Cocoa to render my game, but because terrain is procedurally generated and mutable I want to use concurrency to recalculate the meshes in the background so the game doesn't freeze every time something changes or more of the world is generated.
I have strict concurrency enabled which gives me warnings for everything like this:
let mouse = parent.window.mouseLocationOutsideOfEventStream
Which warns me that I can't access the MainActor-isolated Cocoa API from the MTKViewDelegate (and it's definitely correct: ignoring the warnings and using concurrency anywhere in my program results in the window not even appearing).
But I do need access to things like mouse position at the exact time of rendering for example for the software UI renderer to draw the cursor.
How would I bridge Metal code with a concurrent Swift program? Are there any relevant WWDC sessions I should watch (I didn't find any)?
Not particular to your question, but typically you wouldn't want to "access mouse position at the exact time of rendering"; you would read out all relevant events / keystrokes / etc. at the beginning of your frame (say, into a Sendable struct) and then consider them immutable for the duration of the frame.
This obviously not only helps with concurrency, but is also a general architectural pattern to have predictable buffers of per-frame data to ensure that all rendering is consistently done using the same information.
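To illustrate that pattern, here is a minimal sketch (the type and field names are hypothetical, not from this thread): all input is gathered once, on the main actor, into a value type that is then safe to hand to any thread.

```swift
import Foundation

// Hypothetical per-frame input snapshot: gathered once on the main
// actor at the start of a frame, then treated as immutable for the
// rest of the frame. Being a value type of Sendable members, it can
// cross actor/thread boundaries freely.
struct FrameInput: Sendable {
    let mousePosition: SIMD2<Float>
    let pressedKeys: Set<UInt16>
}

// On the main actor, at frame start (sketch):
// let input = FrameInput(
//     mousePosition: currentMouse(),  // e.g. from mouseLocationOutsideOfEventStream
//     pressedKeys: currentKeys()
// )
// render(input)  // everything in this frame sees one consistent view of input
```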
That is actually something I'm doing; by "time of rendering" I meant func draw(in view: MTKView) of the MTKViewDelegate.
The game is already structured to be platform-independent, which implies getting input in a platform-independent struct, so it eventually makes its way to the software UI like so:
Apple haven't updated MetalKit / MTKViewDelegate for concurrency yet.
I can't explain the not even the window appearing part, but if it helps, the workaround I use in my renderer class which is marked @MainActor and implements MTKViewDelegate is:
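Going by the later posts in this thread, the workaround being referred to is MainActor.assumeIsolated. A sketch of the shape it takes (the Renderer class and its window property are illustrative, not from the original post):

```swift
import MetalKit

@MainActor
final class Renderer: NSObject, MTKViewDelegate {
    weak var window: NSWindow?

    // MTKViewDelegate isn't concurrency-annotated yet, so its
    // requirements must be implemented as nonisolated on a
    // @MainActor class.
    nonisolated func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

    nonisolated func draw(in view: MTKView) {
        // In practice MTKView invokes its delegate on the main thread.
        // assumeIsolated asserts that at runtime (trapping if it is
        // ever false) and gives synchronous access to main-actor state.
        MainActor.assumeIsolated {
            let mouse = window?.mouseLocationOutsideOfEventStream
            _ = mouse
            // ... encode and commit the frame ...
        }
    }
}
```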
Building for production...
error: compile command failed due to signal 6 (use -v to see invocation)
Assertion failed: (SGF.ExpectedExecutor || SGF.unsafelyInheritsExecutor()), function emit, file SILGenConcurrency.cpp, line 650.
Never mind, that was not what crashed it
Ok I think the latest toolchain is just compiling my code into nonsense (again)
It should be closer to 200MB
It would be really useful if Xcode could launch the memory debugger without rebooting with SIP disabled
It's leaking memory so badly macOS started stuttering
Never mind, I forgot I'm not yet removing invisible faces, so I was accidentally trying to allocate 1 610 612 736 vertices. It's still using more memory with classes/actors (without the vertices) than when I was manually managing it in Zig (with vertices), but it's not bad
After solving that issue I tried MainActor.assumeIsolated, and sadly it's not working for me: I see a blank window without the clear color or my UI on top
It's probably not the best approach - something like pre-fetching the necessary info at the start of each frame, as @nkbelov suggests, is likely better - but for edification you can do:
let mouse = DispatchQueue.main.sync {
    parent.window.mouseLocationOutsideOfEventStream
}
The really big caveat with that is that it blocks the current thread until the main thread responds. If the main thread is already doing something, that could be a while. So you risk serialising your code.
It's certainly possible to make well-behaved programs that use this sort of thing, but it requires discipline and care to basically keep the main thread idle all the time, so that it can serve these syncs very quickly.
I think I understand what you're trying to do. But assuming that draw(in view: MTKView) is indeed called from somewhere that isn't the main thread (i.e. if MainActor.assumeIsolated happens to crash there; I can't check this myself at the moment) - or you really want to respect concurrency semantics - then I'd suggest you think about the event ordering the following way (FWIW this also applies even if it's happening on the main thread/actor, which, being a method on a view, it probably should be):
At the time when the loop calls your draw(in:), it's already kind of irrelevant to gather user input (which includes the cursor position). Remember that rendering is typically triple-buffered, meaning that the actual cursor position, which changes during this call (maybe your user just so happens to move the mouse while you're assembling your GPU calls), will lag behind by at least this one frame. Screen refreshes are fast enough that you don't have to be this instantaneous with handling the input; the proper place for the new pointer coordinates is the frame that comes after, and there you have ample space to read out this property from the main thread.
Or, correct me if I'm wrong and somehow misunderstand the particular case of mouse pointers.
Just to add, if the Metal call actually happens to run on the main thread, and the compiler's complaint is simply because it wasn't annotated @MainActor, then DispatchQueue.main.sync will deadlock, so @Lua may want to keep this in mind too.
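One defensive pattern for that deadlock (a sketch, not something from the thread: the helper name is hypothetical, and note that strict concurrency checking may still want Sendable annotations on the closure) is to check whether you're already on the main thread before syncing:

```swift
import Foundation

// Hypothetical helper: runs `work` on the main thread synchronously,
// without deadlocking if the caller is already on the main thread.
func onMainSync<T>(_ work: () -> T) -> T {
    if Thread.isMainThread {
        // Already on main: calling DispatchQueue.main.sync here
        // would deadlock, so just run the closure directly.
        return work()
    } else {
        return DispatchQueue.main.sync(execute: work)
    }
}
```

The caveats from above still apply: if the main thread is busy, a caller on another thread blocks until it becomes free.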
I would like to do that, but Cocoa makes control flow incomprehensible: I have no idea what it's going to call and when, so I don't know how to structure my code correctly.
And it seems to change every few macOS updates in ways that subtly break code for me.
My code is otherwise very structured and deterministic, and most of my functions are pure.
I barely found any documentation on creating a windowed app without Xcode, this is what I have now:
static func main() {
    let instance = Self()
    let delegate = AppDelegate(game: instance)

    let app = NSApplication.shared
    app.delegate = delegate
    app.setActivationPolicy(.regular)
    app.run()
}
Once I call run() I have no idea where anything is running. This is not what I want, could I run Apple's classes on the side without giving Cocoa control over my program?
This is why I put everything in draw(in view: MTKView), it was the most obvious way to have my code run reliably every frame.
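For what it's worth, one way to keep control of the loop is to not call run() at all and pump Cocoa's event queue yourself once per frame. A sketch under the assumption that AppKit tolerates this (the function name is hypothetical; this replaces app.run(), which normally never returns):

```swift
import AppKit

// Hypothetical owned game loop: your code stays in control, and Cocoa
// events are drained explicitly once per frame instead of Cocoa
// driving everything from inside app.run().
func runFrameLoop(app: NSApplication) {
    app.finishLaunching()
    while true {
        // Drain all pending events without blocking
        // (Date.distantPast means "don't wait for new ones").
        while let event = app.nextEvent(matching: .any,
                                        until: Date.distantPast,
                                        inMode: .default,
                                        dequeue: true) {
            app.sendEvent(event)
        }
        // ... update game state and render one frame here ...
    }
}
```

This is a known pattern in game code, but some AppKit machinery assumes the standard run loop, so it trades predictability in your code for possible breakage elsewhere.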
If you're writing a macOS app, then despite Xcode being… how to put it… a controversial IDE overall, I'd suggest using it in such cases; it brings more benefits to the table.
If the app didn't crash, then we can safely assume the delegate is called on the main actor, so the issue is not in the delegate call but in the implementation of the drawing itself.
If there was a crash, then the delegate wasn't called on the main actor, and you can safely use a synchronous dispatch to the main queue here. But I would be a bit surprised if a view's delegate isn't called on the main thread.
I am, what I meant was my code is just a Swift package and I'm creating the .app myself.
But I might switch to something else; I was just experiencing a 1 second input delay (not exaggerating) while editing a 100 line file on an M2 Max.
It didn't crash, but when I used any actors in my program other than MainActor (even not doing anything) draw(in view: MTKView) was actually never called in the first place. Xcode said I have +infinity frames per second and never hit breakpoints in the delegate functions.
I struggle to gauge your experience with Metal, but it seems like you are new to it, so I would go with the default templates in the first place.
(I totally understand your pain with the editing nightmare in Xcode, but it's either that, or googling/remembering tons of APIs, because I've found it impossible to make autosuggestions for Apple's SDKs work for longer than 10 minutes outside of Xcode.)
There are too many unclear details to understand what's happening. You might not be setting up the flow properly, for example. Or, as I've said, the drawing implementation might be incorrect. If you can provide more details, the discussion might be more helpful.
I am new to Metal, but it is infinitely easier than OpenGL. 90% of my issues come from the forced object oriented structure of Cocoa and all the delegates, I don't like it and how spaghetti it feels to initialize anything.
The last thing I want is to know even less about what my code is doing, use storyboards, and have less type-safe resource bundling
An idea: install a 1/60 or 1/120 sec timer (or even better: a display link callback) that grabs the current mouse location and remembers it in some common variable, and when you want to use the current mouse location from a secondary thread use that variable instead of calling window.mouseLocationOutsideOfEventStream. A more optimal variation of this method – subscribe to a "mouseMoved" event. In all cases the variable must be read/write protected, e.g. with a mutex, or you could store x/y coordinates into an atomic UInt64 variable (32 bits per coordinate should be more than enough to represent exact mouse position on the screen). Not sure which is preferable in this case, a mutex or an atomic.
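A sketch of the mutex variant of that shared variable (the type name is hypothetical; NSLock is used here, though the UInt64-packing trick mentioned above would let you replace the lock with a single 64-bit atomic, e.g. via the swift-atomics package):

```swift
import Foundation

// Hypothetical shared cell for the latest mouse position: written from
// a mouseMoved monitor / display link on the main thread, read from
// the render thread. The lock keeps the (x, y) pair consistent, so a
// reader never sees a new x paired with an old y.
final class MousePositionCell: @unchecked Sendable {
    private let lock = NSLock()
    private var x: Float = 0
    private var y: Float = 0

    func store(x: Float, y: Float) {
        lock.lock(); defer { lock.unlock() }
        (self.x, self.y) = (x, y)
    }

    func load() -> (x: Float, y: Float) {
        lock.lock(); defer { lock.unlock() }
        return (x, y)
    }
}
```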
There's also the CGEvent(source: nil)!.location route, which AFAIK is safe from a secondary thread (I could be mistaken), but its result is in global coordinates.
I'd reuse the Xcode template for a Metal app in that case. I suspect (I've never set up macOS windows from the ground up) your view is simply not being rendered… because MainActor.assumeIsolated does in fact work just fine on this method.
I went around Cocoa completely and made my own window and CGContext using CoreGraphicsServices.
That API is not public, so it will definitely break one day, but it's a fun experiment
It uses so much less memory than Cocoa and starts up basically instantly, Cocoa is definitely adding a lot of overhead.
At least now I can be 100% sure what's running on the MainActor