Unexpected differences in behaviour between debug and release

Hi List,
I'm writing a simple voxel algorithm in Swift that runs on the CPU. Performance in debug builds is pretty poor, but it absolutely flies in release builds, which makes perfect sense.

What I don't get, however, are the differences in rendering. If all the math is done in floating point, there is no difference, but if it is done in integers (which gives a 40% performance improvement), the debug and release renderings look quite different.

The integer renderer basically works by scaling all the trig etc. up by 2^16, then shifting back down to get texture coordinates. In debug, everything looks absolutely flawless, but in release I get all sorts of artefacts, some of which lead to crashes because values that should be clamped end up out of bounds.
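For concreteness, here's a minimal sketch of the kind of 16.16 fixed-point scheme I mean (the names like `fixedMul` and the per-degree sine table are illustrative, not my actual code):

```swift
import Foundation

// Rough sketch of the 16.16 fixed-point idea (names are illustrative).
let fixedOne: Int32 = 1 << 16   // the 2^16 scale factor

// Sine table scaled up by 2^16, one entry per degree (an assumption;
// the real table resolution doesn't matter for the sketch).
let sinTable: [Int32] = (0..<360).map { deg in
    Int32((sin(Double(deg) * .pi / 180.0) * Double(fixedOne)).rounded())
}

// Multiplying two 16.16 values needs a 64-bit intermediate,
// otherwise the Int32 product can overflow.
func fixedMul(_ a: Int32, _ b: Int32) -> Int32 {
    Int32(truncatingIfNeeded: (Int64(a) * Int64(b)) >> 16)
}

// Shift back down to an integer texture coordinate.
func toTexel(_ v: Int32) -> Int {
    Int(v >> 16)
}
```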

Are there any likely candidates for why the behaviour is so different, or any tools or guides I can use to find out where my math is going wrong?

Thank you!


I would not expect integer or floating-point math to behave differently in debug vs release builds. Can you share your code? Does it reproduce in a stripped-down example?


Yeah, my impression was that the math should work out the same regardless. I'll try to pare it down to a smaller example that shows the problem.
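Something like this is the shape I have in mind, a minimal sketch (placeholder names and a per-degree loop, not my actual renderer code) that round-trips the scaled-up trig and flags any divergence from the Double reference:

```swift
import Foundation

// Placeholder shape of the stripped-down test, not the actual renderer:
// run the fixed-point round trip and flag any divergence from Double math.
let scale = 1 << 16

for deg in 0..<360 {
    let radians = Double(deg) * .pi / 180.0
    let fixed = Int((sin(radians) * Double(scale)).rounded()) // scale up by 2^16
    let back = Double(fixed) / Double(scale)                  // scale back down
    if abs(back - sin(radians)) > 1.0 / 256.0 {
        print("divergence at \(deg) degrees: \(back) vs \(sin(radians))")
    }
}
```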