Hi List
I'm writing a simple voxel algorithm in Swift that runs on the CPU. The performance with debug builds is pretty poor, but it absolutely flies in release builds, which makes perfect sense.
What I don't get, however, is the difference in rendering. If all the math is floating point, there is no difference, but if the math is done with integers (which gives a 40% performance improvement), the debug and release renders look quite different.
The integer renderer basically works by scaling all the trig etc. up by 2^16, then shifting back down to get texture coordinates. In debug, everything looks absolutely flawless, but in release I get all sorts of artefacts, some of which lead to crashes because values that should be clamped are suddenly out of bounds.
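For context, the fixed-point path looks roughly like this (a simplified sketch, not my actual code; fixedShift, sinTable, textureSize and texelX are just placeholder names):

```swift
import Foundation

// 16.16 fixed point: everything is scaled by 2^16.
let fixedShift = 16
let fixedOne = 1 << fixedShift   // 65536

// Precomputed sine table, one entry per degree, scaled up to fixed point.
let sinTable: [Int] = (0..<360).map { Int(sin(Double($0) * .pi / 180) * Double(fixedOne)) }

// Hypothetical texture width; a power of two so I can wrap with a mask.
let textureSize = 256

// Advance a 16.16 fixed-point x position along a direction given by an angle,
// then convert back to an integer texel index.
func texelX(positionX: Int, angle: Int, distance: Int) -> Int {
    // sinTable[angle] is 16.16 and distance is a plain integer step count,
    // so the product stays in 16.16.
    let newX = positionX + distance * sinTable[angle]
    // Shift down to whole texels and wrap into the texture width.
    return (newX >> fixedShift) & (textureSize - 1)
}
```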
Are there any likely candidates for why the behaviour is so different, or any tools or guides I can use to find out where my math is going wrong?
Thank you!