Very helpful post, thanks Steve.

I want to unpack this idea a bit more:

I could imagine a Complex number system that distinguishes between a Real type (with no imaginary component) and an Imaginary type (with no real component). Taking the square root of a Real number could result in either a purely Real type or a purely Imaginary type. You’d never end up with a fully Complex number that includes both a Real and an Imaginary part.
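To make that concrete, here’s a minimal sketch of what such a system could look like. The `Real`, `Imaginary`, and `SquareRoot` types are hypothetical names I’m inventing for illustration; nothing like this exists in the standard library:

```swift
// Hypothetical types for illustration only.
struct Real { var value: Double }
struct Imaginary { var value: Double }  // represents value * i

// The square root of a Real is either purely Real or purely Imaginary,
// never a fully Complex number with both parts.
enum SquareRoot {
    case real(Real)
    case imaginary(Imaginary)
}

func squareRoot(of x: Real) -> SquareRoot {
    if x.value >= 0 {
        return .real(Real(value: x.value.squareRoot()))
    } else {
        // sqrt(-x) * i for negative input, instead of NaN.
        return .imaginary(Imaginary(value: (-x.value).squareRoot()))
    }
}
```

The point is that the caller is forced to `switch` over the result, so a negative input can’t silently flow onward as NaN.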

So, for 99% of existing code, the return value of `.squareRoot()` would work exactly as expected—it would just be returning a Real type.

From the programmer’s perspective, they are either 1) already certain they’re feeding the square root function a positive number, or 2) checking that they didn’t get NaN out the other side. But if the programmer can reason about this, why can’t the compiler?

To be more specific, if the programmer is already checking `a > 0` or taking `abs(a)` before taking the square root, then we can create a system that promises you’ll get a Real type back.
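One way to encode that promise, sketched here with a hypothetical `NonnegativeReal` wrapper I’m making up for illustration: the `a > 0` check happens once, at the type boundary, and from then on the compiler knows the result is Real.

```swift
// Hypothetical wrapper type: constructing one proves the value is >= 0.
struct NonnegativeReal {
    let value: Double
    init?(_ x: Double) {
        guard x >= 0 else { return nil }  // the runtime check lives here, once
        self.value = x
    }
}

// Because the argument is nonnegative by construction, the result is
// statically known to be a plain real number; no NaN can escape.
func squareRoot(of x: NonnegativeReal) -> Double {
    return x.value.squareRoot()
}

// Usage: the check is explicit, and skipping it is a compile error,
// not a silent NaN.
if let a = NonnegativeReal(2.0) {
    let r = squareRoot(of: a)  // no NaN check needed downstream
    print(r)
}
```

This is essentially the same idea as the `a > 0` / `abs(a)` patterns above, but moved into the type system so the compiler can verify it.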

The advantage to this is that if the programmer is *not* checking `a > 0` or taking `abs(a)`, then it very well may be a bug—and an ambiguous return type that the programmer is forced to handle would be a really helpful signal.