I want to start a discussion about overall design of a complex number system in Swift.
There is a strict mathematical hierarchy for the following number types:
Natural < Integer < Rational < Real < Complex
where I'm using the < symbol to mean "is a subset of".
These number systems support the following operations that stay within their own system:
Natural: Addition, Multiplication
Integer: Addition, Multiplication, Subtraction
Rational: Addition, Multiplication, Subtraction, Division
Real: Addition, Multiplication, Subtraction, Division
Complex: Addition, Multiplication, Subtraction, Division, Exponentiation, Logarithm
It's worth noting the following as well,
Imaginary: Addition, Subtraction
Positive Real: Addition, Multiplication, Division, Exponentiation, Logarithm
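One way to picture this hierarchy is as a chain of protocol refinements, where each level adds the operations the larger system is closed under. This is just a sketch with hypothetical protocol names (none of these exist in the standard library):

```swift
// Hypothetical protocols mirroring the closure properties above.
// Each refinement adds the operations that the larger number
// system is closed under.
protocol NaturalArithmetic {
    static func + (lhs: Self, rhs: Self) -> Self
    static func * (lhs: Self, rhs: Self) -> Self
}

protocol IntegerArithmetic: NaturalArithmetic {
    static func - (lhs: Self, rhs: Self) -> Self
}

protocol RationalArithmetic: IntegerArithmetic {
    static func / (lhs: Self, rhs: Self) -> Self
}

// The Reals add no new closed operations over the Rationals...
protocol RealArithmetic: RationalArithmetic {}

// ...while the Complex numbers are additionally closed under
// exponentiation and logarithm.
protocol ComplexArithmetic: RealArithmetic {
    func exponentiated() -> Self
    func logarithm() -> Self
}
```

Note that a standard type like `Double` could retroactively conform up through `RealArithmetic`, since it already has all four arithmetic operators.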
In Swift, we have the following protocols, which roughly correspond to these types:
- `UnsignedInteger` -> Natural
- `SignedInteger` -> Integer
- `FloatingPoint` -> Real
However, these protocols have three (?) inconsistencies with their mathematical equivalents as currently implemented in Swift:
- Subtraction is defined for `UnsignedInteger`.
- Division is defined for `SignedInteger` (as Joe Groff pointed out, this could have been called the quotient operator and spelled `//` instead of `/`, like in Python, which would have avoided this inconsistency).
- Exponentiation is defined for `FloatingPoint` in the form of `squareRoot()` (i.e., an exponent of 1/2).
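All three inconsistencies can be observed directly with the standard concrete types; a quick illustration using `UInt`, `Int`, and `Double`:

```swift
// 1. Subtraction on an UnsignedInteger type keeps the unsigned
//    result type, so leaving the Naturals traps at runtime.
let small: UInt = 3
let large: UInt = 5
print(large - small)        // fine: 2
// print(small - large)     // would trap: Naturals aren't closed under -

// 2. Division on a SignedInteger type truncates toward zero:
//    the result stays an Int, it never becomes a Rational.
let q = 7 / 2
print(q)                    // 3, not 3.5

// 3. squareRoot() on a FloatingPoint type stays within the Reals
//    by returning NaN for negative inputs.
let r = (-1.0).squareRoot()
print(r.isNaN)              // true: NaN, not i
```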
Inconsistency #1 isn't a huge deal, because one has to explicitly request an unsigned integer, so you kind of know what you're getting into. According to the discussion here, #2 is broken forever, which, IMO, sucks because it's easy to end up with Integer types without knowing it. So the big question with regard to how Swift handles numbers is whether or not #3 is broken forever too.
In other words, what's the consequence of not guaranteeing that `squareRoot()` returns another Real? Is it okay to design a Complex number system in Swift where the square root of a Real type returns a Complex type? You can see that this exactly parallels #2, which is why I brought #2 up yesterday.
The consequence that I think would probably be the biggest is that Complex numbers are not comparable. Real numbers and Imaginary numbers are (within themselves), but Complex numbers, in general, are not. So if we change `sqrt()` to return a Complex type of some sort, it will no longer always be true that `sqrt(a) < sqrt(b)` is a valid operation.
If it is true that we can't change #3, then we have to design a Complex number system that lives on its own, separate from the current numbers. It's hard to see how this won't start to feel "tacked on", because not even `sqrt(-1)` (eh-hem, `sqrt(-1.)`) will map to something sensible.
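For concreteness, here is a minimal sketch of what a stand-alone version might look like. The type and function names are my own invention, not a proposal for the actual API, and the `Comparable` point from above shows up naturally: there is no sensible `<` to write for this struct.

```swift
// A minimal, hypothetical Complex type (not a full design).
// Note that it cannot reasonably conform to Comparable.
struct Complex {
    var real: Double
    var imaginary: Double

    static func + (lhs: Complex, rhs: Complex) -> Complex {
        Complex(real: lhs.real + rhs.real,
                imaginary: lhs.imaginary + rhs.imaginary)
    }
}

// A square root on Double that is honest about negative inputs:
// instead of returning NaN, it leaves the Real line and returns
// a Complex value.
func complexSquareRoot(of x: Double) -> Complex {
    if x >= 0 {
        return Complex(real: x.squareRoot(), imaginary: 0)
    } else {
        return Complex(real: 0, imaginary: (-x).squareRoot())
    }
}

print(complexSquareRoot(of: -1.0))   // i, i.e. real: 0, imaginary: 1
```

The downside, of course, is exactly the one described above: this `complexSquareRoot` lives beside the existing `squareRoot()` rather than replacing it, which is the "tacked on" feeling.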
Before digging deeper into specifics, I wanted to see if others have thought about this, and what other issues might arise.
Edit: Changed `BinaryInteger` to `UnsignedInteger`.