I meant that, since so many processors already support it, Swift should be able to apply SIMD to vector types. So you could have
static func ++ (a: Vec4, b: Vec4) -> Vec4
which would be evaluated as a vectorized addition. More complicated stuff like a vector/matrix math library (cross product, matrix multiplication, etc.) would be built on top of those floating-point APIs.
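For concreteness, a rough sketch of what that could look like on top of the standard library's SIMD4 (Vec4 and ++ here are just the placeholder names from above, not a proposed API):

infix operator ++ : AdditionPrecedence

struct Vec4 {
    var storage: SIMD4<Double>

    // A single vectorized add on hardware that supports it.
    static func ++ (a: Vec4, b: Vec4) -> Vec4 {
        Vec4(storage: a.storage + b.storage)
    }
}

let sum = Vec4(storage: [1, 2, 3, 4]) ++ Vec4(storage: [5, 6, 7, 8])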
That's not quite what I'm suggesting. I'm advocating for (essentially) two additional types on top of the existing Double type: an Imaginary type and a Complex type. (I'm roughly thinking of Double and Real as being the same thing for the moment.) The idea here is that Real and Imaginary are both backed by a single floating point value, while the Complex type is a structure containing a Real and Imaginary value.
The advantage of having an Imaginary type separate from the Complex type is two-fold:
fewer floating-point operations, and
a smaller memory footprint.
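A minimal sketch of that layering, treating Real as a typealias for Double for now (all names illustrative):

typealias Real = Double

struct Imaginary {
    var value: Real                      // backed by a single floating-point value
    init(_ value: Real) { self.value = value }
}

struct Complex {
    var real: Real
    var imag: Imaginary                  // one Real plus one Imaginary
    init(_ real: Real, _ imaginary: Real) {
        self.real = real
        self.imag = Imaginary(imaginary)
    }
}

// Purely imaginary arithmetic is where the savings show up:
// (ai)(bi) = -ab, i.e. one multiply and one negation, and the result is a Real.
func * (lhs: Imaginary, rhs: Imaginary) -> Real {
    -(lhs.value * rhs.value)
}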
This may also have another advantage. There are two standard data structures for storing complex types, which mimic DSPComplex (as mentioned by Howard Lovatt) and DSPSplitComplex. In the first case, the real and imaginary parts are interleaved, with a stride of 1 separating them. In the second case, there is a pointer to the real array and a pointer to the imaginary array. So, if you want to mimic the equivalent of DSPComplex and DSPSplitComplex in Swift, you'd do
let interleavedArray: Array<Complex> = [Complex(1.0, 1.0), Complex(2.0, 1.0)]
let realArray: Array<Real> = [Real(1.0), Real(2.0)]
let imagArray: Array<Imaginary> = [Imaginary(1.0), Imaginary(1.0)]
So this has the advantage of reproducing both standard data structures without losing any information about the underlying type: the array of Imaginary numbers has the right layout for use in these algorithms, but also behaves correctly mathematically. I don't think you can achieve this if you only have a Complex type.
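On Apple platforms, the correspondence with Accelerate's types would look roughly like this (a sketch only; the point is that interleavedArray has the first layout and realArray/imagArray have the second):

import Accelerate

// Interleaved layout: real and imaginary parts alternate with stride 1,
// the same shape as Array<Complex> above.
let interleaved: [DSPDoubleComplex] = [
    DSPDoubleComplex(real: 1.0, imag: 1.0),
    DSPDoubleComplex(real: 2.0, imag: 1.0),
]

// Split layout: two parallel buffers, the same shape as realArray/imagArray above.
var realParts: [Double] = [1.0, 2.0]
var imagParts: [Double] = [1.0, 1.0]
realParts.withUnsafeMutableBufferPointer { re in
    imagParts.withUnsafeMutableBufferPointer { im in
        let split = DSPDoubleSplitComplex(realp: re.baseAddress!, imagp: im.baseAddress!)
        _ = split                        // would be handed to a vDSP routine here
    }
}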
My earlier idea was that the squareRoot function on Real numbers would then only return a Real or an Imaginary, never a Complex. Thus, any current check for .isNaN could simply be replaced by a check for .isImaginary.
I don't understand the implications of this, though -- would it make things worse?
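One hypothetical way to express that return shape, reusing the Real and Imaginary types sketched earlier (none of these names are real API):

enum RealOrImaginary {
    case real(Real)
    case imaginary(Imaginary)

    var isImaginary: Bool {
        if case .imaginary = self { return true }
        return false
    }
}

extension Real {
    // The square root of a negative Real yields an Imaginary, never a Complex.
    var signedSquareRoot: RealOrImaginary {
        self >= 0 ? .real(squareRoot())
                  : .imaginary(Imaginary((-self).squareRoot()))
    }
}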
Could you explain your reasoning for having a single generic Complex<T> type, rather than separate types like Complex32 and Complex64 similar to the standard library's integer types?
Since Complex is a pure Swift type, the distinction should ultimately make no practical difference to end users once the compiler acquires enough smarts. The standard library uses Python to generate redundant code; as part of a third-party library, Complex relies on the Swift compiler to perform that task when methods are specialized.
Having Complex be generic over the underlying real type seems "obviously correct" to me (except possibly for the wildly-abstracted protocol-tower alternative). Why would you want to repeat yourself?
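A standalone sketch of the difference being asked about (not the library's actual definition; Complex32/Complex64 are hypothetical names):

// One generic definition; the compiler specializes it for each scalar type.
struct Complex<RealType: FloatingPoint> {
    var real: RealType
    var imaginary: RealType
}

// Both widths fall out of the single generic definition:
typealias Complex32 = Complex<Float>
typealias Complex64 = Complex<Double>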
Imaginaries, unlike Reals, aren't closed under multiplication or division. Adding a separate type for them seems ill-advised. I think your model is fine as-is.
There are some minor gotchas if you don't have an imaginary type. In particular, one would like to have a named imaginary unit; let's call it Complex.I as a placeholder. If you don't have an imaginary type, .I is Complex(0, 1). Multiplication by .I should be a rotation that just swaps real and imag, negating one of them. But if we compute .I * Complex(infinity, 0), we get (nan, nan) instead of the expected (0, infinity).
This isn't the end of the world, but it's a mild annoyance, especially when using expressions like real + Complex.I * imag. There are some other ways to deal with it, but having an imaginary type is one of the nicer options.
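A sketch of the contrast, reusing the hypothetical Imaginary/Complex structs from earlier (not any shipping API):

// Naive full multiply: (a + bi)(c + di) = (ac - bd) + (ad + bc)i.
func * (lhs: Complex, rhs: Complex) -> Complex {
    Complex(lhs.real * rhs.real - lhs.imag.value * rhs.imag.value,
            lhs.real * rhs.imag.value + lhs.imag.value * rhs.real)
}

// Multiplication by a pure imaginary is just a rotation: i(c + di) = -d + ci.
func * (lhs: Imaginary, rhs: Complex) -> Complex {
    Complex(-(lhs.value * rhs.imag.value), lhs.value * rhs.real)
}

let z = Complex(.infinity, 0)
let viaComplex   = Complex(0, 1) * z   // 0 * infinity poisons the real part with nan
let viaImaginary = Imaginary(1) * z    // (-0.0, infinity), the expected rotation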
But isn't that a limitation imposed by NaN itself? If we're going to operate with Complex numbers in some context, we should be handling multiplication by infinity and things like sqrt(-1) so that they don't spawn NaNs. I don't quite get why having Imaginaries in place would eliminate these kinds of issues.
Of course not. But you can change the semantics of NaN when in the context of your Complex numbers. (infinity, x) * (y, z) doesn't need to be (NaN, NaN).
If you can do it well by hand, there's no reason why you couldn't do it just as well in a program.
If you want to have branches in your complex multiplication, yes. You will have a lot of upset users when they try to get Julia/Matlab/etc-level performance out of that, though.
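For reference, the kind of branchy fix-up alluded to here looks something like the following (a simplified sketch loosely modeled on the C99 Annex G recovery step, using the hypothetical Complex/Imaginary/Real names from earlier -- not the full algorithm):

// Fast path: the plain naive multiply. Slow path: only if both parts came out NaN,
// rescale infinities so a genuinely infinite operand still yields an infinite result.
// These extra checks are exactly the branches that cost performance in tight loops.
func recoveringMultiply(_ lhs: Complex, _ rhs: Complex) -> Complex {
    var a = lhs.real, b = lhs.imag.value
    var c = rhs.real, d = rhs.imag.value
    var re = a * c - b * d
    var im = a * d + b * c
    if re.isNaN && im.isNaN {
        if a.isInfinite || b.isInfinite {
            a = Real(signOf: a, magnitudeOf: a.isInfinite ? 1 : 0)
            b = Real(signOf: b, magnitudeOf: b.isInfinite ? 1 : 0)
            if c.isNaN { c = Real(signOf: c, magnitudeOf: 0) }
            if d.isNaN { d = Real(signOf: d, magnitudeOf: 0) }
        }
        if c.isInfinite || d.isInfinite {
            c = Real(signOf: c, magnitudeOf: c.isInfinite ? 1 : 0)
            d = Real(signOf: d, magnitudeOf: d.isInfinite ? 1 : 0)
            if a.isNaN { a = Real(signOf: a, magnitudeOf: 0) }
            if b.isNaN { b = Real(signOf: b, magnitudeOf: 0) }
        }
        re = .infinity * (a * c - b * d)
        im = .infinity * (a * d + b * c)
    }
    return Complex(re, im)
}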
Now, I notice the generic type parameter is constrained to Real, which refines FloatingPoint. However, if it were instead constrained to SignedNumeric, then one could model the Gaussian integers as Complex<Int>.
Was this use-case considered, and if so what was the reasoning behind the decision to require FloatingPoint?
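For context, the looser constraint in question would allow something like this (NumericComplex is a hypothetical name used to avoid clashing with the real type; the shipping type constrains its parameter to Real):

// Hypothetical: relax the constraint from Real/FloatingPoint to SignedNumeric.
struct NumericComplex<Scalar: SignedNumeric> {
    var real: Scalar
    var imaginary: Scalar

    static func * (lhs: NumericComplex, rhs: NumericComplex) -> NumericComplex {
        NumericComplex(real: lhs.real * rhs.real - lhs.imaginary * rhs.imaginary,
                       imaginary: lhs.real * rhs.imaginary + lhs.imaginary * rhs.real)
    }
}

// Gaussian integers: (1 + 2i)(3 - i) = 5 + 5i
let product = NumericComplex<Int>(real: 1, imaginary: 2) * NumericComplex(real: 3, imaginary: -1)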
I find it strange that we have so much active discussion and semi-fleshed prototypes floating around GitHub, without any actual development (AFAIK) on a swift-corelibs-math library.
For example, all of this discussion about how to implement a Complex type is great -- but where is it leading? I think there is pretty solid agreement that we're not going to add this to the standard library, important though it is.
I would suggest we start creating an official math library repository now, and create a sub-forum where pitches like these have a realistic chance of becoming part of the library. We can call it "beta" or "experimental" until it has enough features and a stable API, but we really should start trying something instead of talking.
Swift is a young language, but there is so much community interest that this seems like something we should still be able to tackle.
I couldn't agree more that the situation is somewhat ridiculous -- but that's how people often are.
... but community interest isn't personal interest:
As you say, there are many semi-fleshed prototypes, and it won't help if someone without much reputation creates another one.
For me, the current situation indicates that Swift has no true community yet; instead, it has many people who follow their own goals.
Fame and glory (maybe that's a little bit exaggerated for "many stars on GitHub") is probably one of those goals, and so people start their own project instead of contributing to the project of someone else ("why should I empower that other guy to earn the fame for my commits? On top of that, he chooses strange names for his variables!").
So, what could we do?
We could all become more humble -- but that would be a solution to problems much older than Swift evolution, so I won't count on that ;-)
I'm still convinced that it would be enough (or at least that it could be enough ;-) if someone from Core would "bless" a single repository to become swift-corelibs-whatever -- but that experiment isn't scheduled.
Maybe we could create a repo without a real owner? Or pass ownership around among contributors?