It seems a bit silly to copy-paste-proliferate such a fundamental numeric type, when it's already in the standard library, it just needs to be released. I suspect it's already the goal to do that, though, so I hope the question is merely: ETA?
To be clear: Foundation is not considering adding their own. They're recommending using the existing DoubleWidth test support from Numerics for testing purposes (at my urging). We will pitch Int128 for the stdlib soon, but even once it's available, Foundation and Numerics should still use the double width type for testing purposes due to availability limitations (at least in the medium term). So that's the correct path forward no matter what.
For the stdlib, there's a big falloff in use cases after 128b. Larger fixed-width integer types are interesting, but probably belong in a separate module or package (I'll expand on this in the alternatives considered section of the pitch for 128b integers).
Any chance we can steal whatever machinery C's _BitInt uses to get arbitrary fixed-width integer types? Obviously this would be most useful with generic value parameters, but even a standard-library-only magic type would be useful for people who need bigger integers.
You, uh, might want to look at the codegen that _BitInt currently produces with most mainstream compilers before getting too enthusiastic about doing that =)
I’d say the big falloff is above 256 bits, not 128. Most elliptic-curve cryptography relies on 256-bit finite fields: secp256k1, used by the vast majority of cryptocurrencies, and the heavily used Curve25519 are both examples. It would be very unfortunate to set the cutoff at 128 rather than 256.
Crypto tends to have fairly specialized requirements on its arithmetic, however, that a general-purpose type does not satisfy. One would not implement crypto primitives using a type that doesn't explicitly guarantee certain kinds of side-channel resistance, at a bare minimum. You would also like to ensure that copies of sensitive data are not left sitting around on the stack or in registers, which is a near-impossible guarantee to make under Swift's normal semantics.
Once you leave crypto applications to specialized libraries, there are very few clients for fixed-size 256b integers.
Ideally, crypto primitives should not be written in Swift (or normal C or C++ or Rust, for that matter), but this line of discussion goes off-topic pretty quickly.
I don’t understand how that squares with Swift’s desire to be a general-purpose programming language. Crypto algorithms have to be written in some language, and they’re currently written in C, C++, Rust, Java, C#, and many other languages. Surely we’re not saying that Swift is completely off the table for writing crypto?
So then I return to my question. Given that Swift is no worse than the other languages crypto algorithms are written in, should crypto algorithms written in Swift avoid types like Int?
Edit: Steve and I were talking past each other a bit. To avoid leaving dust on the stack, guard against side-channel attacks, etc., you really need to control the exact instructions in the core of your crypto algorithms, which necessitates writing them in assembly. I don’t dispute that, but you eventually have to surface it at a higher level, and I’m curious at what point it becomes “safe” to traffic in the normal datatypes.
One can use Builtin.Int${N} with N up to 2048 and wrap it in a struct, just like the stdlib does. This is LLVM's iN type (docs); LLVM will do all the heavy lifting.
Example:
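(The original example code didn't survive in the thread. The Builtin route requires compiling with -parse-stdlib, so here is a portable sketch of the same wrapping idea using two UInt64 halves instead; the type and its names are purely illustrative.)

```swift
// Hypothetical sketch of a fixed-width 128-bit type. The stdlib wraps
// Builtin.Int128 in a struct the same way; this variant stores two
// UInt64 halves so it compiles without -parse-stdlib.
struct UInt128Sketch {
    var high: UInt64
    var low: UInt64

    static func + (lhs: UInt128Sketch, rhs: UInt128Sketch) -> UInt128Sketch {
        // Add the low words, then propagate the carry into the high words.
        let (low, carry) = lhs.low.addingReportingOverflow(rhs.low)
        let high = lhs.high &+ rhs.high &+ (carry ? 1 : 0)
        return UInt128Sketch(high: high, low: low)
    }
}

let a = UInt128Sketch(high: 0, low: .max)  // 2^64 - 1
let b = UInt128Sketch(high: 0, low: 1)
let sum = a + b                            // 2^64
print(sum.high, sum.low)                   // prints "1 0"
```

A real implementation would of course conform to FixedWidthInteger and fill in the rest of the arithmetic, which is exactly the boilerplate the Builtin version lets LLVM handle.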
Unfortunately, there are some inconveniences ahead:
Swift doesn't support compile-time constants as generic parameters, so we can't generalize the implementation.
BinaryInteger.distance(to:) and company return Int, so they're problematic.
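To make that second limitation concrete: distance(to:) is required to return Int, so any distance that doesn't fit in an Int traps at runtime rather than producing a wide result. A small illustration, with UInt64 standing in for a wider type:

```swift
// distance(to:) on a BinaryInteger always returns Int, whatever the
// width of the integer type itself.
let small = (0 as UInt64).distance(to: 42)
print(small)  // prints "42"

// A 128-bit distance such as 2^100 has no Int representation. Even
// UInt64.max already fails to convert on 64-bit platforms, so a
// 128-bit distance(to:) would have no choice but to trap there.
print(Int(exactly: UInt64.max) == nil)  // prints "true" on 64-bit platforms
```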
I'm aware their implementation strategy is somewhat different, but ultimately (for most purposes) the net result is the same - a 128-bit wide integer. I also don't think the fact that they're only used internally, for unit tests, negates the general problem of copy-paste reuse.
An important and regrettable point.
Would it make sense, then, to at least put DoubleWidth into its own Swift package (e.g. apple/DoubleWidth) given there are at least two Apple packages that [want to] use it? That way there'll at least only be one implementation of that specific type.
I realise that will then make it available to the community at large, but, so? It needs to work properly anyway - especially given it's relied on in unit tests for essential Apple libraries - so more users & more eyes on it will only help.
It doesn’t really rise to the level of a thing we want to put in a high-visibility package. If someone else wants to create one, the Swift license permits it and they should feel free to do so. For Numerics and Foundation, I think we’re fine with having two copies.
(Foundation and Numerics both have some weird Apple-specific layering restrictions because they’re pretty low-level, which feeds my reluctance to add dependencies for either, but which other packages shouldn’t need to care about at all.)
Wait a sec! Is DoubleWidth public? Can we use it outside of Apple? I am talking about the Swift Numerics implementation.
I always thought that it was private and used only for testing, so I never filed any issues. There are a few ways to crash it, but my attitude was: “well… it is a private type used only for tests, and it works for them, so whatever”.
Should I create an issue? I’m pretty busy at the moment, so this may take ~2 weeks.
It's not exported in any of Apple's packages. What @scanon was saying is that people are free to copy-paste it into a new package, or just their own app directly, and run with it from there.