I'm wondering if there are any best practices regarding integer types when interfacing with C APIs.
The C functions I'm using expect integers of the types UInt8, UInt32 and Int32. Should I convert these to/from Int for use in the rest of my app? If so, what are the performance implications of doing so? Or should I stick with these fixed width types and avoid any conversions?
I think it depends a little on what kind of API you are going for. If you are trying to hide the fact that you are interfacing with a low-level API, sticking with Int might be a good idea.
Personally, I would probably stick with the specific widths so that you can avoid conversion overflows and the like; it does away with a whole class of errors. It does, however, mean there will be a bit of an impedance mismatch with idiomatic Swift.
Can you post some concrete examples of APIs that you’re on the fence about? Because I agree with Ray_Fix that this really depends on what the API looks like.
For example, if I were wrapping AuthorizationCopyRights, which has a declaration like this:
The out_token parameter returns a CInt, aka Int32, in Swift, but I wouldn’t return that directly to my clients. Rather, I’d embed it in an opaque structure to ensure it’s used in a type-safe manner.
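The opaque-wrapper pattern Quinn describes might look something like this (a sketch with hypothetical names; only the wrapper type, never the raw Int32, is exposed to clients):

```swift
// Hypothetical opaque wrapper: clients see only `Token`, never the raw
// Int32, so they can't accidentally mix it up with other integers or
// do arithmetic on it.
public struct Token {
    // Visible only inside the wrapping module.
    internal var rawValue: Int32
}
```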
Share and Enjoy
Quinn “The Eskimo!” @ DTS @ Apple
[1] If the API had used size_t in C, it would have imported properly in the first place.
I'm using SDL and its related projects. Many APIs just use int in C:
typedef struct SDL_Point
{
    int x;
    int y;
} SDL_Point;
In Swift, the SDL_Point initializer becomes:
SDL_Point(x: Int32, y: Int32)
For now, I'm modelling this as follows, just sticking with Int32:
public struct Point {
    public var x: Int32
    public var y: Int32

    public init(x: Int32, y: Int32) {
        self.x = x
        self.y = y
    }

    func toSDL() -> SDL_Point {
        SDL_Point(x: x, y: y)
    }
}
Note that my goal is to completely hide all the underlying (imported) SDL types and expose only Swift types in the public API, which is why I don't just use SDL_Point in my code.
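In the same spirit, a conversion in the other direction keeps SDL_Point out of the public API for values coming back from C (a sketch; the initializer is my own addition, not part of the original post):

```swift
extension Point {
    // Hypothetical helper: rebuild the public Point from an SDL_Point
    // handed back by the C API.
    init(_ sdlPoint: SDL_Point) {
        self.init(x: sdlPoint.x, y: sdlPoint.y)
    }
}
```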
There is a nice piece of documentation regarding the creation of Swift interfaces for C code (via the ClangImporter).
I can't provide you with facts, but my assumption (as a user of Swift's C interface) is that, for example, both int32_t and Int32 are represented by the same type from the Builtin module (Swift's mysterious Builtin module – ankit.im) during compilation, and therefore I would assume no conversion is emitted by the compiler itself.
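For what it's worth, the standard library's own definition is consistent with this: Int32 is a trivial struct wrapping a Builtin type (a simplified sketch of the stdlib declaration, which is actually generated from .gyb templates):

```swift
// Simplified sketch of what the standard library declares:
@frozen
public struct Int32 {
    public var _value: Builtin.Int32
}
```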
This is unlike, for example, String to const char *, which takes advantage of compiler-inserted Swift methods like _convertConstStringToUTF8PointerArgument, listed in swift/KnownDecls.def at main · apple/swift · GitHub.
I would like to point out that I really base all of the above on my assumptions about how the compiler works, derived from some googling and clicking on random things in the Swift repo. Nevertheless, I don't have any reason to doubt myself :D
If that AuthorizationItemSet thing was an 'in' parameter to the C API, would you still use an Int? I would guess not, because then the Swift version could express things that you couldn't pass on to the C version.
The situation I struggle with around data structures is API evolution: the C library author is free to add more APIs that use the data structures in different directions. So I tend to feel that replicating the C widths is the way to go.
I agree with Ray_Fix that reducing the friction of interop with Swift code isn't worth the hidden rounding/coercion errors. Types are our friends...
My recent experience has been with wrapping a large C API that uses a variety of integer widths, sometimes consistently. The first pass pushed things to Int almost everywhere, but looking at the results, and bumping into these in/out issues along the way, it seems clear to me that this was the wrong choice and that it's better to reflect the actual API constraints.
When the C interface uses C types like int, long, unsigned long, and so on, you should avoid using the fixed-width types the clang importer gives you. This is because those C types can (technically) differ between platforms. Swift provides handy typealiases: CInt, CLong, CUnsignedLong, CShort, and so on. You should prefer these where those are the types used in the C interface. For SDL_Point, you'd want CInt.
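Following that advice, the Point wrapper from earlier would spell its fields with the C typealiases rather than explicit fixed widths (a sketch; behaviour is identical on current platforms, where CInt is just a typealias for Int32):

```swift
public struct Point {
    public var x: CInt   // tracks C's `int`, whatever width that is
    public var y: CInt

    func toSDL() -> SDL_Point {
        SDL_Point(x: x, y: y)
    }
}
```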
The importer originally did do that, but we eventually decided to back off from it because it didn’t add any information for someone writing a pure Swift app. On all the platforms Swift supports, short, int, and long long are Int16, Int32, and Int64.*
The exceptions here are char and long. char is annoying because it might be signed or unsigned depending on the platform. long is conveniently the same as Swift’s Int…except on Windows. But just having CLong and using standard types for the rest would have been weird.
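To make the long exception concrete (hedged: this reflects the LP64 vs LLP64 data models, which is where the difference comes from):

```swift
// On 64-bit Apple and Linux platforms (LP64), C's `long` is 64 bits,
// so CLong is the same size as Int. On 64-bit Windows (LLP64),
// `long` stays 32 bits, so CLong matches Int32 instead.
#if os(Windows)
let longWidth = MemoryLayout<CLong>.size   // 4 on 64-bit Windows
#else
let longWidth = MemoryLayout<CLong>.size   // 8 on 64-bit Unix-y platforms
#endif
```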
I don’t know if this was the right decision or not, but it’s interesting that there was a change and the change went the direction it did.
* Though I guess people have experimented with Swift on 16-bit platforms…
I'll stick to the fixed width types for now, and keep in mind the platform differences with long. I've only encountered one long so far, so hopefully, it's not that big of an issue.
If that AuthorizationItemSet thing was an 'in' parameter to the C API, would you still use an Int?
Yes.
then the Swift version could express things that you couldn't pass on to the C version.
Yeah, I’m happy to trade that limitation off against the usability improvements for my Swift clients.
There are two ways to handle this in the shim:
Trap
Throw
Most of the authorisation APIs can throw an error, so the second option is feasible. And it’s also pretty easy to do:
let count: Int = …
guard let count32 = UInt32(exactly: count) else {
    throw … something …
}
In situations where it’s not possible to throw an error, or I just can’t be bothered, trapping is fine. It’s easy to implement (the natural syntax, UInt32(count), does it) and I’m OK with traps like this because they are very unlikely to crop up in working code. Indeed, the most likely cause of the trap is someone trying to actively attack me (-:
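If the throwing conversion comes up a lot, it generalises nicely into a helper (a sketch; the error type and function name are hypothetical):

```swift
struct IntegerOverflowError: Error {}

// Converts between any two integer types, throwing instead of
// trapping when the value doesn't fit in the destination type.
func convert<T: BinaryInteger, U: BinaryInteger>(_ value: T) throws -> U {
    guard let result = U(exactly: value) else {
        throw IntegerOverflowError()
    }
    return result
}

// Usage: let count32: UInt32 = try convert(count)
```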