Proposal seed: gathering data to fix the NSUInteger inconsistency

Hey, swift-evolution. I want to draw attention to one of the oddest parts of the Objective-C importer today: NSUInteger. TLDR: NSUInteger is treated differently based on whether it comes from a system framework or a user-provided header file. I think this is silly and that we should treat it consistently everywhere, but I haven’t had time to go collect data to demonstrate that this is a safe change. Would someone like to take that on?

If so, read on. (Or jump to the last section, and read these “Background” sections later.)

## Background: Importing Integer Types from C

As everyone knows, the importer maps certain “known” Objective-C types to Swift types. This includes some mostly non-controversial mappings:

- Mapping fixed-sized integers: ‘int32_t’ to ‘Int32’
- Mapping common C types to fixed-sized integers: ‘unsigned short’ to ‘UInt16’
- Mapping C’s ‘long’ to Swift’s ‘Int’.*
- Mapping ‘intptr_t’ and ‘ptrdiff_t’ to ‘Int’, and ‘uintptr_t’ to ‘UInt’
- Mapping ‘NSInteger’ (and ‘CFIndex’) to ‘Int’

* ‘long’ is a pointer-sized integer on all common modern platforms except 64-bit Windows; we’ll have to do something different there. (‘CLong’ will always be the right type.)
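
To make that concrete, here’s roughly how a few declarations come through (a sketch; the C declarations here are made up):

```swift
// Hypothetical C/Objective-C declarations, and the Swift signatures
// the importer would produce for them:

// void setWidth(int32_t width);
func setWidth(_ width: Int32) {}               // int32_t -> Int32

// void setPort(unsigned short port);
func setPort(_ port: UInt16) {}                // unsigned short -> UInt16

// long fileOffset(void);
func fileOffset() -> Int { return 0 }          // long -> Int

// NSInteger indexOfSelectedItem(void);
func indexOfSelectedItem() -> Int { return 0 } // NSInteger -> Int
```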

And a few controversial ones:

- Both ‘size_t’ and ‘rsize_t’ are mapped to ‘Int’, not ‘UInt’. This is a pragmatic decision based on Swift’s disallowing of mixed-sign arithmetic and comparisons; if size_t and rsize_t really are used to represent sizes or counts in memory, they will almost certainly never be greater than Int.max. It’s definitely a tradeoff, though.
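
As an example of why that tends to work out in practice (the function is made up):

```swift
// C (hypothetical): size_t byteCount(const void *buffer);
// Imported into Swift with size_t mapped to Int:
func byteCount(_ buffer: UnsafeRawPointer) -> Int { return 0 }

// Because the result is Int, it mixes freely with ordinary Swift counts:
let bytes: [UInt8] = [1, 2, 3]
let fits = byteCount(bytes) <= bytes.count   // Int <= Int, no casts needed
```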

And finally we come to the strangest one, NSUInteger.

## Background: NSUInteger

In (Objective-)C, NSUInteger is defined to be a word-sized unsigned integer without any stated purpose, much like uintptr_t. It conventionally gets used:

1. to represent a size or index in a collection
2. as the base type of an enum defined with NS_OPTIONS
3. to store hash-like values
4. to store semantically-nonnegative 32-bit values, casually (as a compiler writer I’d suggest uint32_t instead)
5. to store semantically-nonnegative 64-bit values, casually (definitely not portable, would suggest uint64_t)
6. to store opaque identifiers known to be 32 or 64 bits (like 3, but either wasting space or non-portable)

(1) is actually the problematic case. Foundation fairly consistently uses NSUInteger for its collections, but UIKit and AppKit use NSInteger, with -1 as a common sentinel. In addition, the standard constant NSNotFound is defined to have a value of Int.max, so that it’s consistent whether interpreted as a signed or unsigned value.
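
To see why importing as Int is livable here, consider the standard sentinel check (this one uses real Foundation API):

```swift
import Foundation

let names: NSArray = ["a", "b", "c"]
// -indexOfObject: takes and returns NSUInteger in Objective-C,
// but imports as Int, so comparing against NSNotFound needs no casts:
let index = names.index(of: "d")
if index == NSNotFound {
    print("not found")
}
```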

For (2), the code really just wants a conveniently-sized unsigned value to use as a bitfield. In this case the importer consistently treats NSUInteger as UInt. We’re not going to talk about this case any more.
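
(For completeness, a sketch of that case, with a made-up option set:)

```swift
// Objective-C (hypothetical):
//   typedef NS_OPTIONS(NSUInteger, DownloadOptions) {
//       DownloadOptionsNone     = 0,
//       DownloadOptionsResume   = 1 << 0,
//       DownloadOptionsCellular = 1 << 1,
//   };
//
// Imported into Swift as an option set whose raw type stays UInt:
struct DownloadOptions: OptionSet {
    let rawValue: UInt
    static let resume   = DownloadOptions(rawValue: 1 << 0)
    static let cellular = DownloadOptions(rawValue: 1 << 1)
}

let opts: DownloadOptions = [.resume, .cellular]
```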

(3) is a lot like (2), except we don’t actually care what the sign of the value is. We just want to vary as many bits as possible when constructing a hash value; we don’t usually try to sort them (or add them, or compare them).

(4) is interesting; it’s entirely possible to have 32-bit counters that go past Int32.max. It’s not common, but it’s possible.

(5) seems much less likely than (4). Int64.max is really high, and if you’re already up in that range I’m not sure another bit will do you any good.

(6) is basically the same as (3); we don’t plan on interpreting these bits, and so we don’t really care what sign the type has.

Because of this, and especially because of the Foundation/*Kit differences, in Swift 1 we decided to import NSUInteger as Int, but only in system frameworks (and when not used as the raw type of an enum). In user frameworks, NSUInteger is consistently imported as UInt.
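
Concretely, the same declaration comes through two different ways today (MyCollection is a made-up user class):

```swift
import Foundation

// System framework: NSArray's -count is NSUInteger in Objective-C...
let systemCount: Int = NSArray().count   // ...but imports as Int

// User framework (hypothetical):
//   @interface MyCollection : NSObject
//   @property (readonly) NSUInteger count;
//   @end
// imports instead as:
//   var count: UInt { get }
// so the same declaration yields UInt, and mixing the two needs casts:
//   let total = systemCount + Int(myCollection.count)
```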

## The Problem

This is inconsistent. User frameworks should not have different rules from system frameworks. I’d like to propose that NSUInteger be imported as Int everywhere (except when used as the raw type of an enum). It’s not a perfect decision, but it is a pragmatic one, given that Swift is much stricter about mixing signedness than C. I’d hope the logic above convinces you that it won’t be a disaster, either—it hasn’t been for Apple’s frameworks.

The recommended idiom for “no, really a word-sized unsigned integer” would be ‘uintptr_t’, but unless you are actually trying to store a pointer as an integer it’s likely that uint32_t or uint64_t would be a better C type to use anyway.
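
In header terms (all hypothetical declarations):

```swift
// uintptr_t tagBits;   // "no, really a word-sized unsigned integer"
// imports as:
var tagBits: UInt = 0            // uintptr_t -> UInt

// uint32_t crc;        // usually the better choice anyway
// uint64_t totalBytes;
// import as:
var crc: UInt32 = 0              // uint32_t -> UInt32
var totalBytes: UInt64 = 0       // uint64_t -> UInt64
```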

For people who would suggest that Swift actually take unsigned integers seriously instead of using ‘Int’ everywhere, I sympathize, but I think that ship has sailed—not with us, but with all the existing UIKit code that uses NSInteger for counters. Consistently importing NSUInteger as UInt would be a massive source-break in Swift 4 that just wouldn’t be worth it. Given that, is it better to more closely model what’s in user headers, or to have consistency between user and system headers?

(All of this would only apply to Swift 4 mode. Swift 3 compatibility mode would continue to do the funny thing Swift has always done.)

## The Request

Consistently importing NSUInteger as Int would be a pretty major change to how we import existing Objective-C code, and has the potential to break all sorts of mixed-source projects, or even just projects with Objective-C dependencies (perhaps longstanding CocoaPods). Because of this, I’ve held off on proposing it for…a long time now. The last piece, I think, is to find out how Objective-C projects are using NSUInteger in their headers:

- Do they have no NSUIntegers at all?
- Are they using NSUInteger because they’re overriding something that used NSUInteger, or implementing a protocol method that used NSUInteger?
- Are they using NSUInteger as an opaque value, where comparisons and arithmetic are uninteresting?
- Are they using NSUInteger as an index or count of something held in memory?
- Are they using NSUInteger as the raw value of an NS_OPTIONS enum?
- Or is it something else? (These are the most interesting cases, which we probably want to write down.)

If the answers all land in one of these buckets, or even 90% in one of these buckets, then I think we’d be safe in proposing this change; if it turns out there are many interesting uses I didn’t account for, then of course we won’t. But I do think we need to do this research.

Is someone willing to go look at modern CocoaPods and sample code, and to ask other developers from major Swift-using companies, to find out how they’re using NSUInteger, and then write up their methodology and report back to us on swift-evolution? If you do this, I will be quite grateful.

Thank you!
Jordan

P.S. For Apple folks, this is rdar://problem/20347922 <rdar://problem/20347922>.

I have a framework I wrote that maps Objective-C objects to SQLite records, deriving SQLite schema definitions from property definitions. You simply derive model classes from my base class Model, and the base class will introspect the properties and handle all the SQL for you. A little like Core Data, but the property definitions are used for the meta-model instead of an external model file, and it is a lot leaner and more natural-feeling.

I picked NSUInteger for the auto-incremented primary key because, after all, it would never go negative.

However, when I tried to import this framework into Swift and use Model as a base class for a Swift class, I found it nearly impossible to satisfy the compiler about mixed-mode comparisons and ultimately changed the type to NSInteger.
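
To give a flavor of it (simplified, with made-up values): with the key imported as UInt, every comparison against an ordinary Int needed a conversion.

```swift
// With the property declared NSUInteger, it imports as UInt:
let primaryKey: UInt = 42
let rowCount = 100                  // ordinary Swift Int

// if primaryKey < rowCount { }     // error: UInt vs Int
if primaryKey < UInt(rowCount) { }  // cast required at every use

// Declared NSInteger instead, it imports as Int and just works:
let key = 42
if key < rowCount { }
```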

I was not happy about it and if I wasn't the framework author I would have thought harder about changing it.

···

On Feb 1, 2017, at 17:29, Jordan Rose via swift-evolution <swift-evolution@swift.org> wrote:

find out how Objective-C projects are using NSUInteger in their headers:

- Do they have no NSUIntegers at all?
- Are they using NSUInteger because they’re overriding something that used NSUInteger, or implementing a protocol method that used NSUInteger?
- Are they using NSUInteger as an opaque value, where comparisons and arithmetic are uninteresting?
- Are they using NSUInteger as an index or count of something held in memory?

Hey,

for completeness’ sake, I wrote a JIRA issue on this about a year ago, as suggested by Chris Lattner:

Opening that issue was the result of a question I asked on swift-users:
https://lists.swift.org/pipermail/swift-users/Week-of-Mon-20160111/000848.html

… where you answered with an explanation of the situation:
https://lists.swift.org/pipermail/swift-users/Week-of-Mon-20160118/000884.html

Also, about a year later, I can report that I haven’t run into this NSUInteger-as-Int oddity again, besides the case mentioned in my message from a year ago. That was a case of overriding a framework class: NSSegmentedCell uses NSUInteger-as-Int, but my subclass thereof uses NSUInteger-as-UInt.
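
Schematically, the mismatch looks like this (simplified stand-in types, not the real NSSegmentedCell API):

```swift
// Stand-in for the system class: NSUInteger parameters import as Int.
class SystemSegmentedCell {
    func setEnabled(_ flag: Bool, forSegment segment: Int) {}
}

// Stand-in for my Objective-C subclass: its own NSUInteger imports as UInt.
class MySegmentedCell: SystemSegmentedCell {
    func highlight(segment: UInt) {}
}

let cell = MySegmentedCell()
let segment = 2                             // Int
cell.setEnabled(true, forSegment: segment)  // fine: Int throughout
cell.highlight(segment: UInt(segment))      // my own method needs a cast
```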

Regards,

Marco

···

On 2017-02-02, at 02:29, Jordan Rose via swift-evolution <swift-evolution@swift.org> wrote:

Hey, swift-evolution. I want to draw attention to one of the oddest parts of the Objective-C importer today: NSUInteger. TLDR: NSUInteger is treated differently based on whether it comes from a system framework or a user-provided header file. I think this is silly and that we should treat it consistently everywhere, but I haven’t had time to go collect data to demonstrate that this is a safe change. Would someone like to take that on?


I’ve always considered our handling of signed/unsigned numbers a huge deficiency of Swift.

I appreciate that NSNotFound makes things difficult, but at the same time, eliminating these sentinel values is one of the big benefits for Obj-C developers of moving to Swift. It’s a bit disheartening that Apple’s own frameworks can’t use that. I wonder if the better answer wouldn’t be to introduce an analogue of Optional<NSUInteger> to Objective-C and supplement the signed-with-sentinel methods with optional-unsigned ones, which we would prefer when importing. 3rd-party code could migrate to that new type so that they can be imported in to Swift without sentinels.
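
To sketch the shape I mean (entirely hypothetical, using Array as the stand-in):

```swift
// Today, a sentinel-style API imports as:
//   func index(of obj: Any) -> Int     // NSNotFound on failure
//
// What I'd want to import instead (hypothetical):
//   func index(of obj: Any) -> UInt?   // nil on failure

extension Array where Element: Equatable {
    // The optional-unsigned shape, sketched on Array for illustration:
    func unsignedIndex(of element: Element) -> UInt? {
        guard let i = index(of: element) else { return nil }
        return UInt(i)
    }
}

let items = ["a", "b", "c"]
if let i = items.unsignedIndex(of: "d") {
    print(i)
} else {
    print("not found")   // no sentinel, no NSNotFound
}
```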

That could cause source breakage for Swift code, but I remember that members of the core team have said several times that they would like some kind of safe mixed-type arithmetic and comparison to become part of the language one day. I feel that is ultimately the better way to go in the long run. I’ve always found it strange that Swift often promotes safety over convenience (e.g. exhaustive switches, trapping on overflow), but then in this one case takes the exact opposite approach.

Since I’m on the subject, I’d also support Array<T>’s index becoming a UInt and all of its index-related methods becoming generic to take any kind of integer. Or, to put it another way, we could make Array.Index == Any<IntegerProtocol> (an existential), so you could feed indexes in and pull them out in whichever integer type you need — after all, the exact type of the index is not necessarily fundamental to what an Array represents; all that matters is that its elements have a contiguous space of integrally-advancing offsets.
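
For what it’s worth, numericCast already papers over part of this at call sites; what I’m describing would build that convenience into the subscripts themselves (a sketch):

```swift
let items = ["a", "b", "c"]
let i: UInt32 = 1

// Array.Index is Int today, so any other integer type needs a conversion:
let item = items[numericCast(i)]   // numericCast converts UInt32 -> Int

// The generic version sketched above would accept the index directly:
//   subscript(index: Any<IntegerProtocol>) -> Element   (hypothetical)
// so that items[i] works for any integer type.
```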

- Karl

···

On 2 Feb 2017, at 02:29, Jordan Rose via swift-evolution <swift-evolution@swift.org> wrote:

For people who would suggest that Swift actually take unsigned integers seriously instead of using ‘Int’ everywhere, I sympathize, but I think that ship has sailed—not with us, but with all the existing UIKit code that uses NSInteger for counters. Consistently importing NSUInteger as UInt would be a massive source-break in Swift 4 that just wouldn’t be worth it. Given that, is it better to more closely model what’s in user headers, or to have consistency between user and system headers?

Shouldn't NSUInteger always become UInt in Swift?

···

On Thu, Feb 2, 2017 at 12:07 AM Freak Show via swift-evolution <swift-evolution@swift.org> wrote:

I have a framework I wrote that maps Objective-C objects to SQLite records, deriving SQLite schema definitions from property definitions. You simply derive model classes from my base class Model, and the base class will introspect the properties and handle all the SQL for you. A little like Core Data, but the property definitions are used for the meta-model instead of an external model file, and it is a lot leaner and more natural-feeling.

I picked NSUInteger for the auto-incremented primary key because, after all, it would never go negative.

However, when I tried to import this framework into Swift and use Model as a base class for a Swift class, I found it nearly impossible to satisfy the compiler about mixed-mode comparisons and ultimately changed the type to NSInteger.

I was not happy about it and if I wasn't the framework author I would have thought harder about changing it.


Shouldn't NSUInteger always become UInt in Swift?

Jordan answers this question in his email:

For people who would suggest that Swift actually take unsigned integers seriously instead of using ‘Int’ everywhere, I sympathize, but I think that ship has sailed—not with us, but with all the existing UIKit code that uses NSInteger for counters. Consistently importing NSUInteger as UInt would be a massive source-break in Swift 4 that just wouldn’t be worth it. Given that, is it better to more closely model what’s in user headers, or to have consistency between user and system headers?

···

On 2 Feb 2017, at 14:52, Derrick Ho via swift-evolution <swift-evolution@swift.org> wrote:

Shouldn't NSUInteger always become UInt in Swift?

What if we import it as UInt, but have an annotation that frameworks can add saying that it should be imported as Int instead? Then have a bot apply that annotation to the system frameworks wherever the current system would import it as Int. Then there are no source-breaking changes (beyond frameworks where you explicitly choose to remove the annotation because the breaking change is considered better than the status quo).

Thanks,
Jon

···

On Feb 2, 2017, at 6:11 AM, David Hart via swift-evolution <swift-evolution@swift.org> wrote:

On 2 Feb 2017, at 14:52, Derrick Ho via swift-evolution <swift-evolution@swift.org> wrote:

Shouldn't NSUInteger always become UInt in Swift?

Jordan answers this question in his email:

For people who would suggest that Swift actually take unsigned integers seriously instead of using ‘Int’ everywhere, I sympathize, but I think that ship has sailed—not with us, but with all the existing UIKit code that uses NSInteger for counters. Consistently importing NSUInteger as UInt would be a massive source-break in Swift 4 that just wouldn’t be worth it. Given that, is it better to more closely model what’s in user headers, or to have consistency between user and system headers?

I don’t think we can sell such an annotation. We’d have to first generate a vast number of diffs (or API notes files) for all the APIs currently affected by this, then ask Apple framework owners to review and maintain them going forward. I think that’s a non-starter for us.

Jordan

···

On Feb 2, 2017, at 07:33, Jonathan Hull <jhull@gbis.com> wrote:


What if we import it as UInt, but have an annotation that the frameworks can add saying that it should be imported as Int instead? Then have a bot apply that annotation to the system frameworks where it would import it as Int in the current system. Then there are no source breaking changes (beyond frameworks where you explicitly choose to remove the annotation because the breaking change is considered better than the status quo).

Yeah, that's what I want: more "annotations" cluttering up my Objective-C headers to make Swift happy.

No thanks. There's enough noise introduced already.

···

On Feb 2, 2017, at 07:33, Jonathan Hull via swift-evolution <swift-evolution@swift.org> wrote:


What if we import it as UInt, but have an annotation that the frameworks can add saying that it should be imported as Int instead? Then have a bot apply that annotation to the system frameworks where it would import it as Int in the current system. Then there are no source breaking changes (beyond frameworks where you explicitly choose to remove the annotation because the breaking change is considered better than the status quo).

Thanks,
Jon

It'd be cluttering up Apple's code, not yours (unless you work for them, of course).

- Dave Sweeris

···

On Feb 2, 2017, at 09:52, Freak Show via swift-evolution <swift-evolution@swift.org> wrote:

Yeah, that's what I want - more "annotations" cluttering up my Objective C headers to make Swift happy.

No thanks. There's enough noise introduced already.