Why hard-code octet-sized bytes?


(Daryle Walker) #1

When I first looked into Swift, I noticed that the base type was called “UInt8” (and “Int8”) and not something like “Byte.” I know modern computers have followed the bog-standard 8/16/32(/64) architecture for decades, but why hard-code it into the language/library? Why should 36-bit processors with 9-bit bytes, or processors that start at 16 bits, be excluded right off the bat? Did you guys see a problem with how (Objective-)C(++) had to define its base types in a mushy way to accommodate the possibility of non-octet bytes?

BTW, is there an equivalent of CHAR_BIT, the number of bits per byte, in the library? Or are we supposed to hard-code an “8”?



Daryle Walker
Mac, Internet, and Video Game Junkie
darylew AT mac DOT com


(Robert Widmann) #2

You raise an interesting point. To explore this further: we could definitely just lower a lot of it to the appropriate integer-width arithmetic in LLVM. I suspect the limitations of the standard library implementation you bring up exist because "nonstandard" types such as these don't show up when we have to bridge C and ObjC, so generalizing over the entire space hasn't been as much of a priority. Doing so would also seem to require the ability to use, say, integer literals in generics, as C++ allows.

As for the char size issue, we define both sizeof and a platform-dependent CChar typealias that you can measure against.
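
A minimal sketch of what that measurement might look like (MemoryLayout is the current spelling; the sizeof(_:) function was the equivalent at the time this was written):

```swift
// Size of the C `char` type on this platform, measured in bytes.
let charSize = MemoryLayout<CChar>.size   // 1 on every platform Swift supports

// There is no CHAR_BIT constant in the standard library; a byte is
// assumed to be 8 bits, so bits-per-char is simply size * 8.
let charBits = charSize * 8
print(charSize, charBits)                 // 1 8
```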

~Robert Widmann



(Félix Cloutier) #3

Out of curiosity, can you name an architecture that doesn't use 8-bit bytes?

Félix



(Saagar Jha) #4

I'm not quite sure what you mean. Swift has a type called Int8 that represents numbers from -128 to 127 using 8 bits. I don't see how this "excludes" computers.


--
-Saagar Jha


(Chris Lattner) #5

Given that there are no 9-bit byte targets supported by Swift (or LLVM), it would be impossible to test that configuration, and it is well known that untested code doesn’t work. As such, introducing a Byte type which is potentially not 8 bits in size would only add cognitive overload. Any promised portability benefit would simply mislead people.

-Chris



(David Sweeris) #6

IIRC, a bunch of Ye Olde systems used 6-bit bytes. And I think 36-bit ints were used in a few architectures, but don't quote me on that.

- Dave Sweeris



(Daryle Walker) #7

I wasn’t proposing adding “Byte,” etc. now, but asking why the originators didn’t go for width-agnostic names for the base numeric types way back when, INSTEAD of what we do have. I know that nowadays any violators would be embedded systems that start off using 16 or 32 bits for everything. We’re more likely to go to 128 bits than to any non-power-of-2 width in the future.

(I don’t know what names we would use besides the “Byte” and “Int” ones. Since the “Int” type is supposed to move up with processor improvements, I guess we should go with “Short,” “ShortShort,” etc. instead of multiple “long”s like C++ did.)

If we had value-based generic arguments, we could have had something like a set of “Integer&lt;width: Int&gt;” types, with “exact” and “at least” variants. We would still need “Byte” and “Int” unless we provide constants for the environment’s minimum and optimized bit widths (and even then, the type aliases would be handy).
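
A purely hypothetical sketch of what that might look like, assuming value generic parameters existed; none of these types or the parameter syntax are real Swift, the names are illustrative only:

```swift
// Hypothetical: integer types parameterized by bit width.
struct Integer<let width: Int> { /* exactly `width` bits */ }
struct IntegerAtLeast<let width: Int> { /* at least `width` bits */ }

// The familiar fixed-width names would then just be aliases.
typealias Byte       = Integer<8>           // today's UInt8
typealias Int16Exact = Integer<16>          // today's Int16
typealias FastInt    = IntegerAtLeast<16>   // "at least 16 bits", like C's int_fast16_t
```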



Daryle Walker
Mac, Internet, and Video Game Junkie
darylew AT mac DOT com


(Robert Widmann) #8

Old old old architectures. We're talking Multics days.

~Robert Widmann
