Bool to Int

It seems I can't do this:

let r = Int(a > b)

but I can do it with a literal:

let r = Int(true)

I'd like to do this to implement signum without branching, but perhaps that's not possible.
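For reference, one branch-free formulation subtracts the two comparison results; the ternaries in a pattern like this usually lower to flag-setting instructions rather than actual jumps, though that is a property of the optimizer, not a guarantee (a sketch only):

```swift
// Sketch: signum without an explicit if/else statement.
// Each ternary typically compiles to a setcc/csel-style instruction.
func signum(_ x: Int) -> Int {
    return (x > 0 ? 1 : 0) - (x < 0 ? 1 : 0)
}

print(signum(7), signum(-3), signum(0))  // 1 -1 0
```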

--
Rick Mann
rmann@latencyzero.com

Except in that case true isn’t a Bool but an NSNumber, which is why you can initialize an Int from it. It seems trivially easy to add an Int extension to do what you want though.

Jon

On Nov 20, 2016, at 10:48 PM, Rick Mann via swift-users <swift-users@swift.org> wrote:

_______________________________________________
swift-users mailing list
swift-users@swift.org
https://lists.swift.org/mailman/listinfo/swift-users

In general this is correct behaviour, because literals in Swift are untyped. Int has no initializer that takes a Bool, so the compiler searches the parameter types of Int's initializers (Int.init(_: TYPE)) for a type that conforms to ExpressibleByBooleanLiteral. In your case that resolution settles on NSNumber.

The thing is, when you write Int(a > b) you are passing a Bool, not a literal. Here the compiler cannot fall back to NSNumber, and it reports an error because Int.init(_: Bool) does not exist.

--
Adrian Zubarev
Sent with Airmail

On November 21, 2016 at 04:48:35, Rick Mann via swift-users (swift-users@swift.org) wrote:


Except in that case true isn’t a Bool but an NSNumber, which is why you can initialize an Int from it. It seems trivially easy to add an Int extension to do what you want though.

Is there a way that avoids branching?

So, according to Xcode, "true" and "a > b" both have type "Bool". I don't know why the compiler allows one and not the other, except that one is a literal, and I guess there's a "BoolLiteralConvertible" (or equivalent) for the types.

For now I'm doing what I need with branching, but it would be nice to find a more efficient way.

On Nov 20, 2016, at 19:52 , Jon Shier <jon@jonshier.com> wrote:


--
Rick Mann
rmann@latencyzero.com

Is there a way that avoids branching?

Don’t know. Have you profiled

let r = a > b ? 1 : 0

to know it is an issue?

So, according to Xcode, "true" and "a > b" both have type "Bool". I don't know why the compiler allows one and not the other, except that it's literal, and I guess there's a "BoolLiteralConvertible" (or equivalent) for the types.

You are importing Foundation, which gets you the bridging code.

let r = Int(true)

is an error without Foundation. With Foundation you get this:

extension NSNumber : ExpressibleByFloatLiteral, ExpressibleByIntegerLiteral, ExpressibleByBooleanLiteral {

    // [snip]

    /// Create an instance initialized to `value`.
    required public convenience init(booleanLiteral value: Bool)
}

and this

extension Int {

    public init(_ number: NSNumber)
}

which when combined make `let r = Int(true)` work. I haven’t profiled the code but would suspect that `let r = a > b ? 1 : 0` might be more efficient.

I'll try profiling it (or looking at the generated assembly).

Thanks!

On Nov 20, 2016, at 21:15 , Marco S Hyman <marc@snafu.org> wrote:


--
Rick Mann
rmann@latencyzero.com

This is so confusing. "Literals are untyped", but there’s a “BooleanLiteral”, which is obviously of type Boolean.

-Kenny

On Nov 21, 2016, at 2:49 AM, Adrian Zubarev via swift-users <swift-users@swift.org> wrote:


Agreed.

On Nov 21, 2016, at 09:46 , Kenny Leung via swift-users <swift-users@swift.org> wrote:

--
Rick Mann
rmann@latencyzero.com

A literal doesn’t have a type on its own. Instead, a literal is parsed as having infinite precision and Swift’s type inference attempts to infer a type for the literal.

Source
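The "infinite precision" part is observable directly: an integer literal too large for any fixed-width integer type is still accepted when the context asks for a Double, because the literal is parsed with arbitrary precision before a type is chosen (a quick sketch):

```swift
// let n = 10_000_000_000_000_000_000_000   // error: overflows even Int64
// The identical characters are fine here, because the literal is parsed
// with arbitrary precision first, then converted to the inferred type:
let d: Double = 10_000_000_000_000_000_000_000
print(d >= 1e21)  // true
```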

--
Adrian Zubarev
Sent with Airmail

On November 21, 2016 at 18:46:32, Kenny Leung via swift-users (swift-users@swift.org) wrote:


I don’t see what there is to be confused about.

A “literal” is literally a bunch of characters in source code. The compiler interprets those characters as representing whatever type is appropriate to the context.

For the case at hand, a boolean literal can be interpreted as any type which conforms to the ExpressibleByBooleanLiteral protocol. If the context provides no information, the compiler defaults to interpreting a boolean literal as representing a Bool.

The situation is similar for every other kind of literal. For example, “2” defaults to being interpreted as an Int, but if the context requires a Double then it will be interpreted as a Double. The text “2” does not have a type of its own.

Nevin
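That context dependence is easy to check in a playground (a quick sketch):

```swift
// The same literal text gets its type from context.
let flag = true        // defaults to Bool
let n = 2              // defaults to Int
let x: Double = 2      // identical literal, now a Double
print(type(of: flag), type(of: n), type(of: x))  // Bool Int Double
```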

On Mon, Nov 21, 2016 at 3:55 PM, Rick Mann via swift-users <swift-users@swift.org> wrote:


The situation is similar for every other kind of literal. For example, “2” defaults to being interpreted as an Int, but if the context requires a Double then it will be interpreted as a Double. The text “2” does not have a type of its own.

Except it does, because if I write

  let a = 2

a is of type Int (at least, according to Xcode's code completion). But this gives inconsistent results:

  let t = true

  let a = Int(true)
  let b = Int(t) // Error

I find this to be very inconsistent and confusing.

On Nov 21, 2016, at 13:14 , Nevin Brackett-Rozinsky <nevin.brackettrozinsky@gmail.com> wrote:


--
Rick Mann
rmann@latencyzero.com

Except it does, because if I write

  let a = 2

a is of type Int (at least, according to Xcode's code completion).

and if you write

  let b = 2 + 0.5

2 is treated as a Double. The type of the literal “2” varies with context. Do you also find that inconsistent and confusing?

But this gives inconsistent results:

  let t = true

  let a = Int(true)
  let b = Int(t) // Error

I find this to be very inconsistent and confusing.

t is a Bool and there is no automatic conversion from Bool to Int.

true is not a Bool. It may be treated as a Bool depending upon context. In the line `let t = true` it is treated as a Bool. In `let a = Int(true)` it is treated as an NSNumber (assuming you import Foundation).

Marc

Except it does, because if I write

  let a = 2

a is of type Int (at least, according to Xcode's code completion).

and if you write

  let b = 2 + 0.5

2 is treated as a double. The type of the literal “2” varies with context. Do you also find that inconsistent and confusing?

Nope. I can see how the promotion works. Also, Xcode would tell me b is a Double.

But this gives inconsistent results:

  let t = true

  let a = Int(true)
  let b = Int(t) // Error

I find this to be very inconsistent and confusing.

t is a Bool and there is no automatic conversion from Bool to Int.

true is not a Bool. It may be treated as a Bool depending upon context. In the line `let t = true` it is treated as a Bool. In `let a = Int(true)` it is treated as an NSNumber (assuming you import foundation).

That may be what's happening, but it's still confusing and unintuitive. That something is lost in the transitivity of going through a variable, aside from "literalness", is confusing.

And really, it would be nice if the language provided a fast way of getting a number 1 out of a Bool value true (and 0 out of false). But that conversation is a bigger can of worms than I care to open right now.

On Nov 21, 2016, at 15:09 , Marco S Hyman <marc@snafu.org> wrote:

--
Rick Mann
rmann@latencyzero.com

Where is your problem here? It’s simple and easy ;)

extension Integer {

    init(_ boolean: Bool) {
        self = boolean ? 1 : 0
    }
}

Int(10 > 4)
UInt32(1 <= 2)

--
Adrian Zubarev
Sent with Airmail

On November 22, 2016 at 00:54:47, Rick Mann via swift-users (swift-users@swift.org) wrote:


Hi Marc.

My old mechanical engineering prof used to say, “C is easy if you know assembler”.

The fact that such a simple construct does not work and requires such a long explanation - which may still not be understood by a newbie - is a problem that should be addressed.

Even this explanation requires that you “see inside” the compiler to know what it’s “thinking”. And the fact that NSNumber comes into this makes it more interesting. What would be the behaviour (or at least the error message) on Linux, where there is no NSNumber? (or is there? I’m even more confused - have to try it out for myself).

We are also getting complacent when “A literal doesn’t have a type on its own. Instead, a literal is parsed as having infinite precision and Swift’s type inference attempts to infer a type for the literal.” gets condensed down to “literals in Swift are untyped”. I don’t think this helps the explanation when there really is a distinction between different kinds of literals (otherwise there wouldn’t be things like ExpressibleBy*Boolean*Literal).

I think part of it is the way the documentation itself is worded. Another part here is the weird side effect Objective-C compatibility brings into the picture.

I think I’m turning this into a swift-evolution topic:
* should Int(Bool) be supported in the standard library?
** if so, then Int(t) should work here
** if not, then Int(true) should also error to avoid confusion

-Kenny

On Nov 21, 2016, at 3:09 PM, Marco S Hyman via swift-users <swift-users@swift.org> wrote:


Want to see some real magic?

struct A : _ExpressibleByBuiltinIntegerLiteral {
     
    init(_builtinIntegerLiteral value: _MaxBuiltinIntegerType) {}
}

struct B : ExpressibleByIntegerLiteral {
     
    init(integerLiteral value: A) {
         
        print(type(of: value))
    }
}

let b: B = 42 // prints "A"

My integer literal type is now A and not any of the (U)Int(8,16,32,64) family.

--
Adrian Zubarev
Sent with Airmail

On November 22, 2016 at 19:36:26, Kenny Leung via swift-users (swift-users@swift.org) wrote:


Sorry, I really don’t understand what’s going on with this code.

So in that respect, I guess it is magic.

-Kenny

On Nov 22, 2016, at 11:21 AM, Adrian Zubarev <adrian.zubarev@devandartist.com> wrote:


extension Bool {

    func convertToDouble() -> Double {
        return self ? 1 : 0
    }
}

Or just replace all the Double(s) with Int.

.xcplaygroundpage:9:40: error: cannot find type '_MaxBuiltinIntegerType' in scope
    init(_builtinIntegerLiteral value: _MaxBuiltinIntegerType) {}
                                       ^~~~~~~~~~~~~~~~~~~~~~

In case you didn’t notice, the sample code in this thread is 6 years old, so it’s not surprising it doesn’t work.
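For anyone landing here today: the underscored _ExpressibleByBuiltinIntegerLiteral machinery was never public API, and the old Integer protocol from the extension earlier in the thread is gone. A version of that Bool-to-integer extension that compiles on current Swift might look like this (a sketch using BinaryInteger, which replaced Integer):

```swift
extension BinaryInteger {
    /// Sketch: 1 for true, 0 for false.
    init(_ boolean: Bool) {
        self = boolean ? 1 : 0
    }
}

print(Int(10 > 4), UInt32(1 <= 2))  // 1 1
```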
