Is it possible to create `and` and `or` extension for `Bool` type?
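It is. As a minimal sketch, one could define `and`/`or` methods on `Bool` (these names are invented here, not standard library API), using `@autoclosure` so the right-hand side still short-circuits like `&&`/`||` do:

```swift
extension Bool {
    // Autoclosure keeps the short-circuit behaviour of && and ||:
    // `other` is only evaluated when needed.
    func and(_ other: @autoclosure () -> Bool) -> Bool {
        self ? other() : false
    }
    func or(_ other: @autoclosure () -> Bool) -> Bool {
        self ? true : other()
    }
}

print(true.and(false))   // false
print(false.or(true))    // true
```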

.& - for what types?

I firmly believe it was motivated by the fact that the PDP-11 has an instruction to jump-if-not-zero: BNE. See the processor handbook, page 4-36. All modern CPUs have an equivalent.

SIMDMask

You are conflating two different issues.

  1. Whether bitwise and boolean operators can use the same symbols without confusion.
  2. Whether there is a semantic difference between the two.

I am claiming there is a semantic difference, and therefore, even if there would be no syntactic confusion by reusing the symbols, there would be a conflation of semantics that are better left separated. The latter is my opinion.

this is casuistry. by the same logic you could say there's a semantic difference between Int + and Float +, as the former can trap on overflow (in swift), or wrap around (in C, or in swift when using the &+ variant), and the latter doesn't.
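The overflow behaviours mentioned here are easy to demonstrate; a small sketch:

```swift
// `&+` wraps around on overflow, while plain `+` on Int8 would trap
// at runtime (Int8.max + 1 is a crash, not a value).
let wrapped = Int8.max &+ 1
print(wrapped)   // -128

// Float `+`/`*` neither traps nor wraps: it saturates to infinity.
let huge = Float.greatestFiniteMagnitude * 2
print(huge.isInfinite)   // true
```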


&& operators do not short circuit themselves.... it's only:

static func && (lhs: Bool, rhs: @autoclosure () throws -> Bool) rethrows -> Bool

that does. and other similar "func &&" implementations that do the same.

it is perfectly legal to have

func && (a: SomeType, b: SomeType) -> ...

that doesn't short circuit. if that was a "no-no" it would be enforced at the language level and the short circuiting behaviour would be the feature of the operator (somehow encoded in operator declaration) and it isn't.

that explains false=0 but doesn't explain true=1 though :slight_smile:

It explains both false == 0 and true == !0. That's the point -- the fact that any non-zero value is considered true is an accident of the architecture for which C was designed. There are languages which have a much more constrained idea of what value is considered true.

If we were to overload && or & to mean the other, would you do the same for ! and ~?

definitely. actually i did this a few days ago in my codebase (! changed to ~ and != changed to <>), so now each and every "!" symbol means something unsafe (optional unwrapping or as! cast).
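The renaming described could look something like this; a sketch, where the `<>` operator and the `~` overload are custom additions, not standard library API:

```swift
// Reuse the existing prefix operator `~` for Bool negation.
prefix func ~ (value: Bool) -> Bool {
    !value
}

// Introduce `<>` as an inequality operator.
infix operator <> : ComparisonPrecedence

func <> <T: Equatable>(lhs: T, rhs: T) -> Bool {
    lhs != rhs
}

print(~true)    // false
print(1 <> 2)   // true
```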

to be clear, i am not proposing at this date and age to unify & and &&. it is too late now. 5 years ago that would be possible, today the language is way too stable for that.

Surely, but my point, that precedence plays a role and that & and && differ in that regard, still stands?

PS,
As I said before, the fact that it isn’t illegal doesn’t make it a good idea. We can probably all agree that _some_name_ is not a Swift-y identifier for anything.


yep. that actually stopped me from changing && to &.

my claim was that the difference in precedence is not fundamental: you are not getting a completely different language if you change the & precedence to be that of &&. for example in C both && and & were of a lower precedence compared to == (this is not the case in swift), and those who used C might not even remember that.
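The precedence difference is visible directly in Swift, where `&` (MultiplicationPrecedence) binds tighter than `==` (ComparisonPrecedence); a small sketch:

```swift
// In Swift this groups as (mask & flag) == flag.
// In C, `&` has *lower* precedence than `==`, so the same
// unparenthesized expression would group as mask & (flag == flag).
let mask: UInt8 = 0b0110
let flag: UInt8 = 0b0010
let isSet = mask & flag == flag
print(isSet)   // true
```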

are you aware of any language guideline on docs.swift.org that states "always short-circuit &&" and "never short circuit &"? otherwise why do you think it is a bad idea?

IMHO things like these shall not even be guidelines... if it is really bad (as you imply it is) it must be enforced in the language, e.g. this:

operator && : LogicalConjunctionPrecedence, AlwaysShortCircuits

operator & : MultiplicationPrecedence, NeverShortCircuits

func && (a: SomeType, b: SomeType) -> ... // error: shall short circuit

and if it is not enforced in the language --> it is not bad.

good luck with that. i, for one, think that non ascii characters in identifiers are bad and should be ditched from the language... ten developers -> twelve opinions.

It’s not bad per se, but there is already precedent, and it is one thing (amongst others) people who write foo() && bar() would expect.

Embrace precedent. Don’t optimize terms for the total beginner at the expense of conformance to existing culture.

While it’s not optimizing for the beginner, to me it still breaks away from the existing culture and at least feels unjustified.

Nonetheless, short-circuiting was never part of the language, just a convention that has been around.

Heh :stuck_out_tongue:

and then they removed ++ because, in particular, "These operators increase the burden to learn Swift as a first programming language", despite, in particular, "People coming to Swift from these other languages may reasonably expect these operators to exist" :slight_smile:

i think it is part of the language, as it is defined in the language documentation in regard to &&, || and the ternary operator.

let me give you an example: you are implementing a vector by number multiplication and for some reason you want it to short-circuit the execution of the first operand:

func * (a: @autoclosure () -> Vector, b: Double) -> Vector {
    return abs(b) > eps ? a().mult(b) : .zero
}

would that be perceived as a mistake, or "breaking a convention" by the users of that operator? i very much doubt so.
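A self-contained, runnable version of that sketch, where `Vector`, `eps` and `mult(_:)` are hypothetical names invented for illustration:

```swift
struct Vector {
    var components: [Double]
    static let zero = Vector(components: [])
    func mult(_ factor: Double) -> Vector {
        Vector(components: components.map { $0 * factor })
    }
}

let eps = 1e-12

func * (a: @autoclosure () -> Vector, b: Double) -> Vector {
    // The left operand is evaluated only when |b| is meaningfully
    // non-zero; for a (near-)zero factor it is skipped entirely.
    abs(b) > eps ? a().mult(b) : .zero
}

var leftEvaluated = false
func makeVector() -> Vector {
    leftEvaluated = true
    return Vector(components: [1, 2, 3])
}

let r = makeVector() * 0.0   // makeVector() is never called
print(leftEvaluated)         // false
```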

If it ends up with "oops, we put it in the wrong place, just use parentheses", then that defeats the sole purpose of precedence.

Surely, then they come with the justification. It's always about trade-off. And that's why it's a guideline, not a language restriction.
To me breaking convention so we can reuse the symbol doesn't seem like a good one.

That'd depend on the common usage of the operator. If the operands usually have side effects, or are costly to evaluate, then that does come into play when designing (as && is generally the operator combining two expressions with side-effect potential).

So I agree that * may not exactly be breaking convention, or rather that there is little impact from the break, if the usage usually surrounds pure functions. I still can't say the same for &&.

the & precedence in C was "oops".. not the end of the world though.

that was quite annoying to me, to the extent that i introduced my own ++
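Reintroducing a postfix `++` after its removal in Swift 3 is a few lines; a sketch:

```swift
// A custom postfix increment: returns the old value, then bumps it,
// matching the classic C-style semantics of i++.
postfix operator ++

postfix func ++ (value: inout Int) -> Int {
    defer { value += 1 }
    return value
}

var i = 5
let old = i++
print(old, i)   // 5 6
```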

it's too late of course but i'd prefer it to be more explicit rather than by convention. pseudo code:

a && b // error, right side is short-circuiting
a && {b} // ok
a * {b} // error, right side is not short-circuiting
a * b // ok

in the vector by number example above that would be:

v * n // error, left side is short-circuiting
{v} * n // ok

that way it is clear on the caller side what's going on, and can be further extended to other use cases:

{a} && b // compiler: "ok sir, i know what you mean here, you want me to evaluate b first and see how it goes"

Yeah, still, not as good as it could be. No reason to fall into the same trap if you see it beforehand.

Perhaps, but short-circuiting is much rarer in the language at large. Only a handful of things like

  • (||, true), (|, -1)
  • (&&, false), (&, 0)
  • (*, 0)
  • (anything, NaN)

has potential for short-circuiting. Even then, so far, only || and && are common enough for one to bother checking. Nonetheless, making a non-autoclosure version would do pretty much what you're writing.
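That non-autoclosure variant can be sketched by overloading `&&` with an explicit closure parameter, so the deferred evaluation shows up as braces at the call site (a sketch, not standard library behaviour):

```swift
// An overload of && taking an explicit () -> Bool on the right:
// the caller supplies the braces, making the laziness visible.
func && (lhs: Bool, rhs: () -> Bool) -> Bool {
    lhs ? rhs() : false
}

var rightEvaluated = false
let rhs: () -> Bool = {
    rightEvaluated = true
    return true
}

print(false && rhs)     // false; the closure is never called
print(rightEvaluated)   // false
```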

yep, they are the only ones which do, plus ?? and ternary

the non-autoclosure version will just evaluate all of its arguments before entering the function.

another way to "make conventions the rules" is something along these lines:

operator && (normal, autoclosure) : LogicalConjunctionPrecedence

What do you mean? The non-autoclosure version will have the same semantics as the autoclosure one, only you need to add the braces yourself.