Apparently we don't read those the same way. I don't think this adds consistency (of choice), correctness, performance, or even composability to what we already have. Commonality and fluency may be arguable if you're coming from the C family of languages, but that'd be a stretch given Swift's neutral stance toward C (and its family).
The way I see it: suppose you come from university with some math courses under your belt, never wrote any C, and you start writing a Swift app where you need to AND two toggle buttons. You search for AND and learn that it's written `&&`.
One day you might need XOR, and someone explains that you can just use `!=` for that. Then you'd probably sit down and write out the truth table first to convince yourself that it actually works.
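For what it's worth, the truth table does check out. A minimal sketch that verifies it exhaustively (the `xor` helper is just a reference implementation via integers, for comparison):

```swift
// Reference XOR via integers, purely to check `!=` against.
func xor(_ a: Bool, _ b: Bool) -> Bool {
    ((a ? 1 : 0) ^ (b ? 1 : 0)) == 1
}

// Walk the whole truth table: `!=` on Bool agrees with XOR in all four cases.
for a in [false, true] {
    for b in [false, true] {
        assert((a != b) == xor(a, b))
    }
}
```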
This feels slightly old-fashioned to me, like an old-school C solution. Like languages where booleans are just unsigned integers, 0 is false and everything else is true, and you can use addition as OR and multiplication as AND (I guess?). Or where you use a single bit and `++` becomes a toggle. These are neat tricks, but they somehow don't feel modern.
And I would consider it a lucky accident that these operators can replace a dedicated AND and OR. Clearly `+` was not designed with this in mind; it just happens to work.
This is probably a bit vague. Let me explain.
You'd now have `!=` vs `^` with the same behaviour but different precedence and associativity. This is so subtle that I think not many would realize it at first glance, and even fewer would appreciate it. It's not adding to consistency of choice when you have two things whose differences usually won't matter.
For correctness, if you mess up `!=`, you'll also mess up `^`... For performance and composability, they're essentially the same as `!=`. `^` might be a little more composable since it'd have higher precedence and associativity, though I'd say not by much.
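The precedence point is easy to see with integers, where `^` already exists: Swift declares `^` at `AdditionPrecedence` (left-associative), so it chains, while `!=` sits in the non-associative `ComparisonPrecedence`, so chaining it needs explicit parentheses. A quick sketch:

```swift
let a = true, b = false, c = true

// `!=` is in a non-associative precedence group, so the parentheses are
// required here; `a != b != c` on its own would not compile.
let viaNotEqual = (a != b) != c

// The existing integer `^` is left-associative and chains freely;
// a hypothetical Bool `^` reusing that declaration would behave the same.
let viaInts = 1 ^ 0 ^ 1   // parses as (1 ^ 0) ^ 1
```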
That's hardly helping the case. You could apply this argument to anything: NAND, XNOR, etc. It's like you're omitting something you deemed apparent that I don't see. My guess is that it's the fact that only the operator for XOR has an established precedence (`^`), which again points back to my response about this being based on the C family.
If you're doing enough boolean algebra to need a shorthand for XOR, you can always define it yourself. Needing to do arithmetic on individual boolean values in software is rather uncommon, so it's not much of a burden to make users provide those operations if they really need them. The big reason `&` and `|` don't exist for `Bool` is to avoid subtle performance and correctness traps in common code (unlike `&&` and `||`, they wouldn't short-circuit, so both operands would always be evaluated), and having `^` without those operators would be weird.
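Defining it yourself really is a one-liner. A sketch, reusing the `^` operator declaration the standard library already provides for integer types:

```swift
// Reuse the existing `^` operator (declared in the standard library for
// integer types) for Bool. Semantically it is just `!=` under another name,
// but it picks up `^`'s AdditionPrecedence and left associativity.
extension Bool {
    static func ^ (lhs: Bool, rhs: Bool) -> Bool {
        lhs != rhs
    }
}

let parity = true ^ false ^ true   // chains left-to-right
```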
PS. Using multiplication for AND and addition for OR isn't an artifact of low level languages treating comparison results as integers. It's one of the standard notations for boolean algebra.
Sure, I'm not saying there are strictly practical issues; it's more about it not feeling modern. Just as I love that we have a dedicated `Bool` type instead of using `Int` or something, even though we could absolutely get by without it, I would like that carried all the way.
Again, I really don't understand this kind of statement. Clearly it's an artefact; it's not used in mathematics. In computing, presumably multiplication existed first, then people realised it could be used for AND if booleans are integers. Since it works and saves us from introducing a dedicated Bool type, it became standard notation. Not the other way around.
I'm not saying this isn't clever, useful, safe and widely recognized among veteran programmers, I'm just saying it doesn't feel very modern or Swifty.
No. The notation dates back to Boole, who invented Boolean Algebra. He borrowed mathematical notation, but it still predates computers by about 100 years.
I studied electrical engineering and designed a lot of digital logic circuits, and writing a truth table is not a big burden if there's anything you feel you need to double-check.
In this case, though, you really don't need one. XOR fundamentally means "not equal". IMO, using `!=` is actually a better spelling that makes more intuitive sense to more people.
It occurred to me that the following might be closer to OP's desire:
```swift
if Bool(Int(switch1) ^ Int(switch2) ^ Int(switch3) ^ Int(switch4)) {
    isOn = true
} else {
    isOn = false
}
```
But it requires the following extensions:
```swift
extension Int {
    init(_ value: Bool) {
        if value == false {
            self = 0
        } else {
            self = 1
        }
    }
}

extension Bool {
    init(_ value: Int) {
        if value == 0 {
            self = false
        } else {
            self = true
        }
    }
}
```
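As an aside, the same check can be written without the `Int` round-trip by folding `!=` over the switches (the `switch1`...`switch4` names below are just placeholders standing in for the values from the example above). XOR-ing a list is a parity check: the result is true iff an odd number of switches are on.

```swift
// Hypothetical switch states, standing in for the example's toggle buttons.
let switch1 = true, switch2 = false, switch3 = true, switch4 = true

// Fold "not equal" (i.e. XOR) across the list: true iff an odd number are on.
let isOn = [switch1, switch2, switch3, switch4].reduce(false) { $0 != $1 }
```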