Curious what you mean? As far as I can tell, the addition operator is not valid for sets in Swift:
let usefulValue = (["nope"] as Set) + (["nope"] as Set)
// Compiler error: Binary operator '+' cannot be applied to two 'Set<String>' operands.
let hmm = (["nope"] as Set) + "nope"
// Compiler error: Binary operator '+' cannot be applied to operands of type 'Set<String>' and 'String'.
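For completeness, the supported way to merge two sets today is `union` (or `formUnion` on a mutable set); the snippet below just confirms the current behavior:

```swift
let merged = (["nope"] as Set).union(["nope"])
// The two "nope"s collapse into a single member, so this prints 1.
print(merged.count)
```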
Why to have Set Sugar via Compiler-enforced Immutable Set Literals
At any rate, the idea is that an immutable set literal should be invalid if it does not describe a valid set, from a set-theoretical standpoint. On a fundamental level, a set is defined as a collection of distinct things; that is the basis of set theory.
And yet, presently in Swift, you cannot actually directly create a statically-defined set, which would be a necessary precondition to support unordered set-iadic generics.
Consider:
let set = [1,1,2] as Set
What's actually happening here is that the array literal [1,1,2] is created ephemerally and then converted to a Set, i.e. the runtime implicitly calls Set([1,1,2]), silently dropping the duplicate.
One might argue that this makes perfect sense, but consider the following, which behaves exactly the same way, albeit in a much more deceptive manner:
let set: Set<Int> = [1,1,2]
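Both spellings compile, and both silently drop the duplicate, which is easy to verify:

```swift
let viaCast = [1, 1, 2] as Set
let viaAnnotation: Set<Int> = [1, 1, 2]
// Each set ends up with 2 members, not the 3 that were written.
print(viaCast.count, viaAnnotation.count)  // prints "2 2"
```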
I would argue that when you initialize an immutable stored property from a group of literals, the stored group should always have the same number of members as what was assigned to it. If that's impossible, it should be a compiler error, at least to the extent that all the members are literals whose values the compiler already knows at compile time.
Clearly the compiler is capable of checking whether it knows something or not, as evidenced by how opaque types work.
I think an immutable set literal should never have fewer or more members at runtime than it had at compile time, because that would indicate mutability or magic behavior.
We should not let people store invalid literals that could very well be simple typos, in cases where the compiler can very easily validate this input in a meaningful way.
Suppose I had written:
let nuclearLaunchAbortValues: Set<Int> = [42, -42, 812, -812, 12, 12]
It would be very unfortunate if a nuclear war ensued because the Swift compiler could not at least place a warning on this obviously invalid set: an immutable value declared with 6 members but which will only have 5 members at runtime.
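Until the compiler can diagnose this, the closest approximation I can think of is a runtime check. `uniqueSet` below is a hypothetical helper name, and it only traps when the code runs rather than failing at compile time, which is exactly the gap being described:

```swift
// Hypothetical runtime stand-in for the proposed compile-time check:
// trap if the supplied literals contain any duplicates.
func uniqueSet<T: Hashable>(_ elements: T...) -> Set<T> {
    let set = Set(elements)
    precondition(set.count == elements.count,
                 "Duplicate members in what should be a set literal")
    return set
}

let abortValues = uniqueSet(42, -42, 812, -812, 12)   // fine: 5 distinct members
// let typo = uniqueSet(42, -42, 812, -812, 12, 12)   // would trap at runtime
```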
For set-iadic generics, one of the nice benefits would be to have a set of values mapping to the set of generic types, such that we are guaranteed one and only one value per type—but if two ints are supplied, the compiler needs to be smart enough to realize that's invalid.
This is why I'm suggesting set literal sugar:
let setWithThreeStrings = ~["firstLiteral", "secondLiteral", "thirdLiteral"]
let setWithThreeTypes = ~[Int.self, String.self, SwiftUISheetWithDetents.self]
This tells the compiler that this is supposed to be an actual set. (It would not be valid for variables, for obvious reasons.) However, it could be used for set-iadics, and it's also necessary because I don't know of another way to represent a set within a parameter list, which normally behaves like an array.