I can see your point. You are using an array initializer:
Array(...) == Array(...)
and, depending upon the initializer, either getting a consistent "true":
Array(arrayLiteral: 1, 2, 3) == Array(arrayLiteral: 1, 2, 3)
or an inconsistent, random result:
Array(Set([1, 2, 3])) == Array(Set([1, 2, 3]))
It's easy to foresee bugs when someone refactors the code:
let s = Set([1, 2, 3])
Array(s) == Array(s)
or goes from the latter form to the former, potentially changing the behaviour along the way without realising it, or without realising that the order is non-deterministic in the first place.
What you can easily do is define this extension:
extension Array where Element: Hashable {
    init(nondeterministicOrder elements: Set<Element>) {
        self.init(elements)
    }
}
and use it consistently throughout your code base, agreeing with your team members (if any) to use it instead of the original initializer. It won't be possible to mark the original initializer as deprecated (without modifying the Swift interface files, or perhaps via a custom SwiftLint rule?), so some discipline would be required.
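With that extension in place, a call site might look like this (the variable names are just for illustration):

```swift
let ids = Set(["a", "b", "c"])

// The argument label documents, right at the call site,
// that the resulting order is unspecified.
let list = Array(nondeterministicOrder: ids)
```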
Another option would be to have a two-in-one initializer of the form:
init(_ elements: Set<Element>, sort: (Element, Element) -> Bool)
with a non-optional "sort" parameter and an explicit opt-in for the current non-deterministic behaviour:
Array(Set([1, 2, 3]), sort: { $0 < $1 })          // with sorting
Array(Set([1, 2, 3]), sort: .nonDeterministic)    // without sorting
Array(Set([1, 2, 3]))  // ideally a warning or error here: obsolete, set the order explicitly
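As written, a parameter of closure type (Element, Element) -> Bool can't also accept a member like .nonDeterministic, so the second spelling would need the "sort" parameter to be a small wrapper type instead. A minimal sketch of one way to do that (the SetOrdering name and its members are hypothetical, not an existing API):

```swift
// Wraps either a comparator or an explicit opt-in to unspecified order.
struct SetOrdering<Element> {
    let areInIncreasingOrder: ((Element, Element) -> Bool)?

    // Explicit opt-in to the current, order-undefined behaviour.
    static var nonDeterministic: SetOrdering {
        SetOrdering(areInIncreasingOrder: nil)
    }

    // Wraps a comparator; the closure can no longer be passed directly.
    static func by(_ compare: @escaping (Element, Element) -> Bool) -> SetOrdering {
        SetOrdering(areInIncreasingOrder: compare)
    }
}

extension Array where Element: Hashable {
    init(_ elements: Set<Element>, sort: SetOrdering<Element>) {
        if let compare = sort.areInIncreasingOrder {
            self = elements.sorted(by: compare)
        } else {
            self.init(elements)  // hash-order, non-deterministic across runs
        }
    }
}

let sorted = Array(Set([3, 1, 2]), sort: .by(<))           // always [1, 2, 3]
let unsorted = Array(Set([3, 1, 2]), sort: .nonDeterministic)
```

The cost of the wrapper is that the sorted call reads sort: .by(<) rather than a bare closure, but both behaviours now share one initializer and the non-deterministic one is spelled out explicitly.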