Pitch: I'd like to propose that we modify the behavior of the `Sequence.max` algorithms to match the free `max(...)` functions in cases where there is more than one "maximum" element.
## Motivation
SR-12113 asks that we codify the behavior of the `min` and `max` methods on sequences into a guarantee in the documentation. Unfortunately, the current behavior of the `max` methods is inconsistent with other places in the standard library that identify the maximum of two or more inputs when there is more than one "maximum" value.

Specifically, the sequence method returns the *first* of several maximum elements in the sequence, while the other APIs (the `max` free functions and the `FloatingPoint.maximum` static methods) return the *last* of the maximum values. In general, this difference in behavior isn't noticeable, but for class instances it is observable through instance identity, and when using the predicate-based methods it can be observed through whatever properties the predicate doesn't compare.
Here's a silly example to demonstrate:
```swift
// Every instance of `Comp` is "equal" but
// distinguishable by its `value` property.
struct Comp: Comparable {
    var value: Int
    init(_ v: Int) {
        self.value = v
    }
    static func == (lhs: Comp, rhs: Comp) -> Bool { true }
    static func < (lhs: Comp, rhs: Comp) -> Bool { false }
}

let a = [Comp(1), Comp(2), Comp(3), Comp(4)]

// The sequence method returns the first element
a.max() // Comp(value: 1)

// The free function returns the _last_ element
max(a[0], a[1], a[2], a[3]) // Comp(value: 4)
```
Which is the correct behavior?
Alexander Stepanov discusses the ordering of `min` and `max` functions in lecture 7 of his notes (see pages 57-63, particularly 61-63, here). In particular, he posits that `min` and `max` should provide the same stability that sorting two elements does: `min(x, y)` should always equal the first element of `sort_2(&x, &y)`, and `max(x, y)` should always equal the second.
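Stepanov's rule can be sketched directly in Swift. (`sort2`, `stepanovMin`, and `stepanovMax` are hypothetical names for illustration, not standard-library APIs.)

```swift
// A stable two-element sort: swap only when the second element is
// strictly less than the first, so equal elements keep their order.
func sort2<T: Comparable>(_ x: inout T, _ y: inout T) {
    if y < x { swap(&x, &y) }
}

// Stepanov's principle: min is the first element of the stable
// two-element sort, and max is the second.
func stepanovMin<T: Comparable>(_ x: T, _ y: T) -> T {
    var (a, b) = (x, y)
    sort2(&a, &b)
    return a
}

func stepanovMax<T: Comparable>(_ x: T, _ y: T) -> T {
    var (a, b) = (x, y)
    sort2(&a, &b)
    return b
}
```

For equal inputs no swap occurs, so `stepanovMin` returns its first argument and `stepanovMax` its last, which matches what Swift's free `min` and `max` functions do today.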
Because the standard library's sort is stable (though not yet promised as such), we can extend this principle into Swift by stating the rule this way, for any collection `a`:

```swift
a.min() == a.sorted().first
a.max() == a.sorted().last // *

// using `a...` to imply a variadic splat:
min(a...) == a.sorted().first
max(a...) == a.sorted().last
```
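These invariants can be checked against today's behavior, as a sketch using the `Comp` type from the example above (redefined here so the snippet is self-contained):

```swift
// Every instance compares "equal", but `value` reveals
// which instance an API actually returned.
struct Comp: Comparable {
    var value: Int
    init(_ v: Int) { self.value = v }
    static func == (lhs: Comp, rhs: Comp) -> Bool { true }
    static func < (lhs: Comp, rhs: Comp) -> Bool { false }
}

let a = [Comp(1), Comp(2), Comp(3), Comp(4)]

// The stable sort leaves the "equal" elements in their original order.
let stableSorted = a.sorted()

// The `min` invariant holds today: both sides are the first element.
assert(a.min()!.value == stableSorted.first!.value)   // both Comp(1)

// The starred `max` invariant fails today:
// a.max() is Comp(1), but stableSorted.last is Comp(4).
assert(a.max()!.value != stableSorted.last!.value)
```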
## Proposed Solution
Swift currently fails the test marked by `// *`. I would like to correct that inconsistency and document the behavior for all the sequence methods.