But it's in the context of a BinaryInteger, which has its own semantics: I can write generic algorithms on BinaryInteger and have guarantees about what a default BinaryInteger is. DefaultInit, on the other hand, provides no information about the type.
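For example, here is a minimal sketch of such an algorithm (the name total is just for illustration):

func total<S: Sequence>(_ values: S) -> S.Element where S.Element: BinaryInteger {
    // BinaryInteger guarantees integer-literal conversion and that 0 is
    // the additive identity, so seeding the fold with 0 is always correct.
    return values.reduce(0, +)
}

total([1, 2, 3]) // 6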
So if we created a DefaultInit protocol like so:
protocol DefaultInit {
    init()
}
then we write a generic function using it:
extension Sequence where Element: DefaultInit {
    func reduce(_ f: (Element, Element) -> Element) -> Element {
        // Seeds the fold with whatever Element.init() happens to produce.
        return reduce(Element.init(), f)
    }
}
Now we have to conform Int to DefaultInit to use it:
extension Int : DefaultInit {} // (1) uses Int's existing init(), which produces 0
// or you could also conform differently
extension Int : DefaultInit { // (2)
    init() {
        self = 1 // the protocol witness now produces 1 instead of 0
    }
}
Finally, let's use it:
[1, 2, 3].reduce(+) // 6 (1)
[1, 2, 3].reduce(+) // 7 (2)
[1, 2, 3].reduce(*) // 0 (1)
[1, 2, 3].reduce(*) // 6 (2)
As you can see, the main problem is writing a generic algorithm that behaves as expected, because DefaultInit provides no semantic guarantees about the value init() produces: conformance (1) silently yields a wrong product (the fold is seeded with 0), while conformance (2) yields a wrong sum (seeded with 1).
To get some guarantees about the semantics of an Int's default value with respect to multiplication and addition, one could write the following two protocols:
protocol MultiplicativeIdentifiable {
    /// Creates an instance i such that i * x == x for any instance x of the same type.
    init()
}

protocol AdditiveIdentifiable {
    /// Creates an instance i such that i + x == x for any instance x of the same type.
    init()
}
We would then write algorithms against one of these protocols.
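For instance, here is a minimal sketch of a fold written against AdditiveIdentifiable (the sum name and the passed-in operator are illustrative, not part of any standard API):

extension Sequence where Element: AdditiveIdentifiable {
    /// Safe even for empty sequences: init() is documented to be the
    /// additive identity, so seeding the fold with it cannot skew the result.
    func sum(_ add: (Element, Element) -> Element) -> Element {
        return reduce(Element.init(), add)
    }
}

extension Int : AdditiveIdentifiable {} // Int() == 0, and 0 + x == x holds

[1, 2, 3].sum(+) // 6
[Int]().sum(+)   // 0, the additive identity

A conformance like (2) above would now be a violation of the protocol's documented contract, not just a surprising choice.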