I agree to some extent, but AFAICS, the goal of a protocol's semantic requirements is to prevent weird situations. And Sequence
is a protocol that makes it exceptionally easy to introduce weird situations accidentally, without breaking any of its semantic requirements.
As @dabrahams writes in the old thread:
> I should be clear up-front about the main problem I'm trying to solve:
>
> Today, people see a beautiful, simple protocol (Sequence) to which many things conform. They don't recognize that there's a semantic restriction (assume you can make only a single-pass!) on it, so they write libraries of functions that may make multiple passes over Sequences. They test their libraries with the most commonly-available Sequences, e.g. Arrays and Ranges, which happen to be multi-pass. Their tests pass! But their constraints are wrong, their whole model of how to write generic code over sequences is wrong, and some of their code is wrong.
IMO this is a problematic programming model.
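To make the failure mode concrete, here is a minimal sketch (the `SinglePass` and `minAndMax` names are my own invention, not from the thread): a class-based type that legitimately conforms to `Sequence`, paired with a generic function that compiles fine but silently assumes multi-pass behavior.

```swift
// Hypothetical single-pass sequence: a class whose iterator consumes
// the underlying storage, so a second traversal sees nothing. This
// conforms to Sequence without violating any documented requirement.
final class SinglePass<Element>: Sequence, IteratorProtocol {
    private var elements: [Element]
    init(_ elements: [Element]) { self.elements = elements }
    func next() -> Element? {
        elements.isEmpty ? nil : elements.removeFirst()
    }
}

// A generic function whose constraints are "wrong" in the sense above:
// the Sequence constraint permits single-pass conformers, but the body
// traverses its argument twice.
func minAndMax<S: Sequence>(_ s: S) -> (min: S.Element?, max: S.Element?)
where S.Element: Comparable {
    (s.min(), s.max())  // two separate passes over `s`
}

let numbers = [3, 1, 2]
// Array is multi-pass, so both passes see every element and the
// function behaves as its author expected.
let fromArray = minAndMax(numbers)
// SinglePass is consumed entirely by the first pass (`min()`), so
// `max()` iterates an exhausted sequence and comes back nil.
let fromSinglePass = minAndMax(SinglePass(numbers))
```

Tested only against `Array`, `minAndMax` looks correct; the bug surfaces only when someone hands it a genuinely single-pass conformer.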