Note that you only need to do the #if dance if the library supports compiler versions that cannot parse the primary associated type syntax. So if a library compatible with the Swift 5.6 compiler adopts primary associated types and later decides to add another primary associated type, it does not need to add another #if condition for the second one.
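To illustrate, the #if dance for a hypothetical library protocol (the Producer name and its requirement are invented here) might look like this, duplicating the declaration for pre-5.7 compilers:

```swift
#if compiler(>=5.7)
// Compilers that can parse primary associated types see this branch.
public protocol Producer<Output> {
    associatedtype Output
    func produce() -> Output
}
#else
// Older compilers see the same protocol without the angle-bracket clause.
public protocol Producer {
    associatedtype Output
    func produce() -> Output
}
#endif

// A trivial conformance, shared by both branches.
struct IntProducer: Producer {
    func produce() -> Int { 42 }
}
```

Adding a second primary associated type later would only touch the first branch, which is the point being made above.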
I would really like to see this handled in a way that doesn't require duplicating the entire protocol body. IMO that imposes a pretty high maintenance burden on library authors to keep the different versions in sync. But I'm also not responsible for maintaining such a library so perhaps I am overreacting?
Is there a way to specify just the second of two primary associated types? It seems a bit strange to me to impose a de facto hierarchy on the primary associated types based on order. (Would placeholder types allow this to 'just work' as some AsyncSequence<_, MyError>?)
As a maintainer of a library, I certainly wouldn't want to support such a bifurcation. I don't know whether it's valuable enough to justify immediately dropping older compiler support. Alamofire's ResponseSerializer could take advantage of it, but so few people use anything other than the built-in serializers that I don't know if it's important to support quickly.
(If Apple wants us to jump to newer Swift versions faster, they need to support older macOS versions longer.)
That should just be done with the full where clause syntax. Remember that the primary associated types feature is not intended to replace where clauses entirely; you still need them for more complex requirement specifications.
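For instance, with a hypothetical two-parameter protocol (all names here are invented for illustration), you can constrain only the second primary associated type by falling back to a where clause:

```swift
// A protocol with two primary associated types.
protocol Transformer<Input, Output> {
    associatedtype Input
    associatedtype Output
    func transform(_ value: Input) -> Output
}

struct Stringify: Transformer {
    func transform(_ value: Int) -> String { String(value) }
}

// Only Output is constrained; Input is left fully generic.
func render<T: Transformer>(_ t: T, _ input: T.Input) -> String
    where T.Output == String
{
    t.transform(input)
}
```

The where clause never cared about declaration order, so it sidesteps the positional-hierarchy concern entirely.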
This could be made to work as long as primary associated type constraints were only valid in generic requirement position, but it introduces an ambiguity as soon as we allow primary associated type constraints on any types used as the types of values; the placeholder means "infer this from context", not "leave this unspecified". That is,
let a: Array<_> = [1, 2, 3]
infers the type of a as Array<Int> from the expression; it doesn't erase the element type to give you a hypothetical <T> Array<T> existential. Similarly, you would expect that
let a: any Sequence<_> = [1, 2, 3]
would infer the type of a as any Sequence<Int>, not any Sequence with Element erased.
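A quick sketch of the distinction (assuming a Swift 5.7 toolchain; the constrained existential syntax on the second line requires it):

```swift
// The placeholder is filled in from the initializer expression:
let a: Array<_> = [1, 2, 3]         // inferred as Array<Int>

// A constrained existential states the element type explicitly:
let b: any Sequence<Int> = [1, 2, 3]

// Erasing the element type entirely means writing the plain existential:
let c: any Sequence = [1, 2, 3]
```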
If it is any consolation, primary associated types do not require any runtime support nor do they introduce new ABI, so as long as you can use the new compiler you can still backward deploy code that uses the feature to older platform versions.
I don’t think it’s reasonable to accept minimizing #if complexity as an ongoing factor in language evolution. I’d be interested in knowing whether there are other ways we can address this backward-compatible source library use case, though. In particular, when we’re printing module interfaces, we already have logic to emit #if conditions that allow the interfaces to be parsed by older tools. That logic isn’t perfect, but it might be a foundation for doing the same rewrite on arbitrary source. So we could have a tool that performs a source-to-source translation, emitting whatever redundant declarations are necessary to make the code interpretable by older compilers. Of course, maintainers would then have to actually run that tool when packaging their library for distribution.
That is, for the purpose of supporting source-library maintainers who want to generate versions of their libraries that work with older versions of the compiler: a new version of the compiler would compile the library into source that can be compiled by older compilers.
Of course, maintainers would then have to run that tool in order to publish versions of their library instead of just having clients check out a tag of their repository. And they would also want to test that the output actually worked on older tools, but that's presumably not a new requirement.
The advantage is that, assuming the tool works, you get to just write code to the latest version of the compiler without having to manually maintain redundant declarations or whatever other #if complexity is necessary to support older compilers. The disadvantages are that you need the tool to exist and you need a sort of compilation phase to distribute backward-compatible versions of your library.
It might have to be on the maintainers’ side, though. Depends on why you want to support older tools. If clients are willing to update secondary tools as long as they don’t have to update the compiler and thus potentially their own code, then yeah, a client-side build step is fine. Otherwise it’s just as much of an imposition. I don’t think a general downgrade tool is likely to be small enough that people could just check it into their source trees.
I see. But why not choose a simpler solution that does not require source rewriting and complex packaging solutions?
I would not advise looking for an SPM-only solution. There are other packaging systems in the ecosystem, such as CocoaPods and Carthage, with many years of maturity, which SPM cannot always fully replace. Pitting packaging systems against each other is the last thing we need.
You're seeing the typealias Element inside the Array type, not the associated type in the protocol. Re-declaring a typealias with the same name as a (primary) associated type in the protocol should probably not be legal, because then it introduces an ambiguity in name lookup. The same scenario is not a problem for concrete types, because the generic parameter 'Element' on the right hand side of 'typealias Element = Element' is not semantically a member of Array; if it weren't for that typealias, I could not write 'Array.Element' to access the generic parameter.
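A small sketch of the concrete-type side of this (Pair and its typealias are invented for illustration): without an explicit typealias, a generic parameter is not reachable as a member type from outside.

```swift
struct Pair<First, Second> {
    // 'Pair<Int, String>.First' would be an error: generic parameters
    // are not semantically member types. This typealias is what makes
    // the name reachable from outside, just like Array's 'Element'.
    typealias Element = First
}

// The name now resolves through the typealias:
let x: Pair<Int, String>.Element = 7
```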