Using the bare name of a generic type within itself

I would be very happy if someone put together a proposal for this. This is on my short list of breaking language changes I'd still like to see at some point:

  • Remove bare generic type behavior (this)
  • Remove implicit 'nil' initial value for Optional-typed variables
  • Remove AnyObject dynamic dispatch
  • Remove implicit conformance to Hashable for enums with no payload cases
  • Remove label-shuffling tuple conversions
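For readers landing mid-thread, a minimal sketch of the behavior in question, using a hypothetical `Stack` type (not from the thread): inside the braces of a generic type, its bare name stands in for the current specialization.

```swift
struct Stack<Element> {
    var items: [Element] = []

    // Under the bare-name rule, 'Stack' here means 'Stack<Element>',
    // not the unspecialized generic type.
    func pushing(_ item: Element) -> Stack {
        var copy = self
        copy.items.append(item)
        return copy
    }
}

let s = Stack(items: [1, 2]).pushing(3)
print(s.items) // [1, 2, 3]
```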
11 Likes

That’s a nice list. It would be cool if these could happen eventually.

2 Likes

I may just be having an off day, but I'm not seeing the fundamental value of having any special inference behavior within generic types.

@jrose's motivating example (inability to name a parameter type when shadowed by a same-named inner generic parameter) seems like an uncommon case and one that's easily addressed with a typealias.
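A sketch of that shadowing situation and the typealias workaround; `Box`, `Stored`, and `replacing` are illustrative names, not taken from the thread.

```swift
struct Box<T> {
    // Workaround: alias the struct's parameter under a second name.
    typealias Stored = T

    var value: T

    // This method's 'T' shadows the struct's 'T'.
    func replacing<T>(with newValue: T) -> Box<T> {
        // 'Stored' still names the struct's parameter, which a plain
        // 'T' can no longer reach from inside this method.
        let previous: Stored = value
        _ = previous
        return Box<T>(value: newValue)
    }
}
```

With the alias in place, both parameters stay nameable and no special inference rule is required.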

The motivation mentioned by @AlexanderM (the repetitiousness of having to fully-qualify generic return types) doesn't seem so terrible, either. Why should the notation for the return of a Self-typed generic be any different from the return of a generic type anywhere else in Swift? If you really can't stand to repeatedly name the generic parameter types, just create a typealias.
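A sketch of that typealias workaround for the repetition complaint, assuming a hypothetical two-parameter `Pair` type: one alias recovers the brevity without any language-level special case.

```swift
struct Pair<First, Second> {
    var first: First
    var second: Second

    // Locally name the current specialization once…
    typealias Me = Pair<First, Second>

    // …then every member can use the short spelling explicitly.
    func replacingFirst(_ newFirst: First) -> Me {
        Me(first: newFirst, second: second)
    }
}

let p = Pair(first: 1, second: "a").replacingFirst(2)
```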

Rather than piling backup behaviors or special cases on top of the existing special case, it seems like the proper solution would be to eliminate the original exception.

Well, Jordan is most likely right about the type checker with respect to expressions, so perhaps the behavior should be removed entirely there.

However, I think that declarations within a generic type (e.g. properties, methods, nested types, and so forth) should still be able to use the shorthand when specifying their own types (including the types of their arguments, return values, and generic parameters). It is convenient, reads well, and I don’t see any active harm in keeping it.

By contrast, the initializer behavior I described earlier does strike me as harmful, so I’d be happy to see it removed.
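The "described earlier" post isn't quoted here, but a guess at the kind of initializer pitfall meant, using a hypothetical `Wrapper` type: inside the generic type, a bare-name initializer call is pinned to the current type arguments instead of inferring fresh ones.

```swift
struct Wrapper<T> {
    var value: T

    // Under the bare-name rule, 'Wrapper(value: other)' in the body
    // below would mean 'Wrapper<T>(value: other)' and fail to compile,
    // even though 'Wrapper<U>' is exactly what the return type needs:
    //
    //     return Wrapper(value: other) // error: 'U' is not 'T'
    //
    // Spelling the type arguments explicitly sidesteps the pitfall.
    func rewrap<U>(_ other: U) -> Wrapper<U> {
        return Wrapper<U>(value: other)
    }
}
```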

It was originally added because C++ has it and it's convenient for the purposes Nevin mentions, though of course C++ didn't support inference of type arguments at the time the behavior was brought over. I don't think it was ever intended as the only way to spell something. @Douglas_Gregor might remember if there was more to it than that.

1 Like

That’s pretty much it. It seemed like a nice convenience when I cargo-culted it over years ago (we didn’t have the general Self at the time). I’d love to phase this out of the language.

7 Likes