I don't mean to be condescending, but we once again circled back to the beginning without addressing what I asked... There is a theoretical confusion, but will it actually happen in code? Is there someone who really relies on `*` for things to work, in code?
To me, this whole discussion sounds like saying that computers are a bad invention because they will stop working if you explode them. Surely bad, but who the hell does that? That's what I think we should find out for `*` before arguing about whether or not we should change the syntax. In the case of `*`: who is using `*` in a way that would make the functionality confusing?
I'm not against coming up with a different solution; I'm against doing so with no concrete proof beyond what we personally think. What I intend to establish here is that ultimately we should have no expectations for `*` in terms of functionality, because (I think) no one will ever rely on it. Thus, it would be present simply as a way to make it clear that other platforms are also being considered in the statement, just like for the original.
I.e., if I'm correct about my claims, then we are overthinking an aspect of this feature that no one will ever rely on. We made up this problem; it's not legitimate.