[Pitch] Improved Compiler Support for Large Homogenous Tuples

If there are nested tuples, would "large enough N" be computed per dimension, or from the total number of elements after expansion? If it's only per-dimension, it would always be possible to nest tuples whose individual dimensions each stay just under that limit and still end up with poor compile-time and code-size characteristics.
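For instance, a minimal sketch (the 4-element-per-dimension shape is purely illustrative, not anything from the pitch): each dimension stays small, but the flattened element count multiplies out with every level of nesting.

```swift
// Illustrative only: 4 elements per dimension is an arbitrary, small choice.
typealias Row  = (Double, Double, Double, Double)   // 4 elements in this dimension
typealias Grid = (Row, Row, Row, Row)               // 4 × 4 = 16 Doubles total
typealias Cube = (Grid, Grid, Grid, Grid)           // 4 × 4 × 4 = 64 Doubles total

// If the "large enough N" check only looks at a single dimension, none of
// these would qualify, yet the flattened storage keeps growing geometrically.
print(MemoryLayout<Cube>.size / MemoryLayout<Double>.stride)  // 64
```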

I can imagine multi-dimensional homogeneous tuples (or even large single-dimensional ones) being used in high-performance mathematical and scientific computing to avoid the heap allocations associated with arrays. If the generated code for those is huge in non-optimized builds, and expanding the type representation causes compiler performance problems, that will be a big obstacle to adoption (and a very confusing class of issues to debug and work around).
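As a concrete sketch of the kind of usage I have in mind (the type name and the pointer-based subscript are my own illustration, one common workaround today rather than anything the pitch proposes): a small matrix stored inline as a homogeneous tuple so no `Array` allocation is needed.

```swift
// Hypothetical example: a fixed 3×3 matrix stored inline as a homogeneous
// tuple, avoiding the heap allocation an Array would require.
struct Matrix3x3 {
    var storage: (Double, Double, Double,
                  Double, Double, Double,
                  Double, Double, Double) = (0, 0, 0, 0, 0, 0, 0, 0, 0)

    subscript(row: Int, col: Int) -> Double {
        get {
            // Reinterpret the tuple's contiguous storage to index into it.
            withUnsafeBytes(of: storage) {
                $0.load(fromByteOffset: (row * 3 + col) * MemoryLayout<Double>.stride,
                        as: Double.self)
            }
        }
        set {
            withUnsafeMutableBytes(of: &storage) {
                $0.storeBytes(of: newValue,
                              toByteOffset: (row * 3 + col) * MemoryLayout<Double>.stride,
                              as: Double.self)
            }
        }
    }
}

var m = Matrix3x3()
m[1, 2] = 42
print(m[1, 2])  // 42.0
```

In a non-optimized build, every one of those tuple elements is handled individually, which is exactly where the code-size and compile-time concerns come from.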

So it's hard to imagine a scenario where some ABI breaks aren't on the table to fix this. If the current ABI really does have to stay unchanged, I think introducing this homogeneous tuple syntax would do more harm than good, and fixed-length arrays should be introduced as a new concept instead. But could imported C arrays then be migrated over to that new type without raising similar compatibility concerns?
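For context on that last question, a fixed-size C array field such as `uint8_t payload[16]` is imported into Swift today as a homogeneous tuple. The struct below is hand-written to show roughly the shape the importer produces, so existing code already depends on this spelling and layout:

```swift
// What a C struct containing `uint8_t payload[16]` looks like once imported:
// the array becomes a 16-element homogeneous tuple of UInt8.
struct ImportedPacket {
    var payload: (UInt8, UInt8, UInt8, UInt8, UInt8, UInt8, UInt8, UInt8,
                  UInt8, UInt8, UInt8, UInt8, UInt8, UInt8, UInt8, UInt8)
}

print(MemoryLayout<ImportedPacket>.size)  // 16
```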