> Whenever I see code that asks what the native byte order is, it's almost certain the code is either wrong or misguided. [...]
>
> The byte order of the computer doesn't matter much at all except to compiler writers and the like, who fuss over allocation of bytes of memory mapped to register pieces. Chances are you're not a compiler writer, so the computer's byte order shouldn't matter to you one bit.
>
> Notice the phrase "computer's byte order". What does matter is the byte order of a peripheral or encoded data stream, but--and this is the key point--the byte order of the computer doing the processing is irrelevant to the processing of the data itself. If the data stream encodes values with byte order B, then the algorithm to decode the value on computer with byte order C should be about B, not about the relationship between B and C.

Rob Pike, *The byte order fallacy*
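Pike's point can be sketched in a few lines of Swift (the byte values here are illustrative): the decoder below speaks only about the stream's byte order, B, by assembling the value from individual bytes, so the host's byte order never appears and the code behaves identically on every machine.

```swift
// Decode a big-endian (B) 32-bit value from a byte stream by shifting
// individual bytes into place. No reference to the host's byte order (C).
let data: [UInt8] = [0x00, 0x00, 0x01, 0x02]  // big-endian encoding of 258

let value = UInt32(data[0]) << 24
          | UInt32(data[1]) << 16
          | UInt32(data[2]) << 8
          | UInt32(data[3])
// value == 258 on big-endian and little-endian hosts alike
```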
The current APIs on Swift's fixed-width integer types already keep you focused on the endianness of the data. If your data contains a big-endian `Int32`, `Int32(bigEndian: value)` gives you the correct numeric value on every machine. Similarly, if you need to produce a big-endian `Int32`, you use `value.bigEndian`.
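A minimal sketch of both directions using those APIs (the byte values are illustrative; the `withUnsafeBytes` loads are just one way to move between raw bytes and an `Int32`):

```swift
// Suppose a data stream contains a big-endian Int32.
let bytes: [UInt8] = [0x00, 0x00, 0x01, 0x02]

// Load the raw bits, then reinterpret them as a big-endian value.
let raw = bytes.withUnsafeBytes { $0.load(as: Int32.self) }
let decoded = Int32(bigEndian: raw)  // correct numeric value on every machine

// Going the other way: produce the big-endian byte representation.
let encoded = withUnsafeBytes(of: decoded.bigEndian) { Array($0) }
// encoded == [0, 0, 1, 2]
```

On a big-endian host both conversions are no-ops; on a little-endian host they byte-swap. Either way, the code only ever names the byte order of the data.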
Other than querying the machine's native endianness, I'm not sure what additional value a `ByteOrder` type would offer over the existing APIs.