i think the easiest way to explain why this doesn’t make sense (for BSON) is to point out that BSON is dynamically-typed.
so if you have data that looks like:
let _:(x:Bool, y:Int32, z:Int64) =
(
    true,
    .init(bitPattern: 0xAAAA_AAAA),
    .init(bitPattern: 0xBBBB_BBBB_CCCC_CCCC)
)
it corresponds to BSON that can be thought of (in “swift terms”) as:
let _:[Any] =
[
    //  type        key  value
    Bool.self,  "x", true,
    Int32.self, "y", Int32(bitPattern: 0xAAAA_AAAA),
    Int64.self, "z", Int64(bitPattern: 0xBBBB_BBBB_CCCC_CCCC),
]
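and since every element carries its own type tag and key on the wire, the actual bytes come out something like this (a hand-assembled sketch based on the public BSON spec: each element is a 1-byte type tag, a null-terminated key, then the value):

let _:[UInt8] =
[
    0x1B, 0x00, 0x00, 0x00, // total document length: 27 bytes, little-endian

    0x08, 0x78, 0x00,       // type: boolean (0x08), key: "x\0"
    0x01,                   // value: true

    0x10, 0x79, 0x00,       // type: int32 (0x10), key: "y\0"
    0xAA, 0xAA, 0xAA, 0xAA, // value, little-endian

    0x12, 0x7A, 0x00,       // type: int64 (0x12), key: "z\0"
    0xCC, 0xCC, 0xCC, 0xCC,
    0xBB, 0xBB, 0xBB, 0xBB, // value, little-endian

    0x00,                   // document terminator
]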
if you instead try to encode the Int64 in pieces, such as:
let _:(x:Bool, y:Int32, z:Int32, _:Int32) =
(
    true,
    .init(bitPattern: 0xAAAA_AAAA),
    .init(bitPattern: 0xCCCC_CCCC),
    .init(bitPattern: 0xBBBB_BBBB)
)
you don’t get the same BSON, you get:
let _:[Any] =
[
    //  type        key  value
    Bool.self,  "x", true,
    Int32.self, "y", Int32(bitPattern: 0xAAAA_AAAA),
    Int32.self, "z", Int32(bitPattern: 0xCCCC_CCCC),
    Int32.self, "",  Int32(bitPattern: 0xBBBB_BBBB),
]
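at the byte level that's a genuinely different (and longer) document; same hand-assembled sketch as before:

let _:[UInt8] =
[
    0x1D, 0x00, 0x00, 0x00, // total document length: 29 bytes, little-endian

    0x08, 0x78, 0x00,       // type: boolean (0x08), key: "x\0"
    0x01,                   // value: true

    0x10, 0x79, 0x00,       // type: int32 (0x10), key: "y\0"
    0xAA, 0xAA, 0xAA, 0xAA,

    0x10, 0x7A, 0x00,       // type: int32 (0x10), key: "z\0"
    0xCC, 0xCC, 0xCC, 0xCC,

    0x10, 0x00,             // type: int32 (0x10), key: "\0" (the empty key)
    0xBB, 0xBB, 0xBB, 0xBB,

    0x00,                   // document terminator
]

a decoder walking those elements just sees two unrelated int32 fields; nothing in the format records that they were ever halves of one Int64.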
it’s essentially the same reason why this makes sense:
let x:(Int32, Int32) = (1, 2)
let _:Int64 = .init(littleEndian: unsafeBitcast(x, to: Int64.self))
but this doesn’t:
let x:(Any, Any) = (1 as Int32, 2 as Int32)
let _:Int64 = .init(littleEndian: unsafeBitcast(x, to: Int64.self))
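the second version doesn't just lose meaning, it traps at runtime, because each Any is an opaque existential box rather than four raw bytes; a quick check (assuming a 64-bit platform):

print(MemoryLayout<(Int32, Int32)>.size) // 8
print(MemoryLayout<(Any, Any)>.size)     // 64: each Any is a 32-byte existential container
print(MemoryLayout<Int64>.size)          // 8, so only the first bitcast is even well-formed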
does that help?