I might be out of my depth here, but I think conforming Decimal to ExpressibleByFloatLiteral perpetuates a poor design choice in the original type: the literal is parsed as a Double first, so precision is lost before Decimal ever sees the value. That allows things like this to happen:
import Foundation // Decimal lives in Foundation

let foo: Decimal = 3.133
print(foo) // 3.132999999999999488, because the literal round-tripped through Double
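For contrast, initializing from a string (the usual workaround) keeps the full value, which shows the loss happens in the Double step rather than in Decimal itself. A minimal sketch:

import Foundation

// Parsing the same digits as a string avoids the Double round-trip,
// so the stored value is exactly 3.133.
let viaString = Decimal(string: "3.133")!
print(viaString) // 3.133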
Instead, we should either bar Decimal from being expressible by a float literal entirely, or have Swift implement a native ExpressibleByDecimalLiteral (see the sketch below).
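To be concrete about what I mean, here is a rough shape such a protocol might take, modeled on the existing ExpressibleByFloatLiteral. This is purely illustrative: the names are hypothetical, and for it to help at all the compiler would need to hand over the literal's exact digits losslessly (e.g. as a string) rather than a Double.

// Hypothetical protocol, not part of the standard library today.
// The key difference from ExpressibleByFloatLiteral is that the
// compiler would pass the literal's exact digits, not a Double.
protocol ExpressibleByDecimalLiteral {
    associatedtype DecimalLiteralType
    init(decimalLiteral value: Self.DecimalLiteralType)
}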
This has been extensively discussed in other threads, like: