Got you. I still wonder, though: are you getting actual practical benefits from using Decimal vs Double in your project? You know that you are not getting "infinite precision" when storing, say, 4/3, right?

Can you provide an example of Decimal -> Double -> Decimal conversion giving a different decimal than the original? This looks like a bug worth looking at.

We decided to support either Double or Decimal with:

typealias DataType = Double // or Decimal

and write code that allows this change. I can write the same code for either type, except with @AppStorage, which I fixed with your implementation of Decimal: RawRepresentable.
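For readers following along, here is one possible shape of such a conformance (an assumption on my part, not necessarily the exact implementation referenced above): backing the Decimal with its string description, which is lossless and satisfies @AppStorage's requirement of RawRepresentable with a String raw value.

```swift
import Foundation

// Sketch: store the Decimal as a string so no precision is lost on the
// way into and out of UserDefaults. String round-trips exactly, unlike
// going through Double.
extension Decimal: RawRepresentable {
    public init?(rawValue: String) {
        guard let value = Decimal(string: rawValue) else { return nil }
        self = value
    }

    public var rawValue: String {
        "\(self)"
    }
}
```

With that in place, `@AppStorage("amount") var amount: Decimal = 0` compiles and persists the full decimal value.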

Take a look at this and this thread and tell me if I should worry.

It's not particularly hard if we use values that Double can't represent exactly. It sounds contrived, but something as simple as 1 as Decimal / 3 would already get you there.
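A minimal sketch of that round trip (using NSDecimalNumber's doubleValue for the Decimal -> Double leg, and assuming default rounding throughout):

```swift
import Foundation

// 1/3 as a Decimal carries up to 38 significant decimal digits;
// a Double holds only ~15-17, so the round trip cannot recover them.
let original = Decimal(1) / Decimal(3)
let asDouble = NSDecimalNumber(decimal: original).doubleValue
let roundTripped = Decimal(asDouble)

print(original)                  // 0.333... to Decimal's full precision
print(roundTripped)              // whatever Decimal reconstructs from the Double
print(original == roundTripped)  // false
```

The equality check fails because the intermediate Double is a dyadic rational that only approximates the 38 threes, and no Double -> Decimal conversion can conjure the lost digits back.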

TBH, that is a very curious requirement. The fact that you are OK with doing Decimal -> Double -> Decimal sounds to me like you don't actually require Decimal precision.

I need to check those examples you mentioned. Naïve thinking suggests that if you pick "the closest Double" when doing the Decimal to Double conversion and then "the closest Decimal" when doing the reverse Double to Decimal conversion, you should get back the original number, but apparently it's not so rosy.

The OP didn't answer why they want that. I would assume it has something to do with, say, an edit field feeding a value into the model: if the model holds a Double, your entered 1.23 could turn into something like 1.230000001 when the view is updated back from the model, unless you do some rounding, which you want to avoid to keep things simple. That's speculation; I don't have first-hand experience with Decimal to know when it's useful.

The main difference (aside from the apparent precision) is that Decimal uses base-10 (decimal) arithmetic, while Double uses base-2 (binary) floating-point arithmetic. 1.23 won't turn into 1.230000001 on its own; such a transformation is usually the result of some operations.

let x = 1.23
let y = x + 1.01
let z = y - 1.01
print(x, y, z, x == z)
// 1.23 2.24 1.2300000000000002 false

That's where Decimal would shine the most.

let x: Decimal = 1.23
let y = x + 1.01
let z = y - 1.01
print(x, y, z, x == z)
// 1.23 2.24 1.23 true
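One caveat worth keeping in mind with examples like the above: Decimal's ExpressibleByFloatLiteral conformance goes through Double, so a literal like 1.23 takes a detour through binary floating point before it becomes a Decimal. Constructing from a string avoids that detour entirely (a defensive habit rather than a claim that the literal form above is wrong):

```swift
import Foundation

// The float literal is converted Double -> Decimal under the hood.
let fromLiteral: Decimal = 1.23

// The string initializer parses the decimal digits directly,
// with no binary intermediate.
let fromString = Decimal(string: "1.23")!

print(fromLiteral, fromString)
```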