According to the documentation, the `ulp` of `Decimal` is

> The unit in the last place of the decimal.

and the `ulp` of e.g. `Double` is

> The unit in the last place of this value.

Some more examples of `Decimal`'s `ulp`:

| value | ulp |
| --- | --- |
| 0.0001234567 | 0.0000000001 |
| 0.001234567 | 0.000000001 |
| 0.01234567 | 0.00000001 |
| 0.1234567 | 0.0000001 |
| 1.234567 | 0.000001 |
| 12.34567 | 0.00001 |
| 123.4567 | 0.0001 |
| 1234.567 | 0.001 |
| 12345.67 | 0.01 |
| 123456.7 | 0.1 |
| 1234567 | 1 |
| 12345670 | 10 |
| 123456700 | 100 |
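The pattern in the table follows from `Decimal`'s representation: the value is stored as an integer significand scaled by a power of ten, and `ulp` is ten raised to that stored exponent. A minimal sketch (the `string:` initializer preserves the decimal digits exactly):

```
import Foundation

let d = Decimal(string: "123.4567")!
print(d.significand, d.exponent) // 1234567 -4
print(d.ulp)                     // 0.0001
```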

Note that, e.g.:

```
import Foundation

var x = Decimal(0.999)
print(x, x.ulp) // 0.999 0.001
x = x.nextUp
print(x, x.ulp) // 1 1
```

while for e.g. `Double` (due to the difference in data format):

```
var x = Double(0.999)
print(x, x.ulp) // 0.999 1.1102230246251565e-16
x = x.nextUp
print(x, x.ulp) // 0.9990000000000001 1.1102230246251565e-16
```
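In contrast, a normal `Double`'s `ulp` is a power of two determined by its binary exponent: 2 raised to (exponent − 52), since a `Double` has 52 explicit significand bits. A quick check (a sketch, valid only for finite, normal values):

```
let x = 0.999
print(x.exponent) // -1
print(x.ulp == Double(sign: .plus, exponent: x.exponent - 52, significand: 1)) // true
```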

Also note that for `Double`, the printed decimal values above (e.g. 0.999) are not the exact `Double` values: `print` uses the shortest decimal representation that round-trips, i.e. the one with as few decimal digits as possible while still being closer to the actual `Double` value than to any other `Double` value.
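To see the gap between the shortest round-trip form that `print` uses and the underlying binary value, you can format with more digits (the exact trailing digits are illustrative and depend on the platform's `printf`):

```
import Foundation

let x = 0.999
print(x)                          // 0.999 (shortest round-trip form)
print(String(format: "%.20f", x)) // nearest representable value, roughly 0.99899999999999999911
```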

I'm not sure what you mean by

> the number of decimals in a decimal number

but if it is the number of digits in the fractional part, then beware of things like:

```
import Foundation

let x = Decimal(1.234)
print(x) // 1.234
let y = x.nextUp
print(y) // 1.235
let z = Decimal(1.235)
print(z) // 1.2350000000000002048
print(y == z) // false (because the literal `1.235` is a Double in Swift)
let w = Decimal(string: "1.235", locale: Locale(identifier: "en_US"))!
print(w) // 1.235
print(y == w) // true
```
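If the goal really is counting fractional digits, one option is to read the stored exponent. This is only a sketch with a hypothetical helper name, and note that `Decimal` is not necessarily normalized, so a value parsed from, say, "1.2300" may report more fractional digits than its shortest form needs:

```
import Foundation

extension Decimal {
    /// Number of digits after the decimal point in the stored representation.
    var fractionDigits: Int { max(-exponent, 0) }
}

let a = Decimal(string: "1.235")!
print(a.fractionDigits) // 3
let b = Decimal(string: "12345670")!
print(b.fractionDigits) // 0
```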