"Computing the number of digits of an integer even faster": https://twitter.com/lemire/status/1400533024488443906
His implementation in C:
```c
int int_log2(uint32_t x) { return 31 - __builtin_clz(x | 1); }

int fast_digit_count(uint32_t x) {
    static uint64_t table[] = {
        4294967296,  8589934582,  8589934582,  8589934582,  12884901788,
        12884901788, 12884901788, 17179868184, 17179868184, 17179868184,
        21474826480, 21474826480, 21474826480, 21474826480, 25769703776,
        25769703776, 25769703776, 30063771072, 30063771072, 30063771072,
        34349738368, 34349738368, 34349738368, 34349738368, 38554705664,
        38554705664, 38554705664, 41949672960, 41949672960, 41949672960,
        42949672960, 42949672960};
    return (x + table[int_log2(x)]) >> 32;
}
```
I translated it into Swift:
```swift
extension UInt32 {
    func log2() -> Int {
        bitWidth - 1 - (self | 1).leadingZeroBitCount
    }

    var digits: Int {
        // The lookup table only covers UInt32; a 64-bit version would need a different table.
        let table: [UInt64] = [
            4294967296, 8589934582, 8589934582, 8589934582, 12884901788,
            12884901788, 12884901788, 17179868184, 17179868184, 17179868184,
            21474826480, 21474826480, 21474826480, 21474826480, 25769703776,
            25769703776, 25769703776, 30063771072, 30063771072, 30063771072,
            34349738368, 34349738368, 34349738368, 34349738368, 38554705664,
            38554705664, 38554705664, 41949672960, 41949672960, 41949672960,
            42949672960, 42949672960
        ]
        return Int((UInt64(self) + table[log2()]) >> 32)
    }
}
```
I want to know: did I do it right?
In Swift, what can be done to maximize performance (such as adding `@inline(__always)` or `@inlinable`)?
BTW, the `(self | 1)` in `log2()` is there to avoid an undefined result for zero.
Edit: I just realized his code only works for 32-bit integers; anyway, please ignore that limitation for my question.