let n = 0xdeadface
print("n =", n.formatted(.number.how-to-do-upper-case-hex)) // I want: n = 0xDEADFACE
// same as this:
print("n =", String(format:"0x%lX", n))
I'm just not able to find the right method.
This should do it:
"0x\(String(n, radix: 16, uppercase: true))"
I want to use the newly introduced FormatStyle, something like this but with the output in hex:
SwiftUI.Text(12345, format: .number.how_to_be_in_hex?)
It should be somewhere in IntFormatStyle. I searched for hex and radix but cannot find what I want. Maybe it's not supported, since it seems NumberFormatter cannot do it, and IntFormatStyle is probably based on it internally.
Edit: so if there isn't a hex FormatStyle already, maybe something like this:
struct HexStyle<Subject: BinaryInteger>: FormatStyle {
    let prefix: String
    let uppercase: Bool

    func format(_ value: Subject) -> String {
        "\(prefix)\(String(value, radix: 16, uppercase: uppercase))"
    }
}
let number = 0xface
number.formatted(HexStyle(prefix: "0x", uppercase: true)) // 0xFACE
// how to do this? this doesn't compile:
extension FormatStyle where Self == HexStyle<Subject: BinaryInteger> {
    static func hex<Subject>(prefix: String = "0x", uppercase: Bool = true) -> HexStyle<Subject> {
        .init(prefix: prefix, uppercase: uppercase)
    }
}
// so can do this:
number.formatted(.hex())
Can it work through IntFormatStyle, i.e. number.formatted(.number.hex())?
I see. Looking at the Foundation API, they seem to have a dedicated extension for every concrete integer type. The Foundation module interface looks like this (excerpt):
@available(macOS 12.0, iOS 15.0, tvOS 15.0, watchOS 8.0, *)
extension FormatStyle where Self == IntFormatStyle<Int> {
    public static var number: IntFormatStyle<Int> { get }
}

@available(macOS 12.0, iOS 15.0, tvOS 15.0, watchOS 8.0, *)
extension FormatStyle where Self == IntFormatStyle<Int16> {
    public static var number: IntFormatStyle<Int16> { get }
}

@available(macOS 12.0, iOS 15.0, tvOS 15.0, watchOS 8.0, *)
extension FormatStyle where Self == IntFormatStyle<Int32> {
    public static var number: IntFormatStyle<Int32> { get }
}

@available(macOS 12.0, iOS 15.0, tvOS 15.0, watchOS 8.0, *)
extension FormatStyle where Self == IntFormatStyle<Int64> {
    public static var number: IntFormatStyle<Int64> { get }
}

// and so on
So it seems this is the way to do it currently. The missing Swift feature here, parameterized extensions, is described in the Generics Manifesto.
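For example, following that same per-concrete-type pattern, here is a sketch that exposes the HexStyle defined above for Int (illustrative only, not Foundation API; every other integer type would need its own extension):

import Foundation

extension FormatStyle where Self == HexStyle<Int> {
    static func hex(prefix: String = "0x", uppercase: Bool = true) -> HexStyle<Int> {
        .init(prefix: prefix, uppercase: uppercase)
    }
}

// Leading-dot syntax now resolves through the concrete extension above
// (SE-0299, static member lookup in generic contexts).
let number = 0xface
number.formatted(.hex())  // "0xFACE"

Assuming the same extension, SwiftUI.Text(number, format: .hex()) should also work, because Text(_:format:) accepts any FormatStyle whose FormatOutput is String.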
Just a quick note about the big picture here: The new number formatting facilities are conceptually a wrapper around the existing NumberFormatter class. NumberFormatter is focused on localised strings, which is why NumberFormatter.Style has no .hexadecimal value. In contrast, String.init(_:radix:uppercase:) produces a non-localised string. To my mind, trying to mix these two seems like a type error.
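For example (a quick sketch; the grouping separators depend on the current locale):

import Foundation

let n = 0xdeadface
print(n.formatted(.number))                   // localised decimal, e.g. "3,735,943,886"
print(String(n, radix: 16, uppercase: true))  // non-localised hex: "DEADFACE"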
Share and Enjoy
Quinn “The Eskimo!” @ DTS @ Apple
Thanks for the awesome information.