So all the deprecated initializers say to use this signature, but it's not working for me. Is it my setup?
- [stdlib/public/core/StringProtocol.swift at 171dadc (swiftlang/swift)](https://github.com/swiftlang/swift/blob/171dadc959b7538fc9e401d8fdeffa80e03b310b/stdlib/public/core/StringProtocol.swift)
- [Sources/HTTPTypes/ISOLatin1String.swift at 97c9675 (apple/swift-http-types)](https://github.com/apple/swift-http-types/blob/97c96755ae30dd9a7c594c830a3334986cc0e25a/Sources/HTTPTypes/ISOLatin1String.swift)
So I am bringing some bytes in from C that are supposed to be a Latin-1 string, and I'd like to feed them to the ISOLatin1String type in HTTPTypes, but that type takes a [UInt8], not a [CChar]. Is that partly because the decoder it uses doesn't take [CChar], or is there a different reason?
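For context, the shape of the data is roughly this; the literal values are stand-ins for my real C buffer, and this is the copy-based workaround I keep reaching for:

```swift
// Stand-in for the bytes handed back across the C boundary.
// (CChar is Int8 on Apple platforms; signedness can differ on other targets.)
let fromC: [CChar] = [72, 105, 33] // "Hi!"

// Re-sign each byte so the element type is UInt8, which is what
// ISOLatin1String and the decoding initializers want.
let bytes: [UInt8] = fromC.map { UInt8(bitPattern: $0) }

// ASCII-only example, so UTF-8 and Latin-1 agree here.
let text = String(decoding: bytes, as: UTF8.self) // "Hi!"
```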
I tried to find all the non-deprecated String core initializers I could work with, and this is what I found (copying into a fresh array for now, which is surely not ideal; there's a non-copying sketch after the code block).
(Swift 6.2, macOS 15; I can check next week whether the behavior is the same on macOS 26. This is an embedded, CMake-based build with the Espressif SDK bringing in swiftUnicodeDataTables, but I cross-checked with an Xcode 26 playground. Notes are in the comments.)
```swift
// let response: [Int8]  = [65, 66, 67, 68, 0]
// let response: [CChar] = [65, 66, 67, 68, 0]
let response: [UInt8] = [65, 66, 67, 68, 0]

// Takes all three element types, if they are null-terminated, but is deprecated.
let message1 = String(cString: response)

// Does the decoding-style replacement with "\u{FFFD}".
// Takes everything in VS Code. The Xcode 26 playground tries to make me use the
// deprecated signature with a Unicode cast if I use anything but UInt8, even
// though I can see the non-deprecated one in the interface?
// Requires null termination, which these can have, but buffers end where buffers
// end, and it's an extra step...
let message2 = String(decodingCString: response, as: UTF8.self)

// From HTTPTypes. Only takes [UInt8].
// let message3 = ISOLatin1String(response)

// Only takes [UInt8].
let message4 = String(decoding: response, as: UTF8.self)

// Takes all three, but it's failable (validating). I would prefer decoding.
let message5 = String(validating: response, as: UTF8.self)
```
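And here is the non-copying variant I alluded to above, viewing the CChar buffer's raw bytes as UInt8 instead of allocating a second array (just a sketch; I haven't measured it, and `signed` is a stand-in buffer):

```swift
let signed: [CChar] = [65, 66, 67, 68]
let viewed = signed.withUnsafeBufferPointer { buffer -> String in
    // UnsafeRawBufferPointer is a Collection of UInt8, so it satisfies
    // String(decoding:as:)'s element constraint directly.
    String(decoding: UnsafeRawBufferPointer(buffer), as: UTF8.self)
}
```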
It's not that hard to work around (I'll just use a [UInt8] buffer), but it seems weird, because while the declaration for `decoding` looks different from the one for `validating`, it's generic, and it looks like it should work? Is it my setup?
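To make that concrete, this is the minimal check I'm going by, plus my guess from SE-0405 about why `validating` behaves differently (the guess is mine, so please correct it if I'm misreading the proposal):

```swift
let signedBytes: [CChar] = [65, 66, 67, 68]
let unsignedBytes: [UInt8] = [65, 66, 67, 68]

// Compiles: the collection's Element matches UTF8.CodeUnit (UInt8).
let ok = String(decoding: unsignedBytes, as: UTF8.self)

// Does not compile for me: [CChar] is [Int8] here, and the decoding
// initializer constrains Element to Encoding.CodeUnit.
// let nope = String(decoding: signedBytes, as: UTF8.self)

// Compiles: as I read SE-0405, there is a second validating overload over
// Sequence<Int8> for encodings whose code unit is UInt8, and I assume that's
// what kicks in here. Is the answer simply that decoding never got the same
// overload, or is something off in my setup?
let maybe = String(validating: signedBytes, as: UTF8.self)
```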
(I have not yet tried any of these with InlineArray or Span; since some of them seem to be Collection-based, I'm assuming that doesn't work?)
I've read a bunch of the past threads and the documentation to see if there was something I could do to get decoding to take a [CChar], but apparently not? Thanks for all the hard work on this topic.
- [Pitch] String Input-validating Initializers
- https://forums.swift.org/t/init-string-from-a-non-null-terminated-c-char-array/
- https://forums.swift.org/t/idea-bytes-literal/
- SE-0405: String Initializers with Encoding Validation
- [Accepted with modifications] SE-0405: String Initializers with Encoding Validation
- String.Encoding | Apple Developer Documentation
- init(decoding:) | Apple Developer Documentation
ETA: I forgot two:
```swift
// Only takes [UInt8].
let message6 = String(bytes: response, encoding: .utf8)

// Requires casting?? But at least it's not deprecated.
let message7 = String.decodeCString(response as! [UTF8.CodeUnit], as: UTF8.self, repairingInvalidCodeUnits: true)
```
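For completeness, this is how I'm reading decodeCString's result when I hand it an unsigned, null-terminated buffer directly, with no cast (the literal bytes are just a stand-in):

```swift
let terminated: [UInt8] = [65, 66, 67, 68, 0]

// decodeCString stops at the first NUL and reports whether any U+FFFD repairs
// were made; it returns nil only if the pointer is nil, or the input is
// ill-formed and repairing is turned off.
if let (text, repairsMade) = String.decodeCString(terminated, as: UTF8.self,
                                                  repairingInvalidCodeUnits: true) {
    print(text, repairsMade) // ABCD false
}
```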