Hi,
Swift currently lacks a built-in secure memory abstraction similar to C#'s SecureString for handling sensitive data like passwords, API keys, and cryptographic secrets. This forces developers to either:
- Use Swift's `String` type, which is insecure due to copy-on-write optimization and potential string interning, leaving copies of sensitive data scattered in memory
- Manually manage unsafe buffers, which is error-prone and requires a deep understanding of memory management
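The copy problem is easy to reproduce. This minimal sketch (the secret `"hunter2"` is just a placeholder) shows that converting sensitive bytes into a `String` creates a copy the developer can never zero: clearing the source array leaves the `String`'s internal storage untouched.

```swift
var source: [UInt8] = Array("hunter2".utf8)             // e.g. bytes read from getpass
let password = String(decoding: source, as: UTF8.self)  // the String copies the bytes

// Diligently zero the source buffer after use...
for i in source.indices { source[i] = 0 }

// ...yet the String still holds a live copy of the secret
// that cannot be wiped in place.
print(password)  // prints "hunter2"
```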
I'm not an experienced Swift developer and perhaps I am missing something obvious, but I reviewed the Swift documentation, particularly the section on memory safety, and could not find an answer to my question.
Demonstration of the Problem
The following code demonstrates the security vulnerability:
```swift
import Foundation

print("Enter password: ", terminator: "")
if let cString = getpass("") {
    let password = String(cString: cString)
    print("\nPassword length: \(password.count)")
    // Zero out the buffer after use
    memset(UnsafeMutableRawPointer(mutating: cString), 0, strlen(cString))
    print("Press Enter to exit...")
    _ = readLine()
}
```
Memory analysis reveals the password string persists in memory even after explicit clearing attempts.
My Workaround
Developers must currently implement manual unsafe buffer management:
(A screenshot of the memory analysis showing no leak should appear here, but new users can only embed one media item per post, so you will have to trust me or test it yourself.)
```swift
import Foundation

print("Enter password: ", terminator: "")
if let cString = getpass("") {
    let length = strlen(cString)
    // Allocate a buffer and copy the password, including the NUL terminator
    let buffer = UnsafeMutablePointer<CChar>.allocate(capacity: length + 1)
    buffer.initialize(from: cString, count: length + 1)
    // ... use `buffer` for authentication, etc. ...
    // Zero out the buffer after use
    memset(buffer, 0, length + 1)
    buffer.deallocate()
    // Also zero out the original cString from getpass
    memset(UnsafeMutableRawPointer(mutating: cString), 0, length)
    print("Press Enter to exit...")
    _ = readLine()
}
```
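The pattern above could be wrapped once in a reusable type so that application code never touches raw pointers. The sketch below is purely illustrative — `SecureBytes` is a hypothetical name, not an existing Swift API. It zeroes its storage both on explicit `clear()` and in `deinit`; a production version would use `memset_s` or `explicit_bzero` instead of a plain loop, since the optimizer may elide ordinary stores to memory that is about to be freed.

```swift
/// Hypothetical sketch of the kind of API being proposed;
/// not part of the Swift standard library.
final class SecureBytes {
    private let storage: UnsafeMutableBufferPointer<UInt8>

    /// Copies `bytes` into owned storage; the caller should zero its own copy.
    init(copying bytes: [UInt8]) {
        storage = UnsafeMutableBufferPointer<UInt8>.allocate(capacity: bytes.count)
        _ = storage.initialize(from: bytes)
    }

    /// Grants scoped, read-only access to the secret.
    func withUnsafeBytes<R>(_ body: (UnsafeBufferPointer<UInt8>) throws -> R) rethrows -> R {
        try body(UnsafeBufferPointer(storage))
    }

    /// Overwrites the secret with zeros. A real implementation would use
    /// memset_s/explicit_bzero so the compiler cannot optimize the writes away.
    func clear() {
        for i in storage.indices { storage[i] = 0 }
    }

    deinit {
        clear()
        storage.deallocate()
    }
}

// Usage: copy the secret in, use it in a scoped closure, wipe as soon as done.
let secret = SecureBytes(copying: Array("hunter2".utf8))
let length = secret.withUnsafeBytes { $0.count }
print("secret length: \(length)")
secret.clear()
```

A class (rather than a struct) is a deliberate choice here: reference semantics prevent implicit copies of the buffer, and `deinit` gives a last-resort wipe even if the caller forgets to call `clear()`.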
This approach is error-prone and requires specialized knowledge, creating a barrier to secure development; manual buffer management increases the risk of bugs. I believe the standard library should provide an API for this.
Conclusion
This affects the security of the broader Apple application ecosystem: every Swift application I have tested leaks credentials, API keys, and other sensitive data in this way. macOS does provide a default protection against extracting sensitive data left in memory: the hardened runtime prevents task_for_pid(). However, developers do not always enable the hardened runtime for their applications, and even then the protection is not foolproof and can be bypassed or misconfigured. In addition, a process's memory can end up in core dumps, swap files, and hibernation files, and can be read by security scanners with elevated permissions.

In short, there are mitigations that make it harder to read a process's memory, but there is currently no solution for securely storing and clearing sensitive data in memory.
