I recently came across some very strange behavior in Swift.
Normally, when using a metatype, I can only reference the type's required initializers.
class A {
    init(_ val: Int) {}
}
let t = A.self
t.init(3) // error: constructing an object of class type 'A' with a metatype
          // value must use a 'required' initializer
This is expected.
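For contrast, if the initializer is marked required, the metatype call compiles, just as the rule predicts (class R is merely my illustration):
class R {
    required init(_ val: Int) {
        print(val)
    }
}
let r = R.self
r.init(3) // fine: every subclass of R must provide this initializer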
However, if my type is a subclass of UIView and I don't add or override any designated initializer, the compiler allows me to reference .init(), which is not marked required.
import UIKit

class A: UIView {}
let t = A.self
t.init() // valid, but unexpected.
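Generalized into a factory function (the function and its names are my own), it still compiles, which is presumably how innocent-looking code would run into this:
import UIKit

// Accepts any UIView subclass metatype and instantiates it.
func makeView(of type: UIView.Type) -> UIView {
    return type.init() // compiles even though init() is not 'required'
}
let view = makeView(of: A.self)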
This can even leave my program in an unsafe state:
class B: A {
    init(_ val: Int) { super.init(frame: .zero) }
    required init?(coder: NSCoder) { fatalError() }
}
let t: A.Type = B.self
t.init() // compiles, but crashes at runtime.
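A workaround that seems to restore safety (SafeA/SafeB are my names, and I haven't verified every modifier spelling) is to declare init() as required myself, so the compiler forces every subclass to keep it:
import UIKit

class SafeA: UIView {
    // Promote init() to a required initializer: subclasses can no longer drop it.
    required override init() {
        super.init(frame: .zero)
    }
    required init?(coder: NSCoder) { fatalError() }
}

class SafeB: SafeA {
    init(_ val: Int) { super.init() }
    // The compiler now insists on both of these:
    required init() { super.init() }
    required init?(coder: NSCoder) { fatalError() }
}

let safe: SafeA.Type = SafeB.self
safe.init() // accepted by the compiler, and no longer crashes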
So what is special about UIView.init() that makes the compiler treat it differently, even somewhat dangerously?
The only explanation I can think of is that UIView is an Objective-C class that ultimately inherits from NSObject as part of Cocoa Touch. Inheriting from UIView makes your class an @objc class rather than a plain Swift class. NSObject has a designated initializer init(), which gets called if no intervening derived class overrides it.
class C: NSObject {
    // required init(x: Int) {
    //     print(x)
    // }
}
class D: C {
    // required init(x: Int) {
    //     super.init(x: x)
    // }
}
let t = D.self
t.init() // error: constructing an object of class type 'D' with a metatype
         // value must use a 'required' initializer
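As a sanity check, a pure-Swift hierarchy with no NSObject ancestry behaves exactly as my first example predicts:
class E {
    init() {}
}
class F: E {}

let u = F.self
u.init() // error: constructing an object of class type 'F' with a metatype
         // value must use a 'required' initializer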
I still wonder whether there is a specific reason why NSObject.init is special. Maybe the compiler "assumes" that every normal @objc type "should" support init()?