You can either pattern-match a specific enum case directly or downcast to the enum type first:
```swift
enum E: Comparable {
    case a(String)
}

func testE() {
    let c: any Comparable = E.a("Hello")
    if case E.a(let s) = c {
        print(s)
    }
    if let e = c as? E {
        switch e {
        case .a(let s): print(s)
        }
    }
}
```
See e.g., https://goshdarnifcaseletsyntax.com
More generally, there was an interesting article posted that seems on point and tracks your intent:
https://swiftology.io/articles/tydd-part-2/
(related to another forum post: https://forums.swift.org/t/se-0427-noncopyable-generics/70525/146)
The gist is that rather than merely validating, you "parse" the input to produce a new type wrapping what you learned, in this case going from Data to some richer Result type, with the type capturing what you learned so it never has to be re-learned. The type can be used as a wrapper or as a co-located token, and it lets you encode requirements in your API, making any mistake a compile-time error.
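As a minimal sketch of that "parse, don't validate" idea (the GreetingMessage type and its "name" key are hypothetical, not from your code): the failable initializer is the only way to obtain the type, so holding a value is itself proof the Data was valid.

```swift
import Foundation

// Hypothetical example: the parsed type proves the bytes were valid JSON
// containing a "name" string, so downstream code never re-checks.
struct GreetingMessage {
    let name: String

    // Parsing either yields the richer type or fails; there is no
    // separate "validated" flag anyone could forget to check.
    init?(parsing data: Data) {
        guard let object = try? JSONSerialization.jsonObject(with: data),
              let dict = object as? [String: Any],
              let name = dict["name"] as? String
        else { return nil }
        self.name = name
    }
}

// Any function taking GreetingMessage can rely on the invariant.
func greet(_ message: GreetingMessage) -> String {
    "Hello, \(message.name)"
}
```

Callers then write `if let m = GreetingMessage(parsing: data) { print(greet(m)) }`, and the compiler rejects any path that hands unparsed Data to greet.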
That would suggest your ExampleResult protocol should instead be an enum. Assuming you peek at the data or use out-of-band information to determine the type, you'd construct different enum cases for the different known types (and perhaps a fall-back case for unknown ones), each referring to the as-yet-unprocessed Data (ideally in a way that avoids copying). If the data lives in an enum associated value, then that value's type has to be known to the module defining the enum.
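A sketch of what that enum might look like (the case names and the first-byte peek are illustrative assumptions, not your actual formats):

```swift
import Foundation

// Hypothetical example: peek at the first byte to choose an enum case.
// All payload types must be visible to the module defining the enum.
enum PeekedResult {
    case json(Data)     // looks like a JSON object or array
    case text(String)   // decodable as UTF-8 text
    case unknown(Data)  // fall-back: carry the raw bytes forward

    init(peeking data: Data) {
        switch data.first {
        case UInt8(ascii: "{"), UInt8(ascii: "["):
            self = .json(data)
        default:
            if let s = String(data: data, encoding: .utf8) {
                self = .text(s)
            } else {
                self = .unknown(data)
            }
        }
    }
}
```

Note the .json and .unknown cases hold the Data itself rather than a copy, deferring the real parsing to whoever consumes the case.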
If NetworkResult really just ferries Data, it's not clear the protocol helps; you'd end up with no protocols, just the enum. Your input and output parsing could then be distributed among many components, even in different modules, but they would all see all the enum cases, and you'd need a new release of the enum to support new types of data explicitly.
Another approach when parsing is to have different modules contribute handlers that can peek at the data and either handle it or demur. All the validation and down-casting then happens in the contributing modules, and the vectoring module remains relatively oblivious. Merely handling would be a one-step dance, but you could also build a multi-stage process in which recognizers produce tokens used to coordinate and schedule work (e.g., buffering).
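The handle-or-demur scheme could be sketched like this (Dispatcher and DataHandler are hypothetical names, and the registered recognizers are toy examples):

```swift
import Foundation

// Hypothetical example: contributing modules register handlers; each one
// peeks at the data and either handles it (returns true) or demurs
// (returns false). The vectoring module tries them in order and stays
// oblivious to the concrete formats.
struct DataHandler {
    let name: String
    /// Returns true if this handler recognized and consumed the data.
    let handle: (Data) -> Bool
}

final class Dispatcher {
    private var handlers: [DataHandler] = []

    func register(_ handler: DataHandler) {
        handlers.append(handler)
    }

    /// Offers the data to each handler until one accepts; returns the
    /// accepting handler's name, or nil if all demurred.
    func dispatch(_ data: Data) -> String? {
        for handler in handlers where handler.handle(data) {
            return handler.name
        }
        return nil
    }
}
```

Registration order becomes the priority order, and returning nil from dispatch is the natural hook for the unknown-data fall-back; a multi-stage variant would have handle return a token instead of a Bool.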