I'm trying to use Codable to checkpoint the variables mutated while training a machine learning model.
I'm running into a limitation: I would prefer that a model define checkpointing only partially, and have a mutating func load(from decoder: Decoder) throws function that loads from a checkpoint. This is because part of the model is constant after construction, and another part is mutated during training.
Generally, what I'm interested in is a notion of PartiallyCodable, which serializes out just part of the data, and restores it by mutating a pre-constructed prototype in place using data from a decoder.
This should be useful in any process where you have a relatively cheap initialization step, and an expensive but restricted update step.
Init: (Config) -> Model
Train: (Model) -> Model
Here, Init would be evaluated again in a program that is restoring from the checkpoint, and the effect of Train would be recovered from a file.
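To make the idea concrete, here is a minimal sketch of what such a protocol and its use could look like. Everything here is an assumption for illustration: the protocol shape, the Config/Model types, the box types that bridge to JSONEncoder/JSONDecoder (which only accept Codable types), and the trick of passing the prototype through the decoder's userInfo.

```swift
import Foundation

// Sketch of a hypothetical "PartiallyCodable" protocol — the name and shape
// are assumptions, not an existing standard-library API.
protocol PartiallyCodable {
    // Serialize only the mutable (training) state.
    func encodePartial(to encoder: Encoder) throws
    // Restore that state by mutating a pre-constructed instance in place.
    mutating func load(from decoder: Decoder) throws
}

struct Config {
    let layerSize: Int
}

struct Model: PartiallyCodable {
    // Constant after construction; recomputed by Init on restore, never serialized.
    let layerSize: Int
    // Mutated by Train; the only state that goes into the checkpoint.
    var weights: [Double]

    init(config: Config) {            // Init: (Config) -> Model
        layerSize = config.layerSize
        weights = Array(repeating: 0.0, count: config.layerSize)
    }

    private enum CodingKeys: String, CodingKey { case weights }

    func encodePartial(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(weights, forKey: .weights)
    }

    mutating func load(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        weights = try container.decode([Double].self, forKey: .weights)
    }
}

// JSONEncoder/JSONDecoder only speak Codable, so small wrappers bridge the gap.
struct PartialEncodeBox<T: PartiallyCodable>: Encodable {
    let value: T
    func encode(to encoder: Encoder) throws { try value.encodePartial(to: encoder) }
}

extension CodingUserInfoKey {
    static let prototype = CodingUserInfoKey(rawValue: "prototype")!
}

struct PartialDecodeBox<T: PartiallyCodable>: Decodable {
    var value: T
    init(from decoder: Decoder) throws {
        // The pre-constructed prototype is smuggled in via userInfo.
        guard var proto = decoder.userInfo[.prototype] as? T else {
            throw DecodingError.valueNotFound(T.self, .init(
                codingPath: decoder.codingPath,
                debugDescription: "no prototype supplied in userInfo"))
        }
        try proto.load(from: decoder)
        value = proto
    }
}

// Round trip: checkpoint a trained model, then restore into a freshly Init-ed prototype.
let config = Config(layerSize: 3)
var model = Model(config: config)
model.weights = [0.1, 0.2, 0.3]                 // stand-in for Train's effect

let data = try! JSONEncoder().encode(PartialEncodeBox(value: model))

let fresh = Model(config: config)               // cheap re-run of Init
let decoder = JSONDecoder()
decoder.userInfo[.prototype] = fresh
let restored = try! decoder.decode(PartialDecodeBox<Model>.self, from: data).value
assert(restored.weights == model.weights)
```

The userInfo detour is one workaround for the fact that Decodable's init(from:) must construct a fresh value; a first-class PartiallyCodable would presumably let the decoder mutate the prototype directly instead.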
My question is, are there other circumstances that would benefit from this general notion of "PartiallyCodable"?