I'm confused about the compiler's type inference for generics in Swift (and hence about the intended utility of generics), as I'm unable to cleanly handle a use case I would expect to be commonplace.

The code file below demonstrates my issue:

import Foundation

class Playground {}

class Home: Playground
{
    static let precision = Double( 1000 )
}

class Garden: Playground
{
    static let precision = Double( 10000 )
}

class Other: Playground {}

struct Num<T:Playground>
{
    typealias Base = Double
    let precision: Base?
    let original: Base
    let value: Base

    init( _ value: Base, _ precision: Base? = nil )
    {
        self.original = value
        self.precision = precision
        if let precision = precision
        {
            self.value = round( value * precision ) / precision
        }
        else
        {
            self.value = value
        }
    }

    init( _ value: Base ) where T == Home
    {
        self.init( value, Home.precision )
    }

    init( _ value: Base ) where T == Garden
    {
        self.init( value, Garden.precision )
    }

    static func + ( _ lhs: Num<T>, _ rhs: Num<T> ) -> Num<T>
    {
        return Num<T>( lhs.value + rhs.value )
    }

    static func + ( _ lhs: Num<T>, _ rhs: Num<T> ) -> Num<T> where T == Garden
    {
        return Num<T>( lhs.value + rhs.value )
    }

}

struct test
{

    init()
    {
        let home1 = Num<Home>( 1.0 )
        let home2 = Num<Home>( 2.0 )
        let oops = home1 + home2
        
        let garden1 = Num<Garden>( 1.0 )
        let garden2 = Num<Garden>( 2.0 )
        let eureka = garden1 + garden2
    }

}

The console debug output of test's init() is shown below:

(lldb) p eureka
(Testing.Num<Testing.Garden>) $R0 = (precision = 10000, original = 3, value = 3)
(lldb) p oops
(Testing.Num<Testing.Home>) $R2 = {
  precision = nil
  original = 3
  value = 3
}
(lldb) 

eureka has the correct precision (10000): the compiler used the explicit type Num<Garden> to choose the specialised + (where T == Garden), and then used the known T == Garden inside it to select the specialised init and apply the precision.

oops has a precision of nil: the compiler used the explicit type Num<Home> to choose the unconstrained + (since no T == constraint matched for +), but then selected the unconstrained init instead of utilising the fact that T == Home is known on this path.

I expected oops to be resolved at compile time to utilise the correct specialised type init, and yield the correct precision.

This can be fixed by introducing a constrained version of + for the case T == Home, after which oops has the correct precision. However, that is an unusable pattern: every concrete type of T must be explicitly catered for in every init (and in every related function that eventually propagates to an init - a combinatoric mess), which entirely defeats the utility of generics.
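Concretely, the extra overload looks like this (a reduced sketch of Num, keeping only the parts relevant to overload resolution; the contextual where clauses require Swift 5.3 or later):

```swift
class Playground {}
class Home: Playground { static let precision = Double(1000) }

struct Num<T: Playground> {
    let value: Double

    init(_ value: Double, _ precision: Double? = nil) {
        if let p = precision {
            self.value = (value * p).rounded() / p
        } else {
            self.value = value
        }
    }

    init(_ value: Double) where T == Home {
        self.init(value, Home.precision)
    }

    // The unconstrained + binds Num(...) to the unconstrained init,
    // so precision is lost for every T without its own overload.
    static func + (lhs: Num, rhs: Num) -> Num {
        Num(lhs.value + rhs.value)
    }

    // The per-type overload: with T == Home known in this context,
    // Num(...) now resolves to the constrained init above.
    static func + (lhs: Num, rhs: Num) -> Num where T == Home {
        Num(lhs.value + rhs.value)
    }
}
```

One such + overload is needed for every concrete T, which is the combinatoric mess described above.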

What am I missing - is this a known constraint of Swift, or is this not the Swift way to handle this pattern?

Inside the generic + function, the only thing known about T is that it is a subclass of Playground. The compiler can't propagate the fact that T == Home to the initializer, because it doesn't compile separate instantiations of functions the way C++ does with templates; it compiles one version that handles all valid types. There are only two + functions: one for Garden and one for everything else.
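A minimal sketch of that rule, with hypothetical names: overload resolution happens once, at compile time, against what the generic context knows about T.

```swift
protocol P {}
struct A: P {}

// Overload for the concrete type.
func describe(_ x: A) -> String { "concrete A" }
// Overload for anything conforming to P.
func describe<T: P>(_ x: T) -> String { "some P" }

// Inside a generic function only T: P is known, so the call to
// describe is bound, once, to the generic overload at compile time.
func inside<T: P>(_ x: T) -> String {
    describe(x)
}

print(describe(A()))  // "concrete A" - the call site sees the concrete type
print(inside(A()))    // "some P" - only T: P was known where the call was compiled
```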

I think the easiest solution would be to make precision a class variable on Playground and override it in the subclasses.
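Something like this (a sketch: precision becomes an overridable class property, and the generic init reads it through T):

```swift
class Playground {
    // Default: no rounding.
    class var precision: Double? { nil }
}
class Home: Playground   { override class var precision: Double? { 1000 } }
class Garden: Playground { override class var precision: Double? { 10000 } }

struct Num<T: Playground> {
    let value: Double

    init(_ value: Double) {
        // T.precision is looked up through the metatype, so each
        // subclass supplies its own value with no per-type overloads.
        if let p = T.precision {
            self.value = (value * p).rounded() / p
        } else {
            self.value = value
        }
    }

    // A single unconstrained + now suffices for every T.
    static func + (lhs: Num, rhs: Num) -> Num {
        Num(lhs.value + rhs.value)
    }
}
```

With this, Num<Home>(1.0) + Num<Home>(2.0) applies Home's precision through the one unconstrained +.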


Thanks for your feedback.

Could you expand on your proposed solution - are you suggesting a runtime decision?

This workaround restores sanity - thank you.