Explicitly define generic type in struct declaration

I'm trying to create a Counter type in Swift like the collections.Counter class in Python.

Or, as a simple example:

>>> Counter("hello")
Counter({'l': 2, 'h': 1, 'e': 1, 'o': 1})
>>> Counter([1, 2, 3, 2])
Counter({2: 2, 1: 1, 3: 1})

Here's what I got in Swift:

import Foundation

struct Counter<T: Hashable>: CustomStringConvertible {
    var description: String {
        return dict.description
    }
    
    var dict: [T: Int] = [:]
    
    init(_ val: String) {
        for char in val {
            self.dict[String(char), default: 0] += 1
        }
    }
    
    init(_ val: Array<T>) {
        for elem in val {
            self.dict[elem, default: 0] += 1
        }
    }
    
    func elements() -> [T] {
        return Array(dict.keys)
    }
    
    mutating func clear() {
        dict = [:]
    }
}

The issue is that the compiler reports this error on the String initializer:

Cannot convert value of type 'String.Element' (aka 'Character') to expected argument type 'T'

So while this type works just like the Python one for arrays, it's clunky with Strings. I can force it with a cast:

    init(_ val: String) {
        for char in val {
            self.dict[String(char) as! T, default: 0] += 1
        }
    }

But now, to use the Counter with a String, I must do:

// Without the <String>, it says "Generic parameter 'T' could not be inferred"
print(Counter<String>("hello"))
print(Counter([1,2,3,2]))

Is there a way to explicitly define T as a String in the init instead of during every usage, so that I can use Counter("hello")?

Yes. You can constrain that initializer to only be available when T == String:

    init(_ val: String) where T == String {
        for char in val {
            self.dict[String(char), default: 0] += 1
        }
    }
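With that constraint in place, `Counter("hello")` infers `T == String` on its own. Here's a minimal, compilable sketch of the full type with the constrained initializer (same members as the original, trimmed to the essentials):

```swift
struct Counter<T: Hashable>: CustomStringConvertible {
    var dict: [T: Int] = [:]

    var description: String { dict.description }

    // Only available when T == String, so `Counter("hello")`
    // infers Counter<String> without an explicit type argument.
    init(_ val: String) where T == String {
        for char in val {
            dict[String(char), default: 0] += 1
        }
    }

    init(_ val: Array<T>) {
        for elem in val {
            dict[elem, default: 0] += 1
        }
    }
}

let c = Counter("hello")       // inferred as Counter<String>
print(c.dict["l"] ?? 0)        // 2
let d = Counter([1, 2, 3, 2])  // inferred as Counter<Int>
print(d.dict[2] ?? 0)          // 2
```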

Awesome, thank you!


Another option would be the following:

init<S: Sequence>(_ sequence: S) where S.Element == T {
    for elem in sequence {
        self.dict[elem, default: 0] += 1
    }
}

This accepts both Array and String (and other sequences) without the need to specify the type.
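For reference, a compilable sketch of the type using only this one initializer, which covers String, Array, and any other Sequence:

```swift
struct Counter<T: Hashable>: CustomStringConvertible {
    var dict: [T: Int] = [:]

    var description: String { dict.description }

    // A single generic initializer accepts any Sequence whose
    // elements match the key type T.
    init<S: Sequence>(_ sequence: S) where S.Element == T {
        for elem in sequence {
            dict[elem, default: 0] += 1
        }
    }
}

let c = Counter("hello")       // inferred as Counter<Character>
print(c.dict["l"] ?? 0)        // 2
let d = Counter([1, 2, 3, 2])  // inferred as Counter<Int>
print(d.dict[2] ?? 0)          // 2
```

Note that the String case now infers `Counter<Character>`, since `String.Element` is `Character`.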


Yes, but note that your code has different semantics than @winstonp's code. The original code creates a dictionary of type [String: Int] when initialized with a string, whereas your solution creates a dictionary of type [Character: Int].

(I actually prefer the latter because it's more consistent to use Character as the key type, but maybe @winstonp has specific requirements for it to be a String).
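If String keys are required with the generic Sequence initializer, one way to recover them is to map the characters to strings before counting. A small sketch illustrating the key-type difference:

```swift
struct Counter<T: Hashable> {
    var dict: [T: Int] = [:]

    init<S: Sequence>(_ sequence: S) where S.Element == T {
        for elem in sequence {
            dict[elem, default: 0] += 1
        }
    }
}

// Passing the String directly yields Character keys...
let byChar = Counter("hello")
print(type(of: byChar.dict))    // Dictionary<Character, Int>

// ...while mapping each Character to a String yields String keys.
let byString = Counter("hello".map(String.init))
print(type(of: byString.dict))  // Dictionary<String, Int>
```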


Good point, I didn’t notice that!
