Use Swift's type system to restrict values

I would like to restrict a function input to the values of 1, 2, or 3. In the example below, the doSomething() function throws an error if the scale input is not 1, 2, or 3. But is there a way to handle this without having to define an enum for the error?

Is it possible to use Swift's type system to restrict scale so that the function will not even compile if other numbers are used? For example, if I write doSomething(x: 2.0, y: 5.0, scale: "one") then Xcode will not build the code because it expects an Int, not a String, for scale. Similarly, how can I prevent Xcode from building the code if scale is not 1, 2, or 3? Doing this would prevent the user from writing the wrong code to begin with.

enum ScaleError: Error {
    case outOfRange
}

func doSomething(x: Float, y: Float, scale: Int) throws -> Float {
    if scale != 1, scale != 2, scale != 3 {
        throw ScaleError.outOfRange
    }
    let z = (x + y) * Float(scale)
    return z
}

let result = try doSomething(x: 2.0, y: 5.0, scale: 5)
print("The result is \(result)")

One way to do it is to use an enum for the parameter type:

enum Scale: Float {
  case one = 1
  case two = 2
  case three = 3
}

func doSomething(x: Float, y: Float, scale: Scale) -> Float {
  return (x + y) * scale.rawValue
}

In places where you have an arbitrary numeric value that you need to validate, you can use Scale(rawValue: inputValue), which will return nil if the value is not one of the allowed values.
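
For example, a quick sketch of both uses, assuming the Scale enum and doSomething(x:y:scale:) above (inputValue is just a hypothetical value read at run time):

let ok = doSomething(x: 2.0, y: 5.0, scale: .two) // compiles; only .one, .two, .three exist

let inputValue: Float = 5.0
if let scale = Scale(rawValue: inputValue) {
  print(doSomething(x: 2.0, y: 5.0, scale: scale))
} else {
  print("\(inputValue) is not a valid scale") // taken here, since 5.0 is not 1, 2, or 3
}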

4 Likes

No, we don't have such "interval types" in Swift that would let you write, e.g.:

typealias T = 1.0 ... 3.0
let x: T = 3 // ✅
let y: T = 4 // 🛑

Maybe it's possible with macros?

Yeah, but that won't work if you want to limit the range to all floating-point values between 1 and 3.

1 Like

A macro might make it tidier, but nothing's stopping you from making your own type with an initializer that rejects values outside of a certain range.

As stated, the problem was to restrict the integer scale value to be one of 1, 2, or 3, and enums are a great fit for when you have a limited number of valid values. If you want to do some more elaborate validation, you can define a wrapper type that does that validation in its initializer:

struct Scale {
  private(set) var value: Float

  init?(value: Float) {
    if value < 1.0 || value > 3.0 { return nil }
    self.value = value
  }
}
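
A usage sketch under that definition (the doSomething signature here is just an assumption for illustration):

func doSomething(x: Float, y: Float, scale: Scale) -> Float {
  return (x + y) * scale.value
}

let valid = Scale(value: 2.5)   // Optional(Scale(value: 2.5)); any value in 1.0...3.0 is allowed
let invalid = Scale(value: 4.0) // nil; rejected at construction, before doSomething can run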
3 Likes

Right, although the gist of the OP's question was to make it "not even compile if other numbers are used". To make this work with Floats (yes, I remember that the OP was asking about Ints, but we could generalise it to floats), the only ways to go would be macros or a language/compiler change.

Interval types are not unprecedented in other languages (Pascal had them, IIRC).

Says who? This example, also thanks to @Joe_Groff, shows that floating-point numbers can be represented as type parameters in a generic signature: swift/test/Generics/rdar33654588.swift at main · apple/swift · GitHub

Implementing interval arithmetic is left as an exercise for the reader...

5 Likes

Sorry, I don't understand the linked example. How would you use it to make "foo(_ scale: Double)" accept only values between 1 and 3 at compile time?

I like the suggestion about using a Scale enum but it doesn't enable the compiler to enforce the desired values for the scale input parameter. I think what @Slava_Pestov is suggesting would be to create a protocol that would enforce the restricted values for the scale parameter.

I'm not sure what you mean. It is impossible for Scale to be anything other than a valid value.

4 Likes

enforce the restricted values for the scale parameter

Can you share the code snippet that shows Scale? I'm curious which ways you have tried to represent Scale so far.

I'm using the following based on the previous suggestion:

enum Scale: Int {
  case one = 1
  case two
  case three
}

And how about the case where we could pass a value that is not 1, 2, or 3? Is it like this?

enum Scale: Int {
  case one = 1
  case two
  case three
  case four
}

Unless the accepted cases can be described in the method signature as belonging to their own type, I don't think the compiler can represent a subset of cases using the whole type, which also includes a 'restricted' case. When the compiler resolves the method signature it verifies the argument's type, and in this case that type is Scale, which includes more than just case one, case two, and case three.

So I think what you might want is a nested enum that describes only case one, case two, and case three. I can't recall whether a protocol can work on a case-by-case basis. Once the accepted cases are constrained to their own type, that type is what can go in the method signature.

So basically, making a compile-time promise that the parameter is one of the accepted cases requires those cases to live in their own type, and that type to be used in the method signature.

import XCTest

enum Scale: Int {
    enum Accepted: Int {
        case one = 1
        case two = 2
        case three = 3
    }
    case four
}

public class ScaleTests: XCTestCase {
    func doThing(x: Int, y: Int, val: Scale.Accepted) {
        print(x + y + val.rawValue)
    }

    func test_the_thing() {
        doThing(x: 1, y: 2, val: .four) // Type 'Scale.Accepted' has no member 'four'
    }
}

Note: I am not a language/compiler expert and this is just some armchair analysis.

Funny how nobody remembers it anymore. I loved this Pascal feature and used it a lot too. Pascal's interval types were also used in array declarations to restrict the index values.

So in Swift it might hypothetically look like this:

typealias DayOfWeek = 1...7 // range as a type, but probably not the best syntax

let dowNames: [DayOfWeek: String] = [ /* ... */ ]

let weekend: [DayOfWeek] = [6, 7]

let tuesday = DayOfWeek(2)! // returns nil if out of range

It seems to me that the generalisation of this would be a compile-time assertion: it would be a compile-time error if the test either evaluated to false or could not be evaluated (so using a variable known only at run time would cause a compile-time failure). It would therefore never generate any run-time code.

Would this need an addition to the compiler, or can it be done with macros (I'm not at all familiar with them)?

This is not a type-level solution, but it may be helpful. I solved a similar task using a UInt8 type clamped to values in the range 1...3. Values less than 1 are turned into 1 and values greater than 3 are turned into 3, without throwing an error.
A macro could make this even better - values could be validated to be in the range 1...3 at compile time, like the URL macro does.
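
A minimal sketch of that clamping approach (the name ClampedScale is mine, not from the post):

struct ClampedScale {
    let value: UInt8

    init(_ raw: UInt8) {
        // Clamp instead of throwing: below 1 becomes 1, above 3 becomes 3.
        self.value = min(max(raw, 1), 3)
    }
}

print(ClampedScale(0).value) // 1
print(ClampedScale(2).value) // 2
print(ClampedScale(9).value) // 3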

1 Like

Sorry, my suggestion was tongue-in-cheek. I don't recommend encoding floating point numbers as types.

3 Likes

1++. Beyond their usefulness for anything else, interval types could reduce an app's memory footprint, for example:

typealias IntX = Int.min + 1 ... Int.max

var x: [Int?]  = ... // 16 bytes per element
var y: [IntX?] = ... // 8 bytes per element

This would be a compilation error:

_ = dowNames[100]!     // 🛑 Compilation error
let x: DayOfWeek = 100 // 🛑 Compilation error
1 Like

To elaborate for those who might be confused by these numbers, this is referring back to Memory-optimal collections of Optionals. Giving up even just a single value in the Element's space allows (in principle) Optional to use that value to represent nil, saving it from needing a separate boolean flag that would otherwise double the size of each Array element, because of the need to maintain correct alignment for Element.
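
To make the numbers concrete, MemoryLayout shows the effect (my illustration; values are for a typical 64-bit platform):

MemoryLayout<Int>.size    // 8
MemoryLayout<Int?>.size   // 9, the extra byte is the "is nil" flag
MemoryLayout<Int?>.stride // 16 due to alignment, hence 16 bytes per Array element
MemoryLayout<Bool?>.size  // 1, Bool has spare bit patterns that Optional can use for nil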

Not just limited, but very small, because defining explicit enum cases for e.g. the full range of Int would be not just comically inefficient in size (on the order of a hundred billion billion bytes of source code) but practically guaranteed never to compile (due to super-linear compile times for enums based on their member count).

Even a more "plausible" scenario like a mere Int8-sized thing would be many kilobytes of code just for the cases, not even counting the various helper methods to convert to actual numeric values. So we're talking tens of kilobytes of source to express the notion typealias DomainSpecificInt = UInt8 where Self ~= 0...250.

And even if all of that isn't a showstopper, the ergonomics of enums for numeric values are horrible - you have to manually conform them to all the relevant numeric protocols, which is a lot of surface area (just try implementing your own FixedWidthInteger and see how many dozens and dozens of methods & properties that requires).

2 Likes

I don't think there is a way to do this without using an enum. It makes sense, because the compiler needs a type. I'm pretty sure "not 1 or 2 or 3" is as much not a type as "is 1 or 2 or 3".

It can also be done via a struct with static constants, or an OptionSet, e.g.:

public struct Scale {
  private let value: UInt8

  private init(_ value: UInt8) { self.value = value }

  public static let one = Self.init(1)
  public static let two = Self.init(2)
  public static let three = Self.init(3)
}

The problem is that you are forced to use lexical names instead of numbers; these types are not ExpressibleByIntegerLiteral. I mean that this way you need to write foo(scale: .one) instead of foo(scale: 1), where 1 would be checked at compile time to be in the range 1...3.
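
For illustration only: the conformance could be added, but it merely recovers the foo(scale: 1) spelling; the range check still happens at run time, not compile time. This sketch assumes the extension lives in the same file as Scale, so it can reach the private initializer:

extension Scale: ExpressibleByIntegerLiteral {
  public init(integerLiteral value: UInt8) {
    // Traps at run time; the compiler cannot reject an out-of-range literal here.
    precondition((1...3).contains(value), "scale must be 1, 2, or 3")
    self.init(value)
  }
}

let ok: Scale = 2  // compiles and runs
let bad: Scale = 4 // also compiles, but traps at run time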

2 Likes