#expect(throws:) on enum with associated value?

I'd like to write a test that expects this error:

enum Errors: Error, Equatable {
	case invalidToken(String, String.Index)
}

like so:

#expect(throws: Lexer.Errors.invalidToken)
{
	try lexer.tokenize(s)
}

I also tried appending .self, as shown in the docs, but then I get: Type '@Sendable (String, String.Index) -> Lexer.Errors' cannot conform to 'Equatable' (and 'Error').

I don't think there's any way to express this in Swift, is there?

What is the example doing in this section? It doesn’t show the definition of PizzaToppings.InvalidToppingError.

The .self is used to refer to the error type itself, not to a particular case or its constructor. So #expect(throws: Lexer.Errors.self) should work.

You can also match a specific value with #expect(throws: Lexer.Errors.invalidToken(s, s.startIndex)), but there currently isn't a way to use wildcards (_) the way you can in a switch statement.
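You can approximate the wildcard match today with a plain do/catch, since catch clauses accept the same patterns a switch does. A minimal sketch, using a hypothetical Lexer standing in for the original poster's type:

```swift
// Hypothetical stand-in for the thread's Lexer type.
enum Lexer {
    enum Errors: Error, Equatable {
        case invalidToken(String, String.Index)
    }

    static func tokenize(_ s: String) throws -> [String] {
        // Always fails, just to exercise the error path.
        throw Errors.invalidToken(s, s.startIndex)
    }
}

let s = "-3/8"
var sawInvalidToken = false
do {
    _ = try Lexer.tokenize(s)
} catch Lexer.Errors.invalidToken(_, _) {
    // Wildcards work here exactly as in a switch.
    sawInvalidToken = true
} catch {
    // Any other error: leave the flag false.
}
print(sawInvalidToken)  // true
```

This loses the nicer failure reporting of #expect(throws:), but it lets you ignore the associated values entirely.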


In Swift 6.1, it looks like #expect(throws:) returns the error it caught. Combined with something like swift-case-paths, you can get something like:

let error = #expect(throws: Lexer.Errors.self) {
  try lexer.tokenize(s)
}

#expect(error?.is(\.invalidToken) == true)

You don't actually ever need _. This is still horrible, but I don't think it's worse than case paths:

// 1 "line"
#expect({
  if case .invalidToken = (#expect(throws: Lexer.Errors.self) { try lexer.tokenize(s) })
  { true } else { false }
}())

// 2 "lines"
let error = #expect(throws: Lexer.Errors.self) { try lexer.tokenize(s) }
#expect({
  if case .invalidToken = error { true } else { false }
}())
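If you check the same case in several tests, the if-case dance can be hidden behind a small helper. A sketch, again against a hypothetical Lexer (the helper name and the extra case are assumptions, not part of swift-testing):

```swift
// Hypothetical Lexer matching the thread's example,
// with a second case so the match is non-trivial.
enum Lexer {
    enum Errors: Error, Equatable {
        case invalidToken(String, String.Index)
        case unexpectedEnd
    }

    static func tokenize(_ s: String) throws -> [String] {
        throw Errors.invalidToken(s, s.startIndex)
    }
}

// Wraps the "if case ... { true } else { false }" pattern once,
// so test bodies read as a single boolean check.
func isInvalidToken(_ error: Lexer.Errors?) -> Bool {
    if case .invalidToken = error { return true }
    return false
}

// In a test this would read:
//   let error = #expect(throws: Lexer.Errors.self) { try lexer.tokenize(s) }
//   #expect(isInvalidToken(error))
var caught: Lexer.Errors?
do { _ = try Lexer.tokenize("-3/8") }
catch let e as Lexer.Errors { caught = e }
catch {}
print(isInvalidToken(caught))  // true
```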

As it turns out, this is what I needed to do anyway (I was taking the resulting values and testing them separately, but this is much tidier). For example:

s = "‑3⁄8"
let error = #expect(throws: Lexer.InvalidTokenError.self)
{
    tokens = try lexer.tokenize(s)
}
#expect(error?.source == "‑3⁄8")
#expect(error?.location == error?.source.startIndex)

vs

s = "‑3⁄8"
#expect(throws: Lexer.Errors.invalidToken(source: s, range: s.startIndex ..< s.index(after: s.startIndex)))
{
    tokens = try lexer.tokenize(s)
}
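For reference, the thread never shows the struct-based InvalidTokenError. A sketch of what it might look like, with the property names inferred from the usage above (source, location):

```swift
// Hypothetical definition matching the usage above.
enum Lexer {
    struct InvalidTokenError: Error, Equatable {
        let source: String
        let location: String.Index
    }

    static func tokenize(_ s: String) throws -> [String] {
        // Always fails, to exercise the error path.
        throw InvalidTokenError(source: s, location: s.startIndex)
    }
}

var error: Lexer.InvalidTokenError?
do { _ = try Lexer.tokenize("-3/8") }
catch let e as Lexer.InvalidTokenError { error = e }
catch {}

print(error?.source == "-3/8")                      // true
print(error?.location == error?.source.startIndex)  // true
```

A struct error carries its payload as named properties, so each field can be tested separately without pattern matching, which is what makes the #expect(error?.source == ...) style above possible.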