Idea for declaring expected types and value ranges

It may be nice to have a keyword that tells the compiler to try to statically prove that certain conditions are true after some point, without needing to leave the scope or wrap the code in an if let or something else that happens at runtime. That way the compiler could check the validity of the assertion entirely at compile time.

For example, imagine using something like this to unwrap an optional beyond a certain point in the function body:

func doSomething() -> Int {
  var something: Int? = nil

  if something == nil {
    something = 42
  }

  // hypothetical keyword: the compiler must prove `something` is non-nil here
  require something: Int

  return something * 2
}

The compiler would be expected to prove that, at the “require” line, “something” is a non-optional “Int”. If it cannot prove that, it is a compile-time error. If it can, the type of the variable is effectively changed from that point on: the compiler treats every subsequent use of “something” as “something!”, but since it has proven the unwrap can never fail, it can skip the usual runtime checks, and the programmer has one less piece of state to keep in mind.
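
(For comparison, the closest approximation in today’s Swift is probably a runtime rebinding, e.g. a shadowing “guard let”. This is a sketch of the current workaround that “require” would make unnecessary, with hypothetical names:)

func doSomethingToday() -> Int {
  var maybe: Int? = nil

  if maybe == nil {
    maybe = 42
  }

  // Runtime rebinding: from here on, `something` is a plain Int,
  // but the nil check still happens at run time.
  guard let something = maybe else {
    fatalError("unreachable: maybe was just assigned")
  }

  return something * 2
}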

This idea could potentially be extended in other ways if rules could be specified in a flexible enough way. For example, imagine being able to declare a constraint on the range of a basic value type:

require x in 0...1
let result = someValue * x

Ideally, the compiler would then attempt to prove that x must always be in the range 0...1 at this point. If it cannot prove this, it would be a compile-time error.
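
(To sketch the intended semantics, using the hypothetical syntax above: a value that is provably in range would pass, and one that isn’t would be rejected at compile time.)

// Semantics sketch only; `require ... in` is hypothetical, not valid Swift.
// Assume `input` is some arbitrary runtime Float.
let a = max(0.0, min(1.0, input))
require a in 0...1    // OK: a is provably clamped to 0...1

let b = input * 2.0
require b in 0...1    // compile-time error: no proof that b stays in 0...1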

Going further, you could use the same thing on type declarations themselves, so you could have a Float that can only contain values in 0...1:

var scaler: Float in 0...1

Or build a function that has constrained input ranges:

func scale(x: Float in 0...1) -> Float {
  return someValue * x
}
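
(Until something like this exists, one runtime approximation is a wrapper type that validates in its initializer. `UnitFloat` and `someValue` here are hypothetical names for illustration, and the check happens at run time, not compile time:)

// A runtime stand-in for a range-constrained Float.
let someValue: Float = 10  // stand-in for the someValue in the examples above

struct UnitFloat {
  let value: Float

  // Fails (returns nil) instead of crashing when out of range.
  init?(_ value: Float) {
    guard (0.0...1.0).contains(value) else { return nil }
    self.value = value
  }
}

func scale(x: UnitFloat) -> Float {
  return someValue * x.value
}

A UnitFloat can then only ever hold an in-range value: UnitFloat(0.5) succeeds, while UnitFloat(2) returns nil.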

I like the idea of this always being a compile-time check when possible, but it could also be useful to force the requirement to be evaluated at runtime by adding a force operator to it:

func scale(x: Float in! 0...1) -> Float {
  return someValue * x
}

In this scenario the compiler would still try to prove the requirement statically, and if it could, it would skip the runtime checks. But if it could not, then instead of reporting a compile-time error it would insert runtime guards that catch requirement failures and crash as necessary, similar to a forced unwrap.
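
(The inserted guard would presumably amount to a precondition at function entry. This is a sketch of the observable behavior, not of how the compiler would implement it; `someValue` is again a hypothetical stand-in:)

// Behavioral sketch of what `func scale(x: Float in! 0...1)` might
// compile to when the range cannot be proven statically.
let someValue: Float = 10

func scale(x: Float) -> Float {
  precondition((0.0...1.0).contains(x), "x must be in 0...1")
  return someValue * x
}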

l8r
Sean

How is this different from guard or precondition()?

func f(x: Int) {
    guard (0...1).contains(x) else {
        return
    }
    // use x
}

func f(x: Int) {
    precondition((0...1).contains(x))
    // use x
}

Pozdrawiam – Regards,
Adrian Kashivskyy

Part of this suggestion seems to be achievable using `guard`. Some, however, isn't.

Folks may be interested in reading about Refinement Types.

The primary difference is that I’m suggesting the checks would occur at compile time rather than at run time (unless forced to run time with !). The idea of applying range rules (or perhaps other kinds of rules) to values is something that, I think, would be quite useful in many applications and is perhaps a different concept.
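
(To illustrate the distinction: guard and precondition always execute a check at run time, whereas the proposed require would compile to nothing when the proof succeeds and to a diagnostic when it fails. The function `g` below is hypothetical:)

// Runtime check: executes on every call; the caller gets no
// compile-time feedback about bad arguments.
func f(x: Int) {
    precondition((0...1).contains(x))
    // use x
}

// Hypothetical static check (not valid Swift): zero runtime cost
// when provable, a compile-time error at the call site when not.
// func g(x: Int in 0...1) { /* use x */ }
// g(x: 0)    // OK: the literal 0 is provably in 0...1
// g(x: 2)    // error: 2 is not in 0...1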

l8r
Sean
