Generalization of Implicit Conversions

I like this proposal overall.

From a library designer's perspective, I would like to have the ability to declare an initializer as @explicit, so the user would be required to specify the type.

extension ToType {
	@explicit init(from: FromType) {
		self.init(from)
	}
}

let anInstance = FromType() as ToType
let anInstance: ToType = FromType() // error: Explicit conversion required

What benefit would @explicit give over the current conversion idiom of directly using an initializer?

let anInstance = ToType(FromType())
8 Likes

I'm quite cold to the idea of allowing a library to vend implicit conversions for types other than its own. But I'm not sure what to do about that.

I agree. I’m uneasy with the overall idea of unregulated implicit conversions, but even more so with implicit conversions not being controlled by the type's author. I suppose the latter concern could be solved fairly easily by requiring that @implicit inits appear within the type declaration (and not in extensions thereof). There’s also precedent for this with the property-wrapper special inits init(wrappedValue:) and init(projectedValue:).
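
For illustration, the restriction might look something like this (Meters is a made-up type, and @implicit is the pitch's hypothetical attribute):

struct Meters {
    var value: Double

    // OK under this restriction: declared inside the type body,
    // so the author of Meters opts into the conversion.
    @implicit init(_ value: Double) {
        self.value = value
    }
}

extension Double {
    // error under this restriction: @implicit inits would not be
    // allowed in extensions, only in the original type declaration.
    @implicit init(_ meters: Meters) {
        self = meters.value
    }
}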

2 Likes

I'm sympathetic to these concerns, though part of the fun is for developers to be able to define implicit conversions on [standard] library types in their own codebase. As a compromise, perhaps a restriction could be introduced that an @implicit public init (i.e. one that is exported) can only be defined on a type in the same module.

This creates problems for representing the CGFloat <-> Double conversion, as it is bi-directional and the types span two modules, but that seems to be the exception rather than the rule; it could either be special-cased or continue to use the existing hard-coded implementation.
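
A rough sketch of that same-module compromise (Handle and the module names are invented for illustration):

// In module MyLibrary, which declares Handle:
public struct Handle {
    public var rawValue: Int32
    // OK: an exported implicit conversion, defined on a type in the same module.
    @implicit public init(_ rawValue: Int32) { self.rawValue = rawValue }
}

// In a client module:
extension Int32 {
    // Fine for local convenience, because it is internal and never exported.
    @implicit init(_ handle: Handle) { self = handle.rawValue }
    // Marking it public would be rejected under the restriction, since Int32
    // is declared in the standard library rather than in this module.
}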

I think that implicit conversions should only be supported as part of subtype relationships, and it's the responsibility of the author of a type to define its supertypes. Given that, I think it makes sense to require the subtype definition to be at the type declaration, like we require for subclasses. Apart from CGFloat <-> Double (which I don't think should be generalised further), are there any use cases where we'd want bidirectional conversions?

Additionally, how does your proposed solution interact with as? casts? For example, would you expect:

let x: Int32 = 0
let y = x as Any
let z = y as? Int

to work if there's an implicit conversion from Int32 to Int?

2 Likes

It's an implementation detail of the prototype toolchain I've been using to test these ideas out that prompting a conversion using an as does not work. Whether it should is an open question. Is it important to be able to use as when you can call the initialiser instead?

I'm specifically referring to dynamic casts with as?, rather than type coercion with as. For example, this should work with any implicit conversions:

func takesAnUnsafeRawPointer(_ x: UnsafeRawPointer) { _ = x }
let p = UnsafeMutableRawPointer(bitPattern: 0xFFFF)! // init(bitPattern:) is failable
takesAnUnsafeRawPointer(p)

but I would also argue that:

func someOtherFunction(_ x: Any) {
    if let x = x as? UnsafeRawPointer {
        takesAnUnsafeRawPointer(x)
    }
}

let p = UnsafeMutableRawPointer(bitPattern: 0xFFFF)!
someOtherFunction(p)

should work; i.e. takesAnUnsafeRawPointer should be called.

It may be reasonable to expect that it should, but that would require a change to the part of the Swift runtime that implements as?, which does not have enough information available in the binary to realise on the fly that the conversion exists. Until now I've only been viewing this as a feature implemented at compile time.

This seems an awkward solution for a pitch with a goal of removing special cases.

1 Like

Very droll. It is a special case as it is bi-directional across modules. My interest in this pitch is to extend the functionality of the compiler in a systematic manner to allow for new implicit conversions to be added in a lightweight way without having to implement them in C++. I'm less interested in going back and rewriting existing functionality such as CGFloat <-> Double as it is such a unique case. One wonders why they are two distinct types at all.

Responding to the overall pitch, I don’t see the value of user-defined implicit conversions. So could folks share some examples, outside of the Standard Library, where the benefits of implicit conversions would outweigh their compile-time penalty and added complexity?

I say outside the Standard Library, because I consider the Standard Library a part of the Swift language. That’s important because implicit conversions could still be easily written in Swift with a hidden attribute, and reserved for language and system-frameworks use.

Of course, one may attribute implicit conversions’ complexity to the lack of a standardized feature. But I think we can agree that proper documentation is a much more significant factor in users’ understanding of the feature.

To clarify, I’m not against all custom, user-defined subtype relationships. For example, we could implement user-defined variance, to follow in the footsteps of Scala, Kotlin and many other languages:

// Covariance means that for any subtype `Sub` of Value,
// Container<Sub> is a subtype of Container<Value>.
struct Container<covariant Value> {
  let value: Value
}

var container = Container<Int>(value: 1)
let erasedContainer = container as Container<Any>

container = erasedContainer as Container<Int> // ❌ error: Container<Any> is not a subtype of Container<Int>

I’d just be more comfortable starting with a more limited feature, and expanding from there.

1 Like

Here’s a thought:

One reason (perhaps the most common reason?) that people sometimes want implicit conversions is when a type T logically represents a value of another type U, just with a different API surface.

In that case, the type T acts as a high-level wrapper around a low-level raw value of type U, and it makes sense that one might want to pass a value of type T into a function that accepts a parameter of type U. After all, a T “conceptually is” a U.

And Swift already has a way to describe types with raw values, namely the RawRepresentable protocol. Perhaps we might consider ways in which RawRepresentable could facilitate this functionality.

One option would be to create a new protocol, say RawConvertible: RawRepresentable, with no additional requirements, that the compiler understands to mean, “Values of conforming types can be converted to their rawValue as needed.”
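
A minimal sketch of how that could read, assuming a hypothetical RawConvertible marker protocol and a made-up UserID type:

protocol RawConvertible: RawRepresentable {}

struct UserID: RawConvertible {
    var rawValue: Int
}

func lookUp(_ id: Int) { /* ... */ }

let id = UserID(rawValue: 42)
lookUp(id.rawValue) // what you write today
lookUp(id)          // hypothetically accepted: the compiler inserts .rawValue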

9 Likes

In my case, what I want isn't implicit conversions specifically so much as user-defined subtyping for structs/enums, so that I can define my own existential wrappers (e.g. an enum in place of a protocol existential) or non-reference-counted type hierarchies (e.g. integer-based handles represented as structs rather than classes) and still have generics and dynamic casts work as expected.

That's why implicit conversions without subtyping aren't particularly interesting to me – I'm not especially concerned about removing explicit conversions where I know the types involved. Rather, I'm interested in eliminating a class of problems where I've forgotten to cast manually between types that are in fact related but that the compiler doesn't know about; being able to pass the subtypes without explicit casts is just a nice bonus.
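
A small made-up example of the kind of boilerplate this describes, where an enum AnyShape stands in for a protocol existential:

struct Circle { var radius: Double }
struct Square { var side: Double }

enum AnyShape {
    case circle(Circle)
    case square(Square)
}

func draw(_ shape: AnyShape) { /* ... */ }

let c = Circle(radius: 1)
draw(.circle(c)) // today: the wrapping must be spelled out at every call site
// With user-defined subtyping, draw(c) could work, and a dynamic cast such as
// (c as Any) as? AnyShape could succeed, just as it would for a class hierarchy.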

2 Likes

So you want something similar to AnyHashable — implicit conversion from some Hashable value to AnyHashable, and the conformance AnyHashable: Hashable. I think this is a discussion worth having and, as you mention, very different from implicit conversions.

One solution to your problem would be to add a new attribute for protocols:

@typeEraser(AnyHashable)
public protocol Hashable { ... }

extension AnyHashable {
  public init<Value: Hashable>(erasing value: Value) { ... }
}

It's important to note that this should only be used for cases where extending the existential type (should it be allowed in the future) won't yield the desired type-erasing functionality.
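
For comparison, this is the special case the compiler already performs for AnyHashable today, which an attribute along these lines would generalise:

// Hashable values are implicitly erased to AnyHashable:
let values: [AnyHashable] = [1, "two", 3.0]
let lookup: [AnyHashable: String] = [1: "one", "two": "2"]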

1 Like