Another Double vs CGFloat glitch

Getting "Ambiguous use of operator '/'" in this fragment. I believe this is another glitch in the Double vs CGFloat interop support.

import SwiftUI

struct ContentView: View {
    private let x: CGFloat = 10 // works with Double
    var body: some View {
        Text("Hello, world!")
            .opacity((x - 100) / 100) // works with / 100.0
        // Error: Ambiguous use of operator '/'
    }
}

@main
struct TheApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

This is not a glitch. 100 is not implicitly converted to CGFloat because an integer literal defaults to Int, so it behaves just like an Int or Float variable would; that's why it works with 100.0 - the default type for a floating-point literal is Double. This was a deliberate choice, because supporting non-floating-point literals would add yet more possibility for ambiguity.
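
A minimal illustration of the literal defaults in play (the variable names here are just for illustration):

let i = 100   // integer literal: defaults to Int
let d = 100.0 // floating-point literal: defaults to Double

let x: CGFloat = 10
let y = (x - 100.0) / 100.0 // both 100.0 literals are inferred as CGFloat from context, so y is CGFloat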

Hmm, I didn't realize that not supporting integer literals was a deliberate choice; I don't recall that choice being discussed as part of the proposal. In this case it adds to rather than removes ambiguity. Was the inclusion of support for integer literals ever prototyped?

Yes, it was prototyped and tested, and it doesn't create ambiguity here either. Supporting non-floating-point literals for CGFloat would instead further exacerbate the problem here, because the solver would find solutions with (Double, Double) and (CGFloat, CGFloat), as well as generic solutions obtained by converting either x or 100 in either position, which means more solutions instead of fewer.

Right. I'm saying that the choice not to support integer literals here creates ambiguity, and supporting integer literals would remove it. It goes without saying that the tradeoff would be more solutions, but in your prototyping, were there unworkably many in terms of performance impact?

And I'm saying that this is not exactly true, because opacity expects a Double according to the SwiftUI documentation, so:

import Foundation

func test(_: Double) {
}

var x: CGFloat = 100
test((x / 100) / 100)

This type-checks because there is a /(CGFloat, CGFloat) -> CGFloat overload, and 100 can be typed as CGFloat directly via literal inference, which is always preferred over implicit conversion. So there is only one conversion required here: test(Double((x / 100) / 100)). The problem most likely comes from other overloads that get imported, which probably allow heterogeneous operators; allowing integer literals to be implicitly converted in such cases would not fix the problem but exacerbate it.
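
For the original snippet, either of the following spellings should type-check under that reasoning (the view name is just for illustration):

import SwiftUI

struct WorkaroundView: View {
    private let x: CGFloat = 10
    var body: some View {
        Text("Hello, world!")
            .opacity((x - 100) / 100.0)     // literals are inferred as CGFloat; the CGFloat result is implicitly converted to Double
            .opacity(Double(x - 100) / 100) // convert to Double first, so the division is Double / Double
    }
}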

Yes, it would fail multiple projects in the compatibility suite with relatively simple operator combinations, because of the number of combinations the solver would have to try.


Yikes, good to know.

Got it. That test((x / 100) / 100) works is good enough for me! This is what I had understood to be the case; I had interpreted your reply to mean that it was explicitly decided that this should not be supported.


No worries, sorry for the confusion!

I do not understand the logic here.

let x: CGFloat = 10
.opacity((x - 100) / 100)

"x" is CGFloat -> so the type of "(x - 100)" expression should be inferred as "CGFloat" -> so the type of "((x - 100) / 100)" expression should be inferred as "CGFloat" -> so there is no ambiguity as the resulting type is "CGFloat". No?

You would be right if opacity wanted a CGFloat, but in fact, as Pavel says, it wants a Double.
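
For reference, the SwiftUI declaration is (roughly):

func opacity(_ opacity: Double) -> some View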


Count me among the surprised.

To be honest, I think I've already hit this problem in my own code. But my C/ObjC reflexes quickly had me append the .0 suffix and move on: the glitch did not rise to my full consciousness.

But not all Swift programmers come with such muscle memory. Some of them may not even know that appending the .0 suffix is the solution, because they are used to Swift integer literals that "just work".

Aren't literals that "just work" part of Swift's signature? Shouldn't this glitch be reported as a bug?


Then the naïve logic continues and suggests: "Look, the method parameter wants a "Double" and the passed value is a "CGFloat" -> so this is the place where the magic "CGFloat to Double" conversion happens -> no ambiguity".

Interestingly it accepts both "Double" and "CGFloat" (but not "Int"). Can this fact be at play here?

import SwiftUI

struct ContentView: View {
    var body: some View {
        let int = 0
        let double = 0.0
        let cgfloat: CGFloat = 0
        
        Text("Hello, world!")
            .opacity(0)
            .opacity(0.0)
            .opacity(double)
            .opacity(cgfloat) // ok
            // .opacity(int) // Error: Cannot convert value of type 'Int' to expected argument type 'Double'
    }
}

Given that the method accepts both "Double" and "CGFloat", is that the source of the ambiguity? "Look, I don't know what to do: whether to call an "opacity(_ v: CGFloat)" version directly, given that the argument is a "CGFloat", or to convert from "CGFloat" to "Double" first and call the "opacity(_ v: Double)" variant."

You should be able to use Double in declarations which previously required CGFloat; that was one of the ergonomic improvements the proposal was designed for.
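
For the original example, a minimal sketch of that approach (assuming nothing else forces the stored property to be CGFloat) is to declare it as Double and let the implicit conversion kick in only where CGFloat is actually required:

import SwiftUI

struct ContentView: View {
    private let x: Double = 10 // Double instead of CGFloat
    var body: some View {
        Text("Hello, world!")
            .opacity((x - 100) / 100) // Double arithmetic throughout, no ambiguity
            .padding(x)               // padding takes CGFloat; the Double argument is implicitly converted
    }
}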