I have my own type named `Color`, so instead of disambiguating with `SwiftUI.Color`, in a context where the `SwiftUI.Color` type is already known I tried just `.init(red:green:blue:)`:
```swift
import SwiftUI

let a1 = LinearGradient(
    gradient: Gradient(colors: [
        Color(red: 3 / 255, green: 251 / 255, blue: 232 / 255),      // Calling Color(...), the compiler can infer the parameter type
        .init(red: 215 / 255, green: 3 / 255, blue: 252 / 255),      // Calling .init(...), it cannot infer that the parameters are Double
        // ^^^ error: cannot convert value of type 'Int' to expected argument type 'Double'
        .init(red: 252.0 / 255, green: 202.0 / 255, blue: 3.0 / 255) // works only with explicit Double literals
    ]),
    startPoint: .topLeading, endPoint: .bottomTrailing)
```
With `.init(...)`, the compiler should know that the `red:green:blue:` parameters are `Double`, but it appears it cannot tell.
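
For comparison, naming the type explicitly at the call site should let the integer literals be inferred as `Double`, either fully qualified or via a typealias (the alias name `SUColor` below is just an arbitrary choice, not anything from SwiftUI):

```swift
import SwiftUI

// Fully qualifying the type restores inference: the expected Double
// parameter types flow into the integer-literal division.
let c1 = SwiftUI.Color(red: 215 / 255, green: 3 / 255, blue: 252 / 255)

// A typealias sidesteps the clash with my own Color type entirely.
typealias SUColor = SwiftUI.Color
let c2 = SUColor(red: 215 / 255, green: 3 / 255, blue: 252 / 255)
```

So the problem seems specific to the implicit member syntax `.init(...)`, not to the literal division itself.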