```
let x = 10
let y = 2.5

let result = x * y
print(result)
```

A. there’s no issue, the result is 25

B. Is the issue that x is a double and y is an int?

C. x is an int and y is a double; the code won’t compile.

D. There is no issue, the result is 20.

I personally think it’s the second one, but some help figuring out the correct answer (if that’s not it) and an explanation would be super helpful. Thanks, guys!

This seems to be formatted incorrectly, is there a missing `.` between `2` and `5`?

Can you wrap your code in "```" like so:

````
```
let x = 10
let y = 25
```
````

?

Yes, sorry I was missing the decimal place.

So then yes, your intuition is correct -- Swift will infer `2.5` as a `Double` if no other type is provided, and `10` as an `Int` if no other type is provided.
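To see the inference concretely, here's a minimal sketch using `type(of:)` to print what the compiler inferred for each literal:

```swift
let x = 10   // integer literal with no annotation, inferred as Int
let y = 2.5  // floating-point literal with no annotation, inferred as Double

print(type(of: x))  // Int
print(type(of: y))  // Double
```

Because `Int` and `Double` are distinct types and Swift never implicitly converts between them, `x * y` has no matching `*` overload and the compile fails, which is why option B is the right answer.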

You could provide a type for `x` explicitly like so:

```
let x: Double = 10
let y = 2.5
let result = x * y
print(result)
```

And even though 10 doesn't have a decimal, the compiler is smart enough to create a `Double` from an "integer literal" value.
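Another option (just a sketch of the usual alternative, not the only fix) is to leave `x` as an `Int` and convert it explicitly at the point of use with the `Double` initializer:

```swift
let x = 10   // stays an Int
let y = 2.5  // Double

// Double(x) creates a Double from the Int, so both operands
// of * are now the same type and the expression type-checks.
let result = Double(x) * y
print(result)  // 25.0
```

Which one you pick is mostly a style choice: annotate the declaration if `x` is conceptually a floating-point value, or convert at the use site if it's genuinely an integer elsewhere in the program.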

For more info on types and type inference for literals, see the section of the Swift book called "Type Safety and Type Inference" on this page:

https://docs.swift.org/swift-book/LanguageGuide/TheBasics.html
