var result: Float = 0.0
result = float * integer * uint8 + double
// here, all operands should be implicitly promoted to Double before the complete expression evaluation.
You would have this produce different results than:
let temp = float * integer * uint8
“temp” is now a Float
result = temp + double
“temp" being implicitly converted to a Double before expression evaluation
(assuming the var “result” is still the Float declared in my example)
after conversion, the expression would implicitly be, type-wise:
float = Float(double + double)
Apart perhaps from the small imprecision inherent in floating point, I don’t see much difference...
am I missing something?
That would be extremely surprising to many unsuspecting users.
Don’t get me wrong; I *really want* implicit promotions (I proposed one scheme for them way back when Swift was first unveiled publicly). But there’s a lot more subtlety around them than it seems (for example the C and C++ implicit promotion rules can easily be described on a half-sheet of paper, but are the source of *innumerable* bugs). I would rather have no implicit promotions than half-baked implicit promotions.
I C what you mean. Yes, implicit conversion can produce unexpected results.
It has to be used with common sense, a lot of debugging, or both :o)
But that’s the case with many programming language features...
(there were no debuggers ca. 1980, and IDEs were pure luxury :o)
TedvG
···
On 19. Jun 2017, at 19:58, Stephen Canon <scanon@apple.com> wrote:
On Jun 19, 2017, at 11:46 AM, Ted F.A. van Gaalen via swift-evolution <swift-evolution@swift.org <mailto:swift-evolution@swift.org>> wrote:
if it were possible, I would only use one floating point type
in my app, however FP conversions are needed often,
because of heavy usage of library functions
e.g. SceneKit that all work with different floating point types..
Programmers should be aware of the implications of conversions
e.g. learn and test these in playground if not completely understood...
TedvG
···
On 19. Jun 2017, at 22:44, John McCall <rjmccall@apple.com> wrote:
On Jun 19, 2017, at 1:58 PM, Stephen Canon via swift-evolution <swift-evolution@swift.org <mailto:swift-evolution@swift.org>> wrote:
On Jun 19, 2017, at 11:46 AM, Ted F.A. van Gaalen via swift-evolution <swift-evolution@swift.org <mailto:swift-evolution@swift.org>> wrote:
var result: Float = 0.0
result = float * integer * uint8 + double
// here, all operands should be implicitly promoted to Double before the complete expression evaluation.
You would have this produce different results than:
let temp = float * integer * uint8
result = temp + double
That would be extremely surprising to many unsuspecting users.
Don’t get me wrong; I *really want* implicit promotions (I proposed one scheme for them way back when Swift was first unveiled publicly).
I don't! At least not for floating point. It is important for both reliable behavior and performance that programmers understand and minimize the conversions they do between different floating-point types.
@Stephen, @John:
Interesting to learn about the low-level things, thank you,
but efficient or not, somewhere along the way
conversions are simply unavoidable,
whether explicit or implicit,
regardless of their performance...
Theoretically, doing:
aDouble = Double(aFloat)
should have the same performance as
aDouble = aFloat //implicitly
The compiler simply generates the same code in both cases, I assume.
Implicit or explicit? Thinking further, it seems to me that it doesn’t
matter because the programmer should be equally aware of the
operation, whether it is an explicit or an implicit conversion.
(this seems not so difficult when assisted by verbose compiler warnings during editing)
So, thinking along this line (that is, the programmer has to make
almost the same judgment effort in both cases anyway), it seems
logically consistent that explicit conversions are in fact superfluous,
pointless, and could thus be removed from the language, with a possible
exception for explicit conversion functions that take extra parameters
to influence the conversion, such as its precision, magnitude, rounding, etc.
e.g:
anInt = Int(aFloat, truncation: .roundingUp)
?
TedvG
···
On 20. Jun 2017, at 00:08, Stephen Canon <scanon@apple.com> wrote:
On Jun 19, 2017, at 5:43 PM, David Sweeris <davesweeris@mac.com <mailto:davesweeris@mac.com>> wrote:
Sent from my iPhone
On Jun 19, 2017, at 13:44, John McCall via swift-evolution <swift-evolution@swift.org <mailto:swift-evolution@swift.org>> wrote:
On Jun 19, 2017, at 1:58 PM, Stephen Canon via swift-evolution <swift-evolution@swift.org <mailto:swift-evolution@swift.org>> wrote:
On Jun 19, 2017, at 11:46 AM, Ted F.A. van Gaalen via swift-evolution <swift-evolution@swift.org <mailto:swift-evolution@swift.org>> wrote:
var result: Float = 0.0
result = float * integer * uint8 + double
// here, all operands should be implicitly promoted to Double before the complete expression evaluation.
You would have this produce different results than:
let temp = float * integer * uint8
result = temp + double
That would be extremely surprising to many unsuspecting users.
Don’t get me wrong; I *really want* implicit promotions (I proposed one scheme for them way back when Swift was first unveiled publicly).
I don't! At least not for floating point. It is important for both reliable behavior and performance that programmers understand and minimize the conversions they do between different floating-point types.
How expensive is it?
On most contemporary hardware, it’s comparable to a floating-point add or multiply. On current generation Intel, it’s actually a little bit more expensive than that. Not catastrophic, but expensive enough that you are throwing away half or more of your performance if you incur spurious conversions on every operation.
This is really common in C and C++, where a naked floating-point literal like 1.2 has type double:
float x = 1.0f;
x *= 1.2; // x is converted to double, multiplied, then converted back to float
Instead of a bare multiplication (current generation x86 hardware: 1 µop and 4 cycles latency), this produces a convert-to-double, a multiplication, and a convert-to-float (5 µops and 14 cycles latency, per Agner Fog).