To save some reading: I'm building an app that calculates 3 different values based on the user input and then displays which of those 3 is the lowest using the min function. Depending on the input, some of these values may be over 9,999. When this happens, the min function stops working properly; it seems to be ignoring one of the digits in values of 10,000 and greater. As an example, it would return that 12000 is less than 2000. I'm sure the fix is small, but I'm new here, so any help would be great. Thanks!
Can you post the code you’re using?
For the min function? It's:
let answer = min(paper, HotAirDryer, HighSpeedDryer)
If you're using NumberFormatter, the commas may be interpreted as a decimal separator, depending on the Locale.
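For example, just as a sketch of what I mean, with decimal style and en_US vs. de_DE as the locales:

import Foundation

let input = "1,234"

let us = NumberFormatter()
us.numberStyle = .decimal
us.locale = Locale(identifier: "en_US")
print(us.number(from: input) ?? "nil") // comma treated as a grouping separator -> 1234

let de = NumberFormatter()
de.numberStyle = .decimal
de.locale = Locale(identifier: "de_DE")
print(de.number(from: input) ?? "nil") // comma treated as the decimal separator -> 1.234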
I'm not using any commas; I added them here to make it easier to read.
The problem is not min; the problem is that you are not passing the numbers you think you are to min.
Put a breakpoint at the min call, and inspect the actual values that are passed to it. Figure out why those values are not what you thought they should be.
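For example, something like this right before the call (using the names from your min call; the point is just to see the actual types and values):

print(type(of: paper), paper)
print(type(of: HotAirDryer), HotAirDryer)
print(type(of: HighSpeedDryer), HighSpeedDryer)
let answer = min(paper, HotAirDryer, HighSpeedDryer)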
Can you post where paper, HotAirDryer, and HighSpeedDryer are defined so we can reason about the types?
Sure. The code is a bit messy right now, but it's working otherwise. I can see the calculations are getting the proper values by connecting them to a label which displays the values. Anyway, here is the code:
if sender.isSelected && PaperView.isHidden == false && HotView.isHidden == false && SpeedView.isHidden == false && AdvancedMode.isHidden == false && PaperPercentView.isHidden == false && TowelEnergyView.isHidden == false && RenewablePercentView.isHidden == false {
let a = Double(1426.88)
let f = Double(1141.37)
let g = Double(21.14)
let h = Double(PaperRenewableResults.text!) //this is a user input text field
let o = Double(h! / 100)
let i = Double(46.60)
let p = Double(0.90)
let j = Double(1.00)
let k = Double(PaperPercentResults.text!)! / 100 //this is a user input text field
let l = Double(425.27)
let m = Int(TowelSliderResults.value) //this is a user input slider
let e = ((j-k)*l)
let d = i-((i-p)*o)
let c = f-((f-g)*o)
let b = c+d+e
let n = Int(a+b)
let paper = String(n*m)
let bb = Double(11.9)
let ff = Double(DryerEnergy.text!)! / 100 //this is a user input text field
let dd = 1 - ff
let ee = Double(179.90)
let gg = Double(HotSeconds.text!)! //this is a user input slider
let cc = dd*ee
let aa = bb+cc
let HotAirDryer = String(Int(aa*gg))
let bbb = Double(24.40)
let fff = Double(DryerEnergy.text!)! / 100
let ddd = 1 - fff
let eee = Double(121.60)
let ggg = Double(SpeedSeconds.text!)! //this is a user input slider
let ccc = ddd*eee
let aaa = bbb+ccc
let HighSpeedDryer = String(Int(aaa*ggg))
let answer = min(paper, HotAirDryer, HighSpeedDryer)
Why are you converting the values to Strings to use in the min function?
I just removed the string conversion and now all seems to be working properly! I had needed it earlier on but no longer need it apparently. Thank you very much for your help.
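In case it helps anyone else, the relevant lines now look roughly like this (same names as above, just without the String conversion):

let paper = n * m // Int
let HotAirDryer = Int(aa * gg) // Int
let HighSpeedDryer = Int(aaa * ggg) // Int
let answer = min(paper, HotAirDryer, HighSpeedDryer) // compares numbers, not strings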
String comparison uses a "lexicographic" ordering that works character-by-character starting from the front. So "120000000" is less than "2" just because "1" is less than "2", and the rest of the string doesn't matter. This is a simple rule that works consistently for all strings, but as you can see, it can differ from numeric comparison when the characters of the string happen to look like a number. Generally you avoid problems like this by always working with numbers using numeric types until you specifically need to format them into text, e.g. to display them to a user.
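A quick way to see the difference:

print("12000" < "2000") // true: character-by-character, "1" < "2"
print(12000 < 2000) // false: numeric comparison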
Thanks John. I was just digging through the documentation to reference how Comparable conformance worked for StringProtocol-conforming types, to link for the OP and explain why this was wonky. Now it doesn't matter that I couldn't find it.
@Michael_Ilseman This seems like an unfortunate oversight in the string documentation. It's very clear that string comparison respects Unicode canonical equivalence, but it never actually says how strings are ordered!
I noticed that in my search. It's one of those tribal-knowledge situations, I think, and should be more explicit. That documentation is closed-source, so someone would need to file a bug with Apple, right?
It's how non-locale-aware string comparison works essentially everywhere (except a few languages like PHP that make extremely ill-advised attempts to make string and numeric comparison consistent), but yeah, that doesn't mean we don't have a responsibility to document it ourselves.
I'll take care of the bug.
That's intentional, or at least it was originally: you couldn't assume a stable order between different versions of the stdlib or even the same version across platforms. For example, in the pre-Swift-4.1 days, you actually had different ordering on Linux than Darwin. Since then, we've ordered based on scalar values in NFC on all platforms, but in theory that could be changed. E.g. scalar values in NFD order differently than in NFC.
Ordering of String needs to be fast and obey canonical equivalence for use in things like sorted data structures, but the particular order is arbitrary and meaningless. That's why it was changed from UCA (slow) to NFC (fast) scalar value order (which is equivalent to NFC UTF-8 code unit order) way back in Swift 4.
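For example, just to illustrate the canonical-equivalence part (not any particular ordering guarantee):

let composed = "\u{00E9}" // "é" as a single scalar
let decomposed = "e\u{0301}" // "e" followed by a combining acute accent
print(composed == decomposed) // true: canonically equivalent strings compare equal
print(composed < "f", decomposed < "f") // both spellings order the same way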
If ordering is presented to a user or for any kind of non-machine-consumption purpose, you should use a higher level framework. Linguistic ordering varies dramatically across languages and even across applications (e.g. German phonebook order is different than German dictionary order).
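For example, a sketch using Foundation, where localizedStandardCompare gives Finder-style, user-facing ordering:

import Foundation

let names = ["résumé 10", "résumé 2", "Zebra"]

// Arbitrary-but-fast stdlib ordering, fine for sorted data structures:
let machineOrder = names.sorted()

// Locale-aware ordering for anything a user will actually see:
let humanOrder = names.sorted { $0.localizedStandardCompare($1) == .orderedAscending }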
With @Alejandro's exciting work on native normalizations, we might start exposing normalization-oriented API, including "give me the fast one". At that point we might formalize the ordering.