I am trying to bit-shift ARGB components into a ColorInt in Swift, the same way I did in Java.
Java: The code below prints -16777216 for opaque black.

int a = 255;
int r = 0;
int g = 0;
int b = 0;
int hcol = 0;
hcol |= (a & 0xFF) << 24; // alpha in the top byte
hcol |= (r & 0xFF) << 16; // red
hcol |= (g & 0xFF) << 8;  // green
hcol |= b & 0xFF;         // blue
System.out.println(hcol); // prints -16777216
Swift: As far as I can tell this is the exact same code as the Java example, but it only ever returns positive numbers: 0 for black, and 16777215 (not a negative number) for white.
let a = 255
let r = 0
let g = 0
let b = 0
var colInt = 0
//colInt |= (a & 0xFF) << 24 // adding this makes the value even bigger
colInt |= (r & 0xFF) << 16
colInt |= (g & 0xFF) << 8
colInt |= (b & 0xFF)
print("\(colInt)")
As you can see, the Swift example has the alpha byte commented out. If I add it back in, colInt just becomes an even larger positive number, higher than the maximum value of the 24-bit RGB range.
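To illustrate, here is a minimal sketch of what I observe when the alpha shift is enabled (this assumes Swift's default Int is 64 bits wide, as it is on modern platforms):

let a2 = 255, r2 = 0, g2 = 0, b2 = 0
var wide = 0
wide |= (a2 & 0xFF) << 24 // with a 64-bit Int, this never reaches a sign bit
wide |= (r2 & 0xFF) << 16
wide |= (g2 & 0xFF) << 8
wide |= b2 & 0xFF
print(wide) // 4278190080 instead of Java's -16777216

// Reinterpreting the low 32 bits as a signed Int32 does reproduce
// the Java value, which makes me suspect the integer width is the issue:
let narrow = Int32(bitPattern: UInt32(truncatingIfNeeded: wide))
print(narrow) // -16777216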