Embedded Swift

In order for a and c to be valid, somePointer would need to be a pointer to an array of at least 0x1'0000'0000'0000'0000 (that is, 2^64) elements, which isn't possible if size_t is 64 bits.

These are both undefined behavior; see N1256 6.5.6, paragraph 8 (pointer arithmetic must stay within the same array object, or at most one past its end).

It’s not clear that would affect anything. The UnsafePointer types are treated specially by the compiler, so a call to advanced(by:) doesn’t actually become a method call. But even if it were, the type of the argument doesn’t necessarily affect the types of the intermediate evaluations the same way it does in C.

Array conforms to Collection with an Int index type, which means that Array's valid indices must all fall within the range of Int (so that i < self.endIndex for every valid index i). In other words, the extra range made available by a type like UInt is by definition useless to Array, so there's just no point in an overload.

I think Array is a red herring due to a mistake I made above, comparing C’s array subscripting syntax to Swift’s Array instead of UnsafePointer. UnsafePointer’s subscript also takes an Int, but UnsafePointer doesn’t conform to Collection.

The standard operations on Swift's pointers are meant for working within a single object and try to maintain basic provenance / aliasing restrictions that are compatible with still doing memory optimizations. I don't think that's at all unreasonable, even if it makes implementing an allocator more challenging.

I agree that Swift should offer operations that allow correct allocator implementation. The trouble is that I'm pretty sure we'll have to figure out what those operations look like on our own. Our predecessor languages all famously punt on that — there's no way to square the ability to implement an allocator in C with the ability to still do basic memory optimizations using rules like "different objects don't alias", and yet C compilers still regularly do those basic memory optimizations. So instead we've just been muddling through with the same somewhat underspecified and probably inadequate rules that C has around casting to integers.

Good to know.

I tried this on a few different C playgrounds
#include <stdio.h>
#include <stdlib.h>
#include <assert.h>

int main(void) {
    assert(sizeof(void*) == 8);
    char bytes[] = { 3, 4 };
    char *p = bytes + 1;
    char b = p[-1];                      /* defined: bytes[0] is inside the array */
    char a = p[0xFFFFFFFFFFFFFFFF];      /* UB: index far outside the array */
    unsigned long u = 0xFFFFFFFFFFFFFFFF;
    char c = p[u];                       /* UB for the same reason */
    printf("%d %d %d\n", b, a, c);
    assert(a == b && a == c);
    printf("done\n");
    return 0;
}

and it worked everywhere. I understand that since this is UB there might be a 64-bit environment in the wild where this code doesn't work correctly; I just haven't found one yet.

From a two's-complement point of view, adding FF...FF is equivalent to subtracting 1, and adding an integer to a pointer compiles down to the same machine arithmetic as adding two integers. That explains why it has worked everywhere for me so far: the compiler would have to do something extra for it to fail, such as inserting an overflow check, treating certain pointer bits specially, asserting that the pointer stays within a known allocation across the whole range from the current value to the incremented value, or using ones'-complement arithmetic.

Bear in mind that the nature of UB means that these "64-bit environment[s] in the wild" include future versions of your current environment, as well as versions of the current environment where the compilation context changes enough from your tests to alter the behaviour.

UB promises you nothing. The fact that it works in the playground provides you no protection.

This might be a stupid question but I don’t think I saw anything in the vision document: will async functions and other concurrency features be available in embedded mode? I know concurrency has a lot of moving parts but environments like embedded Wasm on the Web could provide their own executors that rely on built-in JS functions. Atomics are also supported AFAIK so we should also be able to implement a simple mutex.

Is this something that could eventually be fixed, or is it too niche of a use case? 🙂

It appears to be mentioned as partially implemented here: swift/docs/EmbeddedSwift/EmbeddedSwiftStatus.md at main · apple/swift · GitHub

I'm curious what some of these mean, though: some features are marked "No" or "Intentionally unsupported", but existentials, for example, are "No, currently disallowed". What's the difference between "No, currently disallowed" and plain "No"? I don't see it explained anywhere.

I'm getting weird false positive errors within Xcode even though my code compiles and runs correctly:

Example
public extension Drawable {
    static func == (lhs: Self, rhs: Self) -> Bool {
        guard lhs.width == rhs.width && lhs.height == rhs.height else { return false }
        for x in 0..<lhs.width {
            for y in 0..<lhs.height {
                guard lhs[x, y] == rhs[x, y] else { return false }
            }
        }
        return true
    }
    
    static func == (lhs: Self, rhs: some Drawable) -> Bool {
        guard lhs.width == rhs.width && lhs.height == rhs.height else { return false }
        for x in 0..<lhs.width {
            for y in 0..<lhs.height {
                guard lhs[x, y] == .init(rhs[x, y]) else { return false }
            }
        }
        return true
    }
}

In the example above the most indented line (line 6) is getting the error Cannot use metatype of type 'RGBA' in embedded Swift. What does that mean? All that's used on this line is the Equatable implementation of RGBA, which is derived automatically by Swift.

The error only starts to show up when I make Renderer (which renders drawables) itself a Drawable:

extension Renderer: Drawable {
    public subscript(x: Int, y: Int) -> RGBA { self.display[x, y] }
}

The example code does not cause the error by itself, so below are some relevant (incomplete) definitions copied from my code:

Renderer and Color (and RGBA)
public struct Renderer/*: ~Copyable */{
    internal var display: Image<RGBA>
    
    public var width: Int { self.display.width }
    public var height: Int { self.display.height }
    
    internal init(width: Int, height: Int) {
        self.display = .init(width: width, height: height, color: .black)
    }
    
    public mutating func resize(width: Int, height: Int) {
        self.display = .init(width: width, height: height, color: .black)
    }
    
    public mutating func clear(with color: some Color = RGBA.black) {
        for x in 0..<self.display.width {
            for y in 0..<self.display.height {
                self.pixel(x: x, y: y, color: color)
            }
        }
    }
    
    public mutating func pixel(x: Int, y: Int, color: some Color = RGBA.white) {
        if x < 0 || y < 0 || x >= display.width || y >= display.height { return }
        self.display[x, y] = .init(color)
    }
    
    public mutating func draw(_ drawable: some Drawable, x: Int, y: Int) {
        for ix in 0..<drawable.width {
            for iy in 0..<drawable.height {
                // TODO(!) Handle opacity with blending modes. The blending api needs design.
                let color = drawable[ix, iy]
                if color.a == 255 {
                    self.pixel(x: ix + x, y: iy + y, color: color)
                }
            }
        }
    }
   ...
}

public protocol Color: Equatable {
    var r: UInt8 { get }
    var g: UInt8 { get }
    var b: UInt8 { get }
    var a: UInt8 { get }
    
    init(r: UInt8, g: UInt8, b: UInt8, a: UInt8)
    init(luminosity: UInt8, a: UInt8)
}

public extension Color {
    /// This initializer makes all color layouts interchangeable at compile time as long
    /// as they are representable by 8 bit rgba values.
    init(_ other: some Color) {
        self.init(r: other.r, g: other.g, b: other.b, a: other.a)
    }
}

public struct RGBA: Color {
    public let r, g, b, a: UInt8
    
    public init(r: UInt8, g: UInt8, b: UInt8, a: UInt8 = 255) {
        self.r = r
        self.g = g
        self.b = b
        self.a = a
    }
    
    public init(luminosity: UInt8, a: UInt8 = 255) {
        self.r = luminosity
        self.g = luminosity
        self.b = luminosity
        self.a = a
    }
    ...
}
Drawable
public protocol Drawable<Layout>: Equatable {
    associatedtype Layout: Color
    var width: Int { get }
    var height: Int { get }
    subscript(x: Int, y: Int) -> Layout { get }
}

public extension Drawable {
    func slice(x: Int, y: Int, width: Int, height: Int) -> DrawableSlice<Self> {
        .init(self, x: x, y: y, width: width, height: height)
    }
    
    func grid(itemWidth: Int, itemHeight: Int) -> DrawableGrid<Self> {
        .init(self, itemWidth: itemWidth, itemHeight: itemHeight)
    }
    
    func colorMap<C: Color>(map: @escaping (C) -> C) -> ColorMap<Self, C> { .init(self, map: map) }
    
    func colorMap<C: Color>(_ existing: C, to new: C) -> ColorMap<Self, C> {
        self.colorMap { $0 == existing ? new : $0 }
    }
    
    func flatten() -> Image<Layout> { .init(self) }
}

I am confused why this even happens; I don't see how these are connected.

Would you be able to write a reproducer for this and file a github issue?

@kubamracek How's it going? I'm running into an odd compilation error where -import-bridging-header is not recognized while attempting to build the ESP32 example: swift-embedded-examples/esp32-led-strip-sdk/README.md at 69dc23e718c5b20797daec17d4754cd42bb21c24 · apple/swift-embedded-examples · GitHub

I'm using the toolchain swift-6.0-DEVELOPMENT-SNAPSHOT-2024-04-30-a.xctoolchain

Here's a snippet of the output

<unknown>:0: error: unknown argument: '-import-bridging-header'
[7/948] Generating ../../partition_table/partition-table.bin
Partition table binary generated. Contents:

Do you have any suggestion on how to remedy this issue?
Thanks

-import-bridging-header was added pretty recently, so the error means that your CMake build is not picking up the downloaded toolchain and is probably falling back to your Xcode's Swift compiler. Did you export the TOOLCHAINS env var before running the CMake configuration steps?

Also (though I don't think that's the problem here), I recommend using the "trunk development / main" toolchain instead of a 6.0 toolchain. There are more features and bugfixes related to Embedded Swift in main compared to 6.0.

> Did you export the TOOLCHAINS env var before running the CMake configuration steps?

Yes I did. I even added --toolchain to xcrun with xcrun --toolchain org.swift.59202405011a in CMakeLists.txt, and the error still remains.

I am using the snapshot from May 1: swift-DEVELOPMENT-SNAPSHOT-2024-05-01-a.xctoolchain

When I manually run the command xcrun --toolchain org.swift.59202405011a swift --version, it returns the default version used by Xcode.

I have tried the steps on two different machines and both produce the same issue. Both machines are running macOS 14.4.1.

Could I be missing a step in getting the toolchains to be switched?

What's the output of xcode-select -p? If it's pointing to the command line tools, I find those have trouble finding development snapshot toolchains; you have to point to an actual Xcode installation with xcode-select -s.

swiftlang-5.10.0.13

This is not a Swift compiler from a downloaded toolchain (it should say 6.0-dev or similar). Can you run xcrun ... -f swift to see the path it's picking up?

Here's the output for sudo xcrun --toolchain "org.swift.59202405011a" -f swift

It is picking up Xcode's version. Same output on the second machine.

/Library/Developer/CommandLineTools/usr/bin/swift is not Xcode's version; it's the command-line tools version.

Please make sure that xcode-select -p points to your Xcode installation, and update it with sudo xcode-select -s <Your_Xcode_path.app>/Contents/Developer if that's not the case; that should fix the issue.

My apologies; I meant to say the CommandLineTools version.