I would like to create a circular reference

I want to create the following:

import Foundation

// swiftc -parse-as-library myMain.swift

class Global
{
   let i    :Int
   let next :Global?
   
   init( _ i:Int, _ next:Global? )
   {
      self.i      = i
      self.next   = next
   }

}

let g = Global( 1, h ) // Swift complains about circular reference here
let h = Global(2, g)

@main
class MainProgram
{
    static func main()
    {
        print("g.i=\(g.i), g.next!.i=\(g.next!.i), h.i=\(h.i), h.next!.i=\(h.next!.i)")
    }
}

Swift is being nice by warning me that I'm creating a circular reference, but what if this is really what I want to do (g's next is h, and h's next is g)?

Note that the next field is immutable (important here). I realize this is probably messing with ARC. I tried "weak var next", but that didn't solve anything, and besides, I really want this field to be immutable. If mutability were okay, I could assign to next after initialization, for example:

class Global
{
   let i    :Int
   weak var next :Global?
   
   init( _ i:Int, _ next:Global? )
   {
      self.i      = i
      self.next   = next
   }

}

let g = Global( 1, h )
let h = Global(2, nil)

@main
class MainProgram
{
    static func main()
    {
        h.next = g
        print("g.i=\(g.i), g.next!.i=\(g.next!.i), h.i=\(h.i), h.next!.i=\(h.next!.i)")
    }
}

But I'd still prefer that the next field be immutable.

How do I accomplish this?

I'm pretty sure that if you make the second argument an autoclosure and make h an implicitly unwrapped optional, you could get it to work in Swift 5. That falls down with the Swift 6 changes, if I understand correctly.
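
For reference, here is roughly what that could look like (an untested sketch; nextThunk is an illustrative name, and the explicit type annotations stand in for the implicitly-unwrapped-optional part):

class Global
{
   let i: Int
   private let nextThunk: () -> Global?   // immutable: set exactly once, at init
   var next: Global? { nextThunk() }      // evaluates the captured expression on access

   init( _ i:Int, _ next: @autoclosure @escaping () -> Global? )
   {
      self.i         = i
      self.nextThunk = next
   }
}

// The annotations break the type-inference cycle between the two
// declarations; the autoclosure defers evaluating the other global
// until first access, by which point both objects exist.
let g: Global = Global( 1, h )
let h: Global = Global( 2, g )

Whether the compiler accepts the cross-references here may depend on the Swift version; as noted, Swift 6's stricter treatment of globals likely breaks this.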

Swift isn’t really complaining here about the fact that you’ve created a circular (memory) reference, which is totally legal even if generally inadvisable for memory-leak reasons. It’s complaining about the fact that you’ve created a circular (variable) reference: you’re trying to use h in initializing g before you’ve actually created it.
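
To see that the memory cycle itself is legal, here is a small illustration (Node and makeLeak are throwaway names): mutable fields let the cycle form after both objects exist, at the cost of a leak.

class Node
{
   var other: Node?
   deinit { print("deinit") }   // never runs for the pair below
}

func makeLeak()
{
   let x = Node()
   let y = Node()
   x.other = y
   y.other = x
}   // x and y go out of scope here, but the strong cycle keeps both objects alive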

Aside from unsafe pointer hacks, there’s no way for the next field to be totally immutable while also being set to a value that’s created after you initialize the instance. But by using computed properties, you can make it so that Global enforces that its next field can only be set once. That won’t be enforced by the compiler (you’ll either have to throw an error or crash the program if something tries to set the next field after the first time), but it’s probably the closest you can get to what you want.


Yet the following works:

class Global
{
   let i    :Int
   let next :Global?
   
   init( _ i:Int, _ next: Global? )
   {
      self.i      = i
      self.next   = next
   }

}

let g = Global( 1, h )
let h = Global(2, nil)

@main
class MainProgram
{
    static func main()
    {
        print("g.i=\(g.i), g.next!.i=\(g.next!.i), h.i=\(h.i)") //, h.next!.i=\(h.next!.i)")
    }
}

even though I'm using h before declaring and initializing it.

I was under the impression that for global (top-level) declarations, Swift preallocates storage for the objects at compile/load time (though that storage is initially all zero bits). Therefore the value of h (that is, the pointer to that storage) should be valid even though the data it points at is not. This theory is supported by the fact that the code above runs and produces the expected output. However, once I change:

let h = Global(2, nil)

to

let h = Global(2, g)

I get the circular reference error on the "let g = ..." statement.
So, " trying to use h in initializing g before you’ve actually created it" doesn't seem to actually explain what's going on here. In the first code above, I'm certainly using h to initialize g before declaring h (and that code works). It isn't until I create an actual circular reference that Swift starts complaining.

Now it could be that Swift is confused once I use "let h = Global(2, g)" because it thinks I can't use h until it is fully initialized (and likewise, I can't use g until it is fully initialized). However, as g and h are only pointers, and it's certainly possible to initialize those pointers without initializing the associated data, I would argue that this should work.

I will explore the computed property approach, but I'm still wondering what the difference is between these two examples.

I'm assuming the computed property mechanism you were suggesting is something like this:

class Global
{
   private var privateNext :Global?
   let i    :Int
   var next :Global { privateNext ?? self } // falls back to self until initialize(_:) is called
   
   init( _ i:Int )
   {
      self.i           = i
      self.privateNext = nil
   }
   
   func initialize( _ next: Global )
   {
      privateNext = next
   }

}

let g = Global( 1 )
let h = Global( 2 )
g.initialize( h )
h.initialize( g )

print("g.i=\(g.i), g.next.i=\(g.next.i), h.i=\(h.i), h.next.i=\(h.next.i)")

privateNext isn't truly immutable as you can call initialize(_:) multiple times, but I could probably live with something like this.

Output:

g.i=1, g.next.i=2, h.i=2, h.next.i=1
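
If initialize(_:) really should be single-shot, a runtime check inside it closes that gap (a hedged tweak to the method above):

func initialize( _ next: Global )
{
   // Trap on a second call so privateNext behaves as set-once.
   precondition( privateNext == nil, "initialize(_:) may only be called once" )
   privateNext = next
}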

This is correct. Globals in Swift are lazily initialized, so while they can be validly referenced ahead of their own declaration, their first usage initializes their value. With

let g = Global(1, h)
let h = Global(2, g)

the compiler recognizes that in order to initialize h, it must evaluate g, and in order to evaluate g, it must evaluate h — this isn't done ahead of time, hence the circular reference. If it were allowed to run, you'd get an infinite loop in this lazy initialization cycle.
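
A tiny demonstration of that laziness (assuming -parse-as-library, as in the original example):

let a: Int = b + 1   // legal: the first use of `a` forces `b`'s initializer to run
let b: Int = {
   print("initializing b")
   return 41
}()

// Accessing `a` anywhere later prints "initializing b" once, and a == 42.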

For better or worse, this isn't how Swift's initialization model works: since initializers can have side effects and run arbitrary code, objects must be fully initialized before they can be used (unlike Obj-C, which separates the allocation and initialization steps of object creation, with all of the pitfalls and safety/security implications). This means that h can't accept a reference to g unless g is fully initialized, since Global(2, g) can do anything with g (and vice versa with Global(1, h)).
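
To make that concrete, here is a small illustration (Chain is a throwaway name): the initializer runs arbitrary code that uses its argument immediately, which is only safe because the argument is guaranteed to be fully initialized.

class Chain
{
   let label: String
   let next: Chain?

   init( _ label: String, _ next: Chain? )
   {
      self.label = label
      self.next  = next
      // Arbitrary code may run here and use `next` right away:
      print( "created \(label), next is \(next?.label ?? "none")" )
   }
}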

The only way around this would be to drop down to the actual pointer layer, but at that point, you're on your own in terms of safety.

FWIW, this is a very fragile place in Swift. For example, the following compiles fine... but crashes:

class B {}

class A {
   let b: B
   init(_ b: B) { self.b = b }
}

let a = A(b)
let b = B()
print(a.b) // EXC_BAD_ACCESS