In light of some echoes suggesting that the need for reasync
is still doubtful to some, I would like to present a case study that demonstrates this need. The short version: I can already obtain the bulk of the functionality with a sed
script, functionality which my code requires in order to get out of the libdispatch ghetto.
Since this case study involves a bit of code to, well, study, I thought I would start with some credentials. I have an electrical engineering background; while my first programming experiments were with AppleSoft Basic on the Apple ][e, my professional career so far has been 15 years of writing multimedia infrastructure code in C and C++; this code frequently involves the network and the filesystem, if not both simultaneously (e.g. sendfile()
). I am looking at Swift as an improvement on C++ in almost all ways; in particular, the way Swift attempts to capture programmer intent ought to make merge operations much more straightforward. I have defended the thesis that you ought always to be able to invert control of your code if necessary (tenth comment), even in the absence of closures, and I have actually done so myself on a number of occasions (e.g. when integrating an XML parser with a SAX-based API, when porting an Apache plugin to nginx, or, outside of work, when coding against the raw JavaScript async filesystem APIs, etc.). That is, until I tried to invert control of the algorithm I'm about to present, for concurrency purposes (yes, I have "succeeded". Yes, the resulting code is awfully complex. No, I won't share it). Ironically, that was a few weeks before the first review of concurrency features; since then, I have become a convert to the async/await model, and to the Swift approach to that concept in particular: I think no one should ever have to invert control of his code ever again, even with the help of closures.
This is the code for a Countdown solver (if you've seen an earlier version, it has been updated somewhat: in particular, appending a new floor is now completely separate from adding a new operation to that floor).
//
// Created by Pierre Lebeaupin on 22/04/2023.
// Copyright © 2023 Pierre Lebeaupin. All rights reserved.
//
// Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
//
// 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
//
// 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
//
// 3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
//
import Foundation;
enum Op: String {
case additive = "+", multiplicative = "*";
func neutralValue() -> UInt32 {
return ((self == .additive) ? 0 : 1);
}
func combine(_ a: UInt32, _ b: UInt32) -> UInt32 {
return ((self == .additive) ? (a + b) : (a * b));
}
func uncombine(_ a: UInt32, _ b: UInt32) -> UInt32 {
return ((self == .additive) ? (a - b) : (a / b));
}
}
struct ReadyValue {
let value: UInt32;
let op: Op?;
let description: () -> String;
// Note that as a closure, it can depend on more than
// the stored properties, contrary to a computed
// property.
init(_ inValue: UInt32) {
value = inValue;
op = nil;
description = { return inValue.description;};
}
init(value inValue: UInt32,
op inOp: Op,
description indescription: @escaping () -> String) {
value = inValue;
op = inOp;
description = indescription;
}
}
func resolve(_ startGoal: UInt32,
_ incb: @escaping (_ result: String) -> Void,
_ primitives: UInt32...)
{
var l = [ReadyValue]();
var composedCount = 0;
var otherComposedCount = 0;
for element in primitives {
l.append(ReadyValue(element));
}
{
exploreAdditionOfFloor(kind: .additive);
exploreAdditionOfFloor(kind: .multiplicative);
}(); // https://bugs.swift.org/browse/SR12243
func exploreAdditionOfFloor(kind: Op) {
guard otherComposedCount == 0 else {
return;
}
otherComposedCount = composedCount;
composedCount = 0;
{
exploreAdditionOfRightNode(skip: 0);
}(); // https://bugs.swift.org/browse/SR12243
composedCount = otherComposedCount;
otherComposedCount = 0;
func exploreAdditionOfRightNode(skip: Int) {
var decomposedValue = [false: kind.neutralValue(),
true: kind.neutralValue()];
// Siblings are the other right nodes on the same floor
// Downstream ones are those created while this one is active
var downstreamSiblingsRecruitmentSkip = Int(0); // placeholder
{
iteratePossibleLeftNodes(startingFrom: skip, { _ in return nil;});
}(); // https://bugs.swift.org/browse/SR12243
/*
The point of iterate_possible_left_nodes() is to enumerate all the possible ways
to recruit candidates into the current operation, out of the pool of possible
candidates; that pool has been determined by explore_addition_of_right_node().
Obviously, the number of possibilities to pick a non-strict subset out of N
elements is 2 to the Nth power: each candidate in the pool can either be
recruited (1) or not (0) by this composition. Some of these possibilities are
invalid, of course, but we will explore the solution space that way.
By counting the possibilities from 00...0 to 11...1, we would in fact enumerate
all the possible recruitments for this composition. For practical reasons, we
will go backwards, from 11...1 to 00...0.
What we will do in practice is evaluate the first candidate, first recruiting
it, which means we start with:
 1x...x
then adjust the recruitment location, and delegate to ourselves. That way,
assuming this function performs its job for N-1, it will perform the job from
11...1 to 10...0; in other words, half the job for N. Then, we remove that
candidate from the composition, and inhibit it from any future recruitment for
this composition; then we start over the loop.
The result is that the next candidate will be evaluated and hired at first and
recursive delegation will occur, resulting in these possibilities being
explored:
 01...x
That is, from 011...1 to 010...0.
At the next loop iteration, the possibilities explored will start with 001, etc.
up until we realize we excluded all candidates; at which point the only
combination left is 00...0, which we don't need to explore at all. Unless, of
course, there were only one or two candidates in the pool, in which case that
test triggers even earlier.
There is additional accounting to avoid a downstream sibling (a composition on
the same floor) recovering candidates in a way that results in a composition
that we ourselves explored earlier, but that is the basic principle.
*/
func iteratePossibleLeftNodes(startingFrom: Int,
_ walkOperands: @escaping
(_ action: (_ value: ReadyValue,
_ reverse: Bool)
-> Void?)
-> Void?) {
let emptyComposition = (walkOperands({_,_ in return ();}) == nil)
/* Why that odd upper bound? If that conditional is true, emptyComposition is
true, which means we are recruiting the senior contribution. If by doing so we'd
end up skipping a composed contribution (the other part of the conditional means
there is at least one left) as a result of the initial if, otherComposedCount
would never reach 0 no matter how many siblings we'd add, and so we'd never be
able to close the floor anyway. So it'd be pointless to keep going. */
for candidateReverseOffset in
startingFrom ..< ((emptyComposition && (otherComposedCount > 0)) ? startingFrom+1 : (l.count - composedCount) ) {
if emptyComposition {
downstreamSiblingsRecruitmentSkip = candidateReverseOffset;
}
/* Note: new compositions are added at the end of the array, i.e. at the high offsets,
while primitives (if any left) will be at the low offsets. As a result, the candidates
to be recruited as a first priority are found at the high offsets first, though not so
high that we would end up recruiting from the same floor (forbidden). */
let rightChild = l.remove(at: (l.count - composedCount - 1) - candidateReverseOffset);
if let _ = rightChild.op {
otherComposedCount -= 1;
}
defer {
if let _ = rightChild.op {
otherComposedCount += 1;
}
l.insert(rightChild, at: ((l.count+1) - composedCount - 1) - candidateReverseOffset);
}
for phase in 0...1 {
let reverse = (phase == 1);
{ (_ valueComponent: inout UInt32) in
valueComponent = kind.combine(valueComponent,
rightChild.value);
}(&decomposedValue[reverse]!);
defer {
{ (_ valueComponent: inout UInt32) in
valueComponent = kind.uncombine(valueComponent,
rightChild.value);
}(&decomposedValue[reverse]!);
}
let selfNode = {(_ action: (_ value: ReadyValue,
_ reverse: Bool)
-> Void?)
-> Void? in
action(rightChild, reverse) ?? walkOperands(action);
};
iteratePossibleLeftNodes(startingFrom: candidateReverseOffset,
selfNode);
// close current composition
var num = 0;
guard (selfNode({_,_ in guard num == 0 else { return (); }; num += 1; return nil;}) != nil) && ( (kind == .additive)
? decomposedValue[false]!
> decomposedValue[true]!
: ((decomposedValue[false]!
% decomposedValue[true]!) == 0) ) else {
continue;
}
let realizedValue = kind.uncombine(decomposedValue[false]!,
decomposedValue[true]!);
let description = { () -> String in
var current = "(";
var isFirst = true;
var numRev = 0;
selfNode({(_ value: ReadyValue,
_ freverse: Bool) -> Void? in
guard !freverse else {
numRev += 1;
return nil;
}
if !isFirst {
current += " ";
current += kind.rawValue;
current += " ";
}
isFirst = false;
current += value.description();
return nil; // keep going;
});
if numRev > 0 {
isFirst = true;
current += " ";
current += ((kind == .additive) ? "-" : "/");
current += " ";
if numRev > 1 {
current += "(";
}
selfNode({(_ value: ReadyValue,
_ freverse: Bool) -> Void? in
guard freverse else {
return nil;
}
if !isFirst {
current += " ";
current += kind.rawValue;
current += " ";
}
isFirst = false;
current += value.description();
return nil; // keep going;
});
if numRev > 1 {
current += ")";
}
}
current += ")";
return current;
};
guard l.count > 0 else {
if realizedValue == startGoal {
incb(description());
}
continue;
}
composedCount += 1;
l.append(ReadyValue(value: realizedValue,
op: kind,
description: description));
defer {
l.remove(at: l.count - 1);
composedCount -= 1;
}
exploreAdditionOfRightNode(skip: downstreamSiblingsRecruitmentSkip);
exploreAdditionOfFloor(kind: ((kind == .additive) ? .multiplicative : .additive));
}
}
}
}
}
}
//
// main.swift
// AlternateSolver
//
// Created by Pierre Lebeaupin on 22/04/2023.
// Copyright © 2023 Pierre Lebeaupin. All rights reserved.
//
import Foundation
resolve(6372, {print($0);}, 10, 5, 8, 3, 25, 1, 42);
If you'd like to understand how it works, here is an explanation (feel free to skip to the part where I transform the code if this is too much). A potential solution such as (1+3)*7 is best formalised as a binary tree, where leaves are the primitive amounts and interior nodes are operations (one of +, -, *, or /); every node has a value: either that of the primitive, or the result of the operation. For a candidate tree to be a solution, the value of the root node must equal the goal.
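To make that formalisation concrete, here is a minimal standalone sketch of such a binary tree in Swift. It is illustrative only: the Node type is hypothetical, not part of the solver.

```swift
// Illustrative sketch only: a candidate such as (1+3)*7 as a binary tree.
// Leaves are the primitive amounts, interior nodes are operations, and
// every node has a value; the tree is a solution if the root's value
// equals the goal. The Node type is hypothetical, not part of the solver.
indirect enum Node {
    case primitive(UInt32)
    case operation(Character, Node, Node)

    func value() -> UInt32 {
        switch self {
        case .primitive(let v):
            return v
        case .operation(let op, let left, let right):
            switch op {
            case "+": return left.value() + right.value()
            case "-": return left.value() - right.value()
            case "*": return left.value() * right.value()
            default: return left.value() / right.value()
            }
        }
    }
}

// (1 + 3) * 7, which would be a solution for a goal of 28
let candidate = Node.operation("*",
    .operation("+", .primitive(1), .primitive(3)),
    .primitive(7))
```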
Actually exploring the solution space in this way, however, would result in solutions that are equivalent to each other: for instance, ((1 + 3) + 4) * 7 is obviously equivalent to ((1 + 4) + 3) * 7, and to ((3 + 4) + 1) * 7. But those are three different binary trees (and I am not even mentioning the equivalences where two operands are merely swapped). The key observation, since addition is associative, is to group "unbroken" additions into a single addition operation which can have from 2 to many operands: (1 + 3 + 4) * 7. The additions must be unbroken: there is no further regrouping to be done in the case of ((1 + 3) * 7) + 4, for instance, since that tree has no equivalent, or at least none that is a consequence of associativity.
Multiplications, being also associative, must be grouped in the same fashion. And to integrate subtraction and division, an operand is allowed to contribute to its operation in a reverse fashion: 4 - 1 becomes (4 + rev(1)), and 10 / 2 becomes (10 * rev(2)). But all of this would be pointless if groupings were not canonicalised, so we must mandate additive operations to take as operands only primitive ones, results of multiplicative operations, or a mix, and mandate multiplicative operations to take as operands only primitive ones, results of additive operations, or a mix; by browsing the problem space this way, we already eliminate a significant chunk of redundancy. By forcing operands to be ordered (we will see what that means in a bit), we eliminate yet more, and the redundancies that remain can usually be handled on a case-by-case basis.
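To illustrate the forward/reverse contribution scheme, here is a standalone sketch (names hypothetical; the simplified Kind type stands in for the solver's Op): forward and reverse contributions are each folded into a running total, then the two totals are uncombined at the end.

```swift
// Illustrative sketch: realize a grouped operation from its forward and
// reverse operands. Kind mirrors the solver's Op type in simplified form.
enum Kind {
    case additive, multiplicative

    func neutral() -> UInt32 { return self == .additive ? 0 : 1 }
    func combine(_ a: UInt32, _ b: UInt32) -> UInt32 {
        return self == .additive ? a + b : a * b
    }
    func uncombine(_ a: UInt32, _ b: UInt32) -> UInt32 {
        return self == .additive ? a - b : a / b
    }
}

// Fold the forward operands into one total and the reverse operands into
// another, then uncombine: this realizes e.g. (10 * rev(2)) as 10 / 2.
func realize(_ kind: Kind, forward: [UInt32], reverse: [UInt32]) -> UInt32 {
    let f = forward.reduce(kind.neutral(), kind.combine)
    let r = reverse.reduce(kind.neutral(), kind.combine)
    return kind.uncombine(f, r)
}
```

This is also the shape of the decomposedValue dictionary in the solver, which keeps one running component per value of the reverse flag.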
None of the above is breaking new ground, and I have seen implementations of that idea that rely on the Python itertools
module for instance. But here is what I came up with on top of that.
I build up my candidate, now non-binary, tree from the bottom up: first I add a floor, which is a collection of operations that have the same distance from the root operation; for instance, in ((2 + 3) * 4) + 7, 2+3 is on the bottom floor, the multiplicative operation implicating 4 is on the first elevation, and the additive operation implicating 7 is on the second and last elevation. Note that as a result of the mandate, a given floor can never contain a mix of operations: this is obviously the case for the root operation, since it is alone on its floor, and a floor that contains only additive operations forces the floor below, which is made of the operations directly feeding into those, to be made only of multiplicative operations, and vice-versa, so we have just recursively proven the property for all floors. As a result, the type of operation alternates between each floor: if we seed the bottom floor with an additive operation, additive operations will only occur on even elevations (and the bottom floor), and multiplicative operations will only occur on the odd-numbered elevations.
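The alternation can be stated as a one-line parity rule; here is a standalone sketch of it (names hypothetical, counting the bottom floor as elevation 0):

```swift
// Illustrative: operation kinds strictly alternate from floor to floor, so
// the bottom floor's kind determines the kind at every elevation by parity.
enum FloorKind {
    case additive, multiplicative

    var other: FloorKind {
        return self == .additive ? .multiplicative : .additive
    }
}

func kind(ofElevation elevation: Int, bottom: FloorKind) -> FloorKind {
    // elevation 0 is the bottom floor itself
    return elevation % 2 == 0 ? bottom : bottom.other
}
```

For ((2 + 3) * 4) + 7 seeded with an additive bottom floor, the first elevation (the multiplication implicating 4) comes out multiplicative and the second (the addition implicating 7) additive.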
But for now we only have the bottom floor; we don't even know how high up the root operation will be. So we start by adding an empty additive operation to that floor, and see where that takes us; once that is done, we instead add an empty multiplicative operation, and see where that takes us: this is the first bifurcation.
Inside either of these bifurcations, we take a node from the available operands list to add it to the operation, starting from the first available one, but we will also later try adding a further available node as the senior operand instead; that is the second bifurcation. In either case, the node can be added as either a forward or reverse operand: that is a further bifurcation. And now that a node has been added as a contribution to the operation, we can explore either adding a further node from the available ones (and further bifurcate from that depending on which we pick), or freezing the operation, which is another bifurcation still (of course, we can't freeze an operation that only has a single contribution: that bifurcation path is closed off when we detect that in code, so assume there are at least two contributions). And once we've frozen the operation, we can choose to add another operation to that floor (which must be of the same type as a result), or close that floor and open a new one just above, in which case the operations from the just-closed floor become available as operands. And the choices can now recursively bifurcate from these…
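The recruitment bifurcations amount to enumerating the non-empty subsets of the candidate pool, in the order the big comment in iteratePossibleLeftNodes() describes: recruit the first remaining candidate and recurse over the rest, then exclude it for good and loop. Here is a standalone sketch of just that enumeration order (names hypothetical):

```swift
// Illustrative sketch: visit every non-empty subset of the pool by
// recruiting the first remaining candidate then recursing, then excluding
// it and looping; this walks the subsets from 11...1 down to (but
// excluding) 00...0, as described in the solver's comment.
func enumerateRecruitments(_ pool: ArraySlice<Int>,
                           _ recruited: [Int] = [],
                           _ visit: ([Int]) -> Void) {
    for offset in pool.indices {
        let next = recruited + [pool[offset]]
        visit(next)
        // candidates up to and including this one are inhibited downstream
        enumerateRecruitments(pool[(offset + 1)...], next, visit)
    }
}
```

For a pool of three candidates this visits 7 subsets (2 to the 3rd power, minus the empty one), all subsets containing the first candidate coming before any that exclude it.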
There are a bunch of checks and whatnots that are performed everywhere. For instance, you can't close an elevation unless all operations from the floor below that elevation have been recruited as operands in that elevation: otherwise, you'd have an available node that just couldn't be recruited anymore. Why? Because it couldn't be recruited by a higher floor either, since that would result in its distance from the root being inconsistent with that of its neighbours from the same floor (such a possibility will be explored, but when it comes time to do so, not by hijacking the function dedicated to exploring the possibilities resulting from its current elevation). As another example, to avoid exploring (1 + 2) * (3 + 4) then (3 + 4) * (1 + 2), we don't just force the multiplication to recruit its operands in order, we also prevent the second addition from recruiting 1 if the first addition, which was earlier in the same floor, has recruited 3 as its senior member. And these checks sometimes combine: if those additions are on an elevation already, with one multiplication in the bottom floor, the first addition cannot recruit anything other than that multiplication as its senior member, since anything else would prevent any future addition on the same floor from recruiting the first multiplication, in turn preventing any makeup of that floor from closing as long as that first addition has anything other than the first multiplication as its senior member; so we might as well cut off this whole bifurcation early.
Here is an example of a state we can find ourselves in:
_a: a*4*…
_a: 2+3_
Available primitives: [7]
Skipped operands for the current operation: []
Skipped primitives for the current floor: []
In this representation, operations in a given floor are designated with a, b, c, etc.; operations in an elevation will refer to recruits from the floor immediately below them using these letters (no need to specify the floor, since they can only recruit from the one immediately below them). Here, the ground floor is closed: this is necessary anyway in order to have a floor above it. The multiplication in the first elevation is not closed yet: it can still recruit operands. When starting from this state, we can either:
- Recruit 7 as a forward member of the currently open operation (the multiplication on the first elevation):
_a: a*4*7*… _a: 2+3_ Available primitives: []
At which point we cannot recruit any more, so the only possibility left is to close that operation, and other checks result in the tree being considered complete, so the only thing left is to compare the result of the newly minted root node with the goal.
- Recruit 7 as a reverse member of the currently open operation (the multiplication on the first elevation):
_a: (a*4/7)*… _a: 2+3_ Available primitives: []
At which point we cannot recruit any more, so the only possibility left is to close that operation; but that is rejected as 7 does not evenly divide 20, which means exploration of that alternative is over.
- Close the current operation as-is:
_a: a*4_ _a: 2+3_ Available primitives: [7]
At which point a check detects we do not have enough recruitable operands to start a new operation on the same floor, so we close the floor and open another:
… _a: a*4_ _a: 2+3_ Available primitives: [7]
I don't think it represents a spoiler if I reveal that only one operation can be made to fit in that floor, whose recruits can only be a and 7; a cannot be recruited as a reverse contribution (the result would be negative regardless of how 7 contributes), so the only two valid final states are:
_a: a+7_ _a: a*4_ _a: 2+3_ Available primitives: []
_a: a-7_ _a: a*4_ _a: 2+3_ Available primitives: []
And in each case, the recursion terminates by comparing the result of the root node with the goal.
That is of course a simple case; a complex one can be:
_a: a*10_ b: 8*25…
_a: 1+4+5_
Available primitives: []
Skipped operands for the current operation: [9]
Skipped primitives for the current floor: [6]
Which can be completed in a number of ways.
Honestly, describing the algorithm in prose will only take you so far: I recommend you run the code under the debugger and play with it, by first stepping through the first recruitments, then putting a breakpoint for when the goal is reached and looking at what kind of function stack can lead there and how the solution tree is browsed to print the result, then my favourite technique: put a one-off breakpoint on each line (easy with a script), so you can just continue each time and be sure the next breakpoint hit will be a situation you haven't seen before.
A few notes on the implementation proper:
- It is well-known that you can take a list of non-binary trees and reversibly transform it into a binary tree, where a right-child relationship corresponds to that between a node and its senior child, and a left-child relationship corresponds to that between a node and its next sibling in the non-binary tree. Since I represent the list of contributions of a given operation as a linked list, this binarization is a natural way to express the various bifurcations when exploring the solution space:
  - exploreAdditionOfRightNode() is what we call to explore the bifurcation where we add a new operation, since it (more specifically, its senior node) will end up being the child in a right-child relationship to be determined later;
  - iteratePossibleLeftNodes() is what we call when an operation is open to a new node being recruited, resulting in a new left-child relationship. Unless we are recruiting a senior node, in which case this does not correspond to that, so indeed the function is slightly misnamed: an earlier version of that function was used only for non-senior nodes, so the name was deserved, and since then the implementations for recruiting a node, regardless of whether it will have a senior role or not, have been merged… but the function retained its name.
  In effect, that naming scheme is best thought of as an allusion to this binarization; where it makes best sense is when browsing an actual solution in order to print it, where the closures will come from the relevant functions.
- In effect, the main recursion is between iteratePossibleLeftNodes() and itself, though not necessarily directly. iteratePossibleLeftNodes() can invoke:
  - itself directly
  - exploreAdditionOfRightNode(), which ends up delegating to iteratePossibleLeftNodes()
  - exploreAdditionOfFloor(), which calls exploreAdditionOfRightNode(), which ends up delegating to iteratePossibleLeftNodes()
  In particular, it is up to iteratePossibleLeftNodes() to close the current operation prior to delegating to either exploreAdditionOf() function, so it can be thought of as the bulk of the logic; the two others are best thought of as the way of allocating on the stack the necessary additional information for a floor and for an operation, respectively.
- In order to support the checks, many calculation intermediates and related bookkeeping are kept around. Some, appropriately, are temporaries kept on the stack which are not captured by any anonymous closure; this is the case for the ongoing totals for an operation that is not closed yet. The total for a closed operation is a bit of an outlier, as it is stored in the structure for a recruit: this allows the same structure field to serve as the cached total for a composite recruit, and as the source of truth in the case of a primitive recruit. This was done for efficiency reasons: no redundant storage in the case of primitive recruits, and no code divergence when the goal is to obtain this value without probing the recruit any deeper, saving an unpredictable branch and cycles in general. However, storage in the linked list of contributions is immutable, for reasons we will see later.
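To make the binarization alluded to above concrete, here is a standalone sketch of the well-known transformation (types and names hypothetical, not taken from the solver): in the binary form, the right child of a node is its senior child in the original forest, and the left child is its next sibling.

```swift
// Illustrative sketch of binarizing a forest (a list of non-binary trees):
// right child = senior (first) child in the forest,
// left child = next sibling in the forest.
indirect enum Forest: Equatable {
    case node(String, children: [Forest])
}

indirect enum Binary: Equatable {
    case leaf
    case node(String, left: Binary, right: Binary)
}

func binarize(_ siblings: ArraySlice<Forest>) -> Binary {
    guard case .node(let label, let children)? = siblings.first else {
        return .leaf
    }
    return .node(label,
                 left: binarize(siblings.dropFirst()), // next sibling
                 right: binarize(children[...]))       // senior child first
}
```

For example, the single tree "+" with children 1 and 3 binarizes with 1 as the right child of "+", and 3 as the left child of 1; the transformation is reversible, which is what makes it a faithful encoding of the contribution lists.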