OrderedSet bulk insert operation?

Is there a way to bulk insert into an OrderedSet?

If I insert 1_000_000 items individually at index 0 into an ordered set, it doesn't finish. I think a bulk insert would allow me to do this in O(n) by only having to update the internal hash-table indices and the storage array once.

let count = 1_000_000
var collection = OrderedSet<Int>(minimumCapacity: count)
for i in 0..<count {
    collection.insert(i, at: 0)
}

On the other hand if I append those same 1_000_000 items to an OrderedSet it takes a reasonable 0.097 sec.

let count = 1_000_000
var collection = OrderedSet<Int>(minimumCapacity: count)
for i in 0..<count {
    collection.append(i)
}

Not an answer, but you can try a range

let count = 1_000_000
let collection = OrderedSet(0..<count)

Thanks, and yes, that performs well. Most operations perform well. It's just the case where you need to insert many items at the start of the list that is slow... and it seems like a bulk operation would solve that issue pretty easily.
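One workaround sketch (not a bulk-insert API, just a restructuring): if all the front-inserts happen before the set is used, you can build the elements in their final order and hand them to the O(n) sequence initializer instead of paying O(n) per insert(_:at:) call. Assumes the swift-collections package is available.

```swift
import OrderedCollections

// Inserting each of 0..<count at index 0 produces the elements in
// descending order, so build that order directly and use the linear
// sequence initializer instead of a million linear-time inserts.
let count = 1_000_000
let collection = OrderedSet((0..<count).reversed())

// Same final contents and order as the insert(i, at: 0) loop above.
```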

I don't think there is any bulk insert. From the OrderedSet documentation:

Inserting or removing a single member (or a range of members) needs to perform the corresponding operation in the storage array, in addition to renumbering any subsequent members in the hash table. Therefore, these operations are expected to have performance characteristics similar to an Array: inserting or removing an element at the end of an ordered set is expected to execute in O(1) operations, while they are expected to take linear time at the front (or in the middle) of the set.

Thus repeated calls to insert(_:at:) at the front or the middle of an OrderedSet will take longer than repeated append(_:) calls.
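An informal way to see the difference described above (a measurement sketch, assuming the swift-collections package is available; the count is kept small so the quadratic case still finishes):

```swift
import Foundation
import OrderedCollections

// Informal timing: front insertion is O(n) per call (quadratic overall),
// while append is amortized O(1) per call (linear overall).
func measure(_ label: String, _ body: () -> Void) {
    let start = Date()
    body()
    print("\(label): \(Date().timeIntervalSince(start)) s")
}

let count = 50_000

measure("insert at 0") {
    var s = OrderedSet<Int>(minimumCapacity: count)
    for i in 0..<count { s.insert(i, at: 0) }
}

measure("append") {
    var s = OrderedSet<Int>(minimumCapacity: count)
    for i in 0..<count { s.append(i) }
}
```

Both loops build the same elements; only the element order and the running time differ.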

From Meet the Swift Algorithms and Collections packages

while OrderedSet is able to quickly look up existing elements and append new ones, it cannot efficiently implement removing or inserting an item at the front or middle of the set. Like array, these operations need to slide elements around in the storage array, but they also need to renumber subsequent indices in the hash table. This means that removals and insertions turn into operations with linear complexity, making these slower than the regular set. There is always a trade-off!
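If front insertion is the dominant operation and strict set semantics aren't required from a single type, the same package's Deque offers amortized O(1) insertion at both ends. A sketch (uniqueness, the "set" part, is tracked separately here with a plain Set, since Deque allows duplicates):

```swift
import Collections

// Deque supports amortized O(1) prepend, so a million front insertions
// run in linear time overall. A separate Set filters out duplicates.
let count = 1_000_000
var deque = Deque<Int>()
var seen = Set<Int>(minimumCapacity: count)
for i in 0..<count where seen.insert(i).inserted {
    deque.prepend(i)
}
// deque now holds count-1 down to 0, matching the insert-at-front order.
```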
