Thin IoObject wrapper around the shared basekit List container (DATA(self) is a plain List*). The tag installs clone/mark/free/compare function pointers; marking walks every element so the GC keeps contained objects alive. Iteration primitives (each, foreach, reverseForeach) have two execution paths: when state->currentFrame is set they stamp the frame's controlFlow.foreachInfo union and return immediately so the iterative eval loop drives the iteration (see FRAME_STATE_FOREACH_* in IoState_iterative.c); otherwise a LIST_SAFEFOREACH recursive fallback is used, which is exercised during VM bootstrap before the eval loop starts. Mutation methods set IoObject_isDirty_ so Store/persistence layers can notice changes. asEncodedList / fromEncodedList implement a compact binary round-trip used by the object serialization machinery.
Bounds-check helper used by the mutating at/atPut/atInsert/removeAt family. Returns 1 and raises a VM error when the index is out of range. If allowsExtending is nonzero the valid range is extended to size (so atInsert can append at the tail).
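A minimal standalone sketch of that check (names and the error path are illustrative, not the VM's actual symbols):

```c
#include <stdio.h>

/* Returns 1 (error) when i falls outside [0, size - 1], or outside
 * [0, size] when allowsExtending is nonzero, so an insert may land
 * one past the last element (appending at the tail). */
static int check_index(long size, long i, int allowsExtending)
{
    long max = allowsExtending ? size : size - 1;

    if (i < 0 || i > max)
    {
        /* the VM raises an IoState error here instead of printing */
        fprintf(stderr, "index %ld out of bounds (size %ld)\n", i, size);
        return 1;
    }

    return 0;
}
```

The extended range matters only for atInsert; the plain accessors reject index == size.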
Registered as the tag's compareFunc. Falls back to default pointer comparison against non-Lists; otherwise compares by size then by element-wise IoObject_compare. Returns the first non-zero element comparison, mirroring lexicographic ordering.
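The size-then-elements ordering can be sketched with plain longs standing in for elements (a long comparison plays the role of IoObject_compare):

```c
#include <stddef.h>

/* Compare two element arrays the way the List compareFunc does:
 * shorter list sorts first; equal sizes fall through to the first
 * non-zero element comparison, giving lexicographic order. */
static int list_compare(const long *a, size_t an, const long *b, size_t bn)
{
    size_t i;

    if (an != bn)
    {
        return (an < bn) ? -1 : 1;
    }

    for (i = 0; i < an; i++)
    {
        if (a[i] != b[i])
        {
            return (a[i] < b[i]) ? -1 : 1;
        }
    }

    return 0;
}
```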
Single-argument foreach: evaluates the message against each element without binding index/value slots. Under the iterative evaluator it stamps fd->controlFlow.foreachInfo with isEach=1 so the frame state machine (FRAME_STATE_FOREACH_EVAL_BODY) sets the target rather than mutating the caller's locals. The recursive fallback below runs only before state->currentFrame is available during bootstrap.
Registered as the tag's freeFunc. Frees the backing basekit List; the contained IoObjects are GC-managed and not touched here. Aborts on a double free so use-after-free regressions surface loudly.
Registered as the tag's markFunc. Walks every element so contained IoObjects stay live for the GC; called during every mark phase.
Convenience constructor: looks up the registered proto and clones it. Used by C callers that want a fresh, empty List without going through message machinery (e.g. foreach/keys/values return paths).
Builds the List tag and installs clone/free/mark/compare function pointers. Stream write/read slots are left unset (the commented-out block below preserves the legacy BStream format).
Wraps a pre-existing basekit List as an IoList by freeing the fresh List allocated in the clone and adopting the caller's pointer. Used by slice/join/asEncodedList where the underlying container has already been built.
Creates the List proto, attaches a fresh basekit List as its data pointer, and wires up the Io-visible method table (access, mutation, iteration, slicing, sorting, encoded serialization, join). Called once during VM init; all later Lists are clones of this proto.
Appends every element of a basekit List (not an IoList) into self, IOREFing each through this List. Used by appendSeq and by callers that already hold a basekit List pointer.
Low-level append used from C. Preferred over Io-level append when a value is being threaded through internal machinery and IoMessage dispatch would be wasteful.
Auto-extending set: pads with nils up to index i before writing, unlike List_at_put_ which would abort out of bounds. Used by the Io-visible atPut after its bounds check.
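The pad-then-write behavior, sketched with a toy pointer array (a real List also IOREFs the value and marks the owner dirty; the MiniList type here is illustrative):

```c
#include <stdlib.h>
#include <string.h>

/* Toy auto-extending set: grow the array to i + 1, zero-fill the new
 * slots (NULL stands in for Io's nil), then store the value. */
typedef struct
{
    void **items;
    size_t size;
} MiniList;

static void minilist_at_put(MiniList *l, size_t i, void *v)
{
    if (i >= l->size)
    {
        l->items = realloc(l->items, (i + 1) * sizeof(void *));
        memset(l->items + l->size, 0, (i + 1 - l->size) * sizeof(void *));
        l->size = i + 1;
    }

    l->items[i] = v;
}
```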
Low-level element set used from C. IOREFs the value so the GC keeps it alive through this List, and marks the List dirty for persistence.
Registered as the tag's cloneFunc. Gives the clone its own basekit List copy so mutation of one List does not leak into the proto.
Linear search by IoObject_compare equality (not pointer identity) so Numbers and Sequences compare by value. Returns -1 on miss.
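A sketch of compare-based search, with a small int comparator standing in for IoObject_compare:

```c
/* Linear scan that tests equality through a compare function rather
 * than pointer identity, mirroring indexOf; returns -1 on a miss. */
static long find_by_compare(void **items, long size, void *key,
                            int (*cmp)(void *, void *))
{
    long i;

    for (i = 0; i < size; i++)
    {
        if (cmp(items[i], key) == 0)
        {
            return i;
        }
    }

    return -1;
}

/* Value comparison for ints: distinct pointers to equal ints match. */
static int int_cmp(void *a, void *b)
{
    int x = *(int *)a, y = *(int *)b;
    return (x > y) - (x < y);
}
```

Because equality goes through the comparator, a freshly boxed 2 finds the stored 2 even though the pointers differ, which is exactly why Numbers and Sequences hit on value.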
Shared argument parser for slice / sliceInPlace. Reads step first (rejecting zero), then start and an optional end defaulting to size, and normalizes both indices via IoList_sliceIndex.
Python-style slice index normalization: negative values wrap from the end, and out-of-range values clamp to the correct end depending on step direction. Mutates *index in place.
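The normalization described above, sketched as a standalone helper (a behavioral sketch, not the VM's exact code):

```c
/* Python-style slice index normalization: negative indices wrap from
 * the end; values still out of range clamp to whichever end matches
 * the step direction (size for a forward step's exclusive bound,
 * -1 for a backward step's). Mutates *index in place. */
static void slice_index(long *index, long size, long step)
{
    if (*index < 0)
    {
        *index += size;

        if (*index < 0)
        {
            *index = (step > 0) ? 0 : -1;
        }
    }
    else if (*index >= size)
    {
        *index = (step > 0) ? size : size - 1;
    }
}
```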
qsort_r callback for sortInPlace with an optional message key: evaluates the key expression on each side inside a retain pool (so intermediate allocations don't leak) and returns the IoObject_compare result.
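The key-then-compare shape can be mimicked in portable C with plain qsort (qsort_r's signature differs between glibc and the BSDs, which is one reason the comparator's context handling is platform-sensitive). Here a string-length key stands in for the evaluated key expression; the names are illustrative:

```c
#include <stdlib.h>
#include <string.h>

/* Key function: the stand-in for evaluating the key message. */
static long key_len(const char *s)
{
    return (long)strlen(s);
}

/* Comparator: apply the key to each side, then compare the keyed
 * values, mirroring the sortInPlace-with-key callback. */
static int by_key(const void *pa, const void *pb)
{
    long ka = key_len(*(const char *const *)pa);
    long kb = key_len(*(const char *const *)pb);
    return (ka > kb) - (ka < kb);
}

static void sort_by_len(const char **strs, size_t n)
{
    qsort(strs, n, sizeof(const char *), by_key);
}
```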
qsort_r callback for sortInPlaceBy(aBlock). Primes the two pre-built argument messages with cached results and activates the compare block; returns 1 when the block result is false so ISFALSE(cr) swaps the pair.