
Use Chunk for High-Performance Collections

Use Chunk<A> as a high-performance, immutable alternative to JavaScript's Array, especially for data processing pipelines.

Guideline

For collections that will be heavily transformed with immutable operations (e.g., map, filter, append), use Chunk<A>. Chunk is Effect's implementation of a persistent, chunked vector, and it provides better performance than native arrays for these use cases.
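As a quick sketch of what such a pipeline looks like: the Chunk combinators also have data-last forms, so they compose with .pipe.

import { Chunk } from "effect";

// Each step returns a new immutable Chunk; nothing is mutated in place.
const processed = Chunk.fromIterable([1, 2, 3, 4, 5]).pipe(
  Chunk.map((n) => n * 2), // Chunk(2, 4, 6, 8, 10)
  Chunk.filter((n) => n > 4), // Chunk(6, 8, 10)
  Chunk.append(12) // Chunk(6, 8, 10, 12)
);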


Rationale

JavaScript's Array is a mutable data structure. Every time you perform an "immutable" operation like [...arr, newItem] or arr.map(...), you are creating a brand new array and copying all the elements from the old one. For small arrays, this is fine. For large arrays or in hot code paths, this constant allocation and copying can become a performance bottleneck.
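To see why, consider building a collection with repeated spreads. Each iteration copies every existing element, so the loop below does O(n²) total work (a sketch; the bound of 10,000 is arbitrary):

// Builds 10,000 items by spreading: every iteration allocates a new
// array and copies all elements accumulated so far.
let arr: ReadonlyArray<number> = [];
for (let i = 0; i < 10_000; i++) {
  arr = [...arr, i];
}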

Chunk is designed to solve this. It's an immutable data structure that uses structural sharing internally. When you append an item to a Chunk, it doesn't re-copy the entire collection. Instead, it creates a new Chunk that reuses most of the internal structure of the original, only allocating memory for the new data. This makes immutable appends and updates significantly faster.
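The same loop written against Chunk only allocates for the new element and reuses the rest of the structure (again a sketch with an arbitrary bound):

import { Chunk } from "effect";

// Each append reuses the previous Chunk's internal structure
// instead of copying every element.
let chunk = Chunk.empty<number>();
for (let i = 0; i < 10_000; i++) {
  chunk = Chunk.append(chunk, i);
}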


Good Example

This example shows how to create and manipulate a Chunk. The API is very similar to Array, but the underlying performance characteristics for these immutable operations are superior.

import { Chunk, Effect } from "effect";

// Create a Chunk from an array
let numbers = Chunk.fromIterable([1, 2, 3, 4, 5]);

// Append a new element. This is much faster than [...arr, 6] on large collections.
numbers = Chunk.append(numbers, 6);

// Prepend an element.
numbers = Chunk.prepend(numbers, 0);

// Take the first 3 elements
const firstThree = Chunk.take(numbers, 3);

// Convert back to an array when you need to interface with other libraries
const finalArray = Chunk.toReadonlyArray(firstThree);

Effect.runSync(Effect.log(finalArray)); // [0, 1, 2]

Anti-Pattern

Avoid eagerly converting a large or potentially infinite iterable to a Chunk before streaming it; this completely negates the memory-safety benefits of using a Stream.

import { Effect, Stream, Chunk } from "effect";

// A generator that could produce a very large (or infinite) number of items.
function* largeDataSource() {
  let i = 0;
  while (i < 1_000_000) {
    yield i++;
  }
}

// ❌ DANGEROUS: `Chunk.fromIterable` will try to pull all 1,000,000 items
// from the generator and load them into memory at once before the stream
// even starts. This can lead to high memory usage or a crash.
const programWithChunk = Stream.fromChunk(
  Chunk.fromIterable(largeDataSource())
).pipe(
  Stream.map((n) => n * 2),
  Stream.runDrain
);

// ✅ CORRECT: `Stream.fromIterable` pulls items from the data source lazily,
// one at a time (or in small batches), maintaining constant memory usage.
const programWithIterable = Stream.fromIterable(largeDataSource()).pipe(
  Stream.map((n) => n * 2),
  Stream.runDrain
);