
Events

The events interface provides EventStream instances for robust event streaming with dynamic chunking, retry logic, and block context metadata.
This uses the Contract pattern: a copyable implementation you add to your codebase. EventStream itself is a library primitive.

Basic Usage

import { Contract } from './Contract.js';  // Your local copy

const usdc = Contract({
  address: '0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48',
  abi: erc20Abi,
  provider
});

// Get EventStream for Transfer events
const stream = usdc.events.Transfer({});

// Backfill historical events
for await (const { log, metadata } of stream.backfill({
  fromBlock: 18000000n,
  toBlock: 19000000n
})) {
  console.log(`Chain head ${metadata.chainHead}: ${log.args.value}`);
}

// Watch for new events
for await (const { log } of stream.watch()) {
  console.log('New transfer:', log.args);
}

Filtering Events

Filter by indexed parameters when creating the stream. Only indexed parameters (from and to on Transfer) can be filtered; non-indexed parameters like value cannot:
// Transfers from a specific address
const fromStream = usdc.events.Transfer({ from: '0x742d35...' });

// Transfers to a specific address
const toStream = usdc.events.Transfer({ to: '0x742d35...' });

// Transfers between specific addresses
const betweenStream = usdc.events.Transfer({
  from: '0xsender...',
  to: '0xreceiver...'
});
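Indexed parameters are the only filterable ones because each indexed value is encoded as a 32-byte log topic at the RPC level. A minimal sketch of that encoding for address parameters (illustrative only; EventStream performs the equivalent internally):

```typescript
// Illustrative only: how an indexed address filter maps to a log topic.
// A 20-byte address is left-padded with zeros to the 32-byte topic width.
function addressToTopic(address: string): string {
  const hex = address.toLowerCase().replace(/^0x/, '');
  return '0x' + hex.padStart(64, '0');
}
```

The resulting topic is what the node matches against when serving filtered log queries.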

Backfill Historical Events

Use backfill() to fetch events from a specific block range:
const stream = usdc.events.Transfer({});

for await (const { log } of stream.backfill({
  fromBlock: 18000000n,
  toBlock: 18001000n
})) {
  console.log(`Found transfer at block ${log.blockNumber}`);
}
// Loop ends after processing historical events

Dynamic Chunking

EventStream automatically handles large block ranges by chunking requests:
  • Starts with 100 blocks per request
  • Reduces chunk size by 50% on “block range too large” errors
  • Increases chunk size by 25% after 5 consecutive successes
  • Never goes below 10 blocks minimum
// Handle million-block range efficiently
for await (const { log } of stream.backfill({
  fromBlock: 0n,
  toBlock: 19000000n,
  chunkSize: 200,       // Initial chunk size (default: 100)
  minChunkSize: 50      // Minimum after reduction (default: 10)
})) {
  processEvent(log);
}
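The sizing rules above can be sketched as a pure function. The names here are hypothetical; the real logic lives inside EventStream:

```typescript
// Sketch of the adaptive chunk sizing described above (illustrative names).
function nextChunkSize(
  current: number,
  outcome: 'rangeTooLarge' | 'success',
  successStreak: number,
  minChunkSize = 10
): { size: number; streak: number } {
  if (outcome === 'rangeTooLarge') {
    // Halve on "block range too large", but never drop below the minimum.
    return { size: Math.max(Math.floor(current / 2), minChunkSize), streak: 0 };
  }
  const streak = successStreak + 1;
  if (streak >= 5) {
    // Grow by 25% after 5 consecutive successes, then reset the streak.
    return { size: Math.floor(current * 1.25), streak: 0 };
  }
  return { size: current, streak };
}
```

Halving aggressively while growing gently keeps the stream from oscillating around a provider's range limit.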

Watch for New Events

Use watch() to poll for new events:
const stream = usdc.events.Transfer({});

for await (const { log } of stream.watch({
  pollingInterval: 1000  // Poll every second (default)
})) {
  console.log('New transfer:', log.eventName);
}

Watch from Specific Block

for await (const { log } of stream.watch({
  fromBlock: 19000000n  // Start watching from this block
})) {
  console.log(log);
}

Cancellation with AbortSignal

Use AbortSignal to cleanly stop streaming:
import { EventStreamAbortedError } from './EventStream.js';

const controller = new AbortController();

// Stop after 10 seconds
setTimeout(() => controller.abort(), 10000);

try {
  for await (const { log } of stream.watch({
    signal: controller.signal
  })) {
    console.log(log);
  }
} catch (error) {
  if (error instanceof EventStreamAbortedError) {
    console.log('Stream stopped');
  }
}
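On runtimes that ship AbortSignal.timeout (Node 17.3+, modern browsers), the manual controller-plus-setTimeout pattern collapses to a one-liner; pass the signal straight to watch():

```typescript
// AbortSignal.timeout is a built-in shorthand for time-based cancellation:
//   for await (const { log } of stream.watch({ signal })) { ... }
const signal = AbortSignal.timeout(10_000);
console.log(signal.aborted); // false until the 10s deadline passes
```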

Backfill Then Watch

Combine backfill and watch for complete event history (here currentBlock is the chain head, fetched from your provider beforehand):
const stream = usdc.events.Transfer({});

// First: get historical events
for await (const { log } of stream.backfill({
  fromBlock: 0n,
  toBlock: currentBlock
})) {
  processEvent(log);
}

// Then: watch for new events
for await (const { log } of stream.watch({
  fromBlock: currentBlock
})) {
  processEvent(log);
}
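Because watch() resumes at currentBlock, events in that boundary block can be yielded by both phases. A minimal dedup sketch (helper name is illustrative) keyed on transactionHash plus logIndex, which together identify a log uniquely:

```typescript
// Sketch: drop duplicates across the backfill/watch boundary.
const seen = new Set<string>();

function isNewEvent(log: { transactionHash: string; logIndex: number }): boolean {
  const key = `${log.transactionHash}:${log.logIndex}`;
  if (seen.has(key)) return false;
  seen.add(key);
  return true;
}
```

Wrap processEvent in an `if (isNewEvent(log))` check in both loops; in long-running processes you would also want to evict keys for blocks older than the boundary.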

EventStreamResult Structure

Each yielded result contains the log and metadata. The args shape follows the event's ABI (Transfer shown):
type EventStreamResult = {
  log: {
    eventName: string;
    args: {
      from: AddressType;
      to: AddressType;
      value: bigint;
    };
    blockNumber: BlockNumberType;
    blockHash: HashType;
    transactionHash: TransactionHashType;
    logIndex: number;
  };
  metadata: {
    chainHead: bigint;      // Current chain head block number
    fromBlock: bigint;      // Chunk start block
    toBlock: bigint;        // Chunk end block
  };
};
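The per-chunk metadata is enough to report backfill progress without any extra RPC calls. A small sketch (the helper name is illustrative):

```typescript
// Sketch: derive a 0-100 progress figure from the chunk metadata yielded
// alongside each log. metadata.toBlock is the end of the current chunk.
function backfillProgress(
  start: bigint,
  end: bigint,
  metadata: { toBlock: bigint }
): number {
  const done = metadata.toBlock - start;
  const total = end - start;
  // BigInt division truncates, so scale by 100 before dividing.
  return Number((done * 100n) / total);
}
```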

Standalone EventStream

EventStream is a library primitive you can use directly:
import { EventStream } from './EventStream.js';

const stream = EventStream({
  provider,
  address: '0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48',
  event: {
    type: 'event',
    name: 'Transfer',
    inputs: [
      { type: 'address', name: 'from', indexed: true },
      { type: 'address', name: 'to', indexed: true },
      { type: 'uint256', name: 'value', indexed: false }
    ]
  },
  filter: { from: userAddress }
});

Retry Configuration

Configure retry behavior for transient errors:
for await (const { log } of stream.backfill({
  fromBlock: 0n,
  toBlock: 1000000n,
  retry: {
    maxRetries: 5,         // Max retry attempts (default: 3)
    initialDelay: 1000,    // Initial delay ms (default: 1000)
    maxDelay: 30000        // Max delay ms (default: 30000)
  }
})) {
  processEvent(log);
}
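The initialDelay/maxDelay pair suggests an exponential backoff schedule. Assuming the delay doubles per attempt and is capped at maxDelay (the library's exact schedule may differ), the waits work out as:

```typescript
// Hypothetical backoff schedule consistent with the retry options above:
// delay doubles on each attempt, capped at maxDelay.
function retryDelay(attempt: number, initialDelay = 1000, maxDelay = 30000): number {
  return Math.min(initialDelay * 2 ** attempt, maxDelay);
}
```

With the defaults, retries wait 1s, 2s, 4s, 8s, ... until the 30s cap.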

Error Handling

import { BlockRangeTooLargeError, EventStreamAbortedError } from './EventStream.js';

try {
  for await (const { log } of stream.backfill({ fromBlock: 0n, toBlock: 1000000n })) {
    console.log(log);
  }
} catch (error) {
  if (error instanceof EventStreamAbortedError) {
    console.log('Stream was cancelled');
  } else if (error instanceof BlockRangeTooLargeError) {
    console.log('Block range exceeded limit');
  } else {
    console.error('Unexpected error:', error);
  }
}