Custom Serialization

Make class instances serializable across workflow boundaries using the WORKFLOW_SERIALIZE and WORKFLOW_DESERIALIZE symbol protocol.

This is an advanced guide. It dives into workflow internals and is not required reading to use workflow.

The Problem

Workflow functions run inside a sandboxed VM. Every value that crosses a function boundary — step arguments, step return values, workflow inputs — must be serializable. Plain objects, strings, numbers, and many built-in types (Date, Map, Set, RegExp, etc.) work automatically, but class instances that don't implement the custom class serialization protocol will throw a serialization error.

class StorageClient {
  constructor(private region: string) {}

  async upload(key: string, body: Uint8Array) {
    // ... uses this.region internally
  }
}

export async function processFile(client: StorageClient, data: Uint8Array) {
  "use workflow";

  // client fails to serialize — StorageClient doesn't implement custom class serialization
  // The runtime throws a serialization error
  await uploadStep(client, "output.json", data);
}

Custom class serialization solves this by teaching the runtime how to convert your class instances to plain data and back.

The WORKFLOW_SERIALIZE / WORKFLOW_DESERIALIZE Protocol

The @workflow/serde package exports two symbols that act as a custom class serialization protocol. When the workflow runtime encounters a class instance with these symbols, it knows how to convert it to plain data and back.

import { WORKFLOW_SERIALIZE, WORKFLOW_DESERIALIZE } from "@workflow/serde";

class Point {
  constructor(public x: number, public y: number) {}

  distanceTo(other: Point): number {
    return Math.sqrt((this.x - other.x) ** 2 + (this.y - other.y) ** 2);
  }

  static [WORKFLOW_SERIALIZE](instance: Point) { 
    return { x: instance.x, y: instance.y };
  }

  static [WORKFLOW_DESERIALIZE](data: { x: number; y: number }) { 
    return new Point(data.x, data.y);
  }
}

Both methods must be static. WORKFLOW_SERIALIZE receives an instance and returns plain serializable data. WORKFLOW_DESERIALIZE receives that same data and reconstructs a new instance.

Both serialization methods run inside the workflow VM. They must not use Node.js APIs, non-deterministic operations, or network calls. Keep them focused on extracting and reconstructing data.
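To see how the two methods compose, here is a self-contained round-trip sketch. It defines local stand-ins for the two symbols (in real code you would import them from @workflow/serde) and performs by hand the conversion the runtime does at a boundary:

```typescript
// Stand-ins for the symbols exported by @workflow/serde, so this
// sketch runs on its own. In real code, import them instead.
const WORKFLOW_SERIALIZE = Symbol.for("workflow-serialize");
const WORKFLOW_DESERIALIZE = Symbol.for("workflow-deserialize");

class Point {
  x: number;
  y: number;

  constructor(x: number, y: number) {
    this.x = x;
    this.y = y;
  }

  distanceTo(other: Point): number {
    return Math.sqrt((this.x - other.x) ** 2 + (this.y - other.y) ** 2);
  }

  static [WORKFLOW_SERIALIZE](instance: Point) {
    return { x: instance.x, y: instance.y };
  }

  static [WORKFLOW_DESERIALIZE](data: { x: number; y: number }) {
    return new Point(data.x, data.y);
  }
}

// Round trip: instance -> plain data -> new instance
const original = new Point(3, 4);
const plain = Point[WORKFLOW_SERIALIZE](original);
const restored = Point[WORKFLOW_DESERIALIZE](plain);

console.log(restored instanceof Point); // true
console.log(restored.distanceTo(original)); // 0
```

Note that the restored value is a genuine Point with working methods, not a plain object — the data crossed the boundary, and the class identity was reattached on the other side.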

Automatic Class Registration

For the runtime to deserialize a class, the class must be registered in a global registry with a stable classId. The SWC compiler plugin handles this automatically — when it detects a class with both WORKFLOW_SERIALIZE and WORKFLOW_DESERIALIZE static methods, it generates registration code at build time.

This means you only need to implement the two symbol methods. At build time, the SWC plugin detects the serialization symbols, assigns a deterministic classId based on the file path and class name, and registers the class in the global Symbol.for("workflow-class-registry") registry. No manual registration is required for classes defined in your workflow files.
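As a rough mental model, the generated registration amounts to something like the following sketch. The classId format ("file path:class name") and the registry's value shape shown here are illustrative assumptions, not the compiler's actual output:

```typescript
// Hypothetical sketch of compiler-generated registration code.
// The classId format and registry value shape are assumptions
// for illustration only.
const WORKFLOW_DESERIALIZE = Symbol.for("workflow-deserialize");

class Point {
  x: number;
  y: number;

  constructor(x: number, y: number) {
    this.x = x;
    this.y = y;
  }

  static [WORKFLOW_DESERIALIZE](data: { x: number; y: number }) {
    return new Point(data.x, data.y);
  }
}

// One global registry shared across the build, keyed by classId.
const g = globalThis as any;
const registry: Map<string, unknown> = (g[
  Symbol.for("workflow-class-registry")
] ??= new Map());

// Deterministic classId derived from file path + class name.
registry.set("src/point.ts:Point", Point);
```

The deterministic classId is what lets the runtime look up the right constructor when it encounters serialized data for that class on a later replay.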

Full Example: A Workflow-Safe Storage Client

Here's a complete example of a storage client class that survives serialization across workflow boundaries. This pattern is useful when you need an object with methods to be passed as a workflow input or returned from a step.

import { WORKFLOW_SERIALIZE, WORKFLOW_DESERIALIZE } from "@workflow/serde";

interface StorageClientOptions {
  region: string;
  bucket: string;
  accessKeyId?: string;
  secretAccessKey?: string;
}

export class WorkflowStorageClient {
  private readonly region: string;
  private readonly bucket: string;
  private readonly accessKeyId?: string;
  private readonly secretAccessKey?: string;

  constructor(options: StorageClientOptions) {
    this.region = options.region;
    this.bucket = options.bucket;
    this.accessKeyId = options.accessKeyId;
    this.secretAccessKey = options.secretAccessKey;
  }

  async upload(key: string, body: Uint8Array) {
    "use step";
    const { S3Client, PutObjectCommand } = await import("@aws-sdk/client-s3");
    const client = new S3Client({
      region: this.region,
      credentials: this.accessKeyId
        ? { accessKeyId: this.accessKeyId, secretAccessKey: this.secretAccessKey! }
        : undefined,
    });
    await client.send(
      new PutObjectCommand({ Bucket: this.bucket, Key: key, Body: body })
    );
  }

  async getSignedUrl(key: string): Promise<string> {
    "use step";
    const { S3Client, GetObjectCommand } = await import("@aws-sdk/client-s3");
    const { getSignedUrl } = await import("@aws-sdk/s3-request-presigner");
    const client = new S3Client({ region: this.region });
    return getSignedUrl(client, new GetObjectCommand({ Bucket: this.bucket, Key: key }));
  }

  // --- Serialization protocol ---

  static [WORKFLOW_SERIALIZE](instance: WorkflowStorageClient): StorageClientOptions { 
    return {
      region: instance.region,
      bucket: instance.bucket,
      accessKeyId: instance.accessKeyId,
      secretAccessKey: instance.secretAccessKey,
    };
  }

  static [WORKFLOW_DESERIALIZE]( 
    data: StorageClientOptions
  ): WorkflowStorageClient {
    return new WorkflowStorageClient(data);
  }
}

Now this client can be passed into a workflow and used directly:

import { WorkflowStorageClient } from "./storage-client";

export async function processUpload(
  client: WorkflowStorageClient,
  data: Uint8Array
) {
  "use workflow";

  // client is a real WorkflowStorageClient with working methods
  await client.upload("output/result.json", data); 
  const url = await client.getSignedUrl("output/result.json"); 
  return { url };
}

Key APIs