How do you store snapshots in an Amazon S3 bucket?

Summary

The default Recorder class stores objects in memory. You can implement a custom Recorder class that writes snapshots to an S3 bucket instead, so they persist outside the running process.

In this tutorial, you'll learn how to implement a custom Recorder that writes snapshots to and reads them from an S3 bucket to enable time-travel debugging.

Prerequisites

We recommend you complete the Getting Started section before implementing a custom Recorder class.

Step 1: Implementing required methods

A Recorder class consists of four methods, of which three are optional.

interface Recorder {
  /**
   * Records a snapshot. This function is called by AppEngine after EVERY state dispatch.
   * It is up to you to decide where to store this Snapshot, if at all.
   */
  afterDispatch(snapshot: Snapshot): Promise<void>;

  /** Returns all recorded snapshots. */
  getSnapshots?(): Promise<Snapshot[]>;

  /**
   * Omit this function to disable explicit snapshot creation.
   * It is up to you to decide where to store this Snapshot.
   */
  createSnapshot?(snapshot: Snapshot): Promise<void>;

  /** Loads a snapshot. Omit this function if you do not want persistence. */
  loadSnapshot?(id?: string): Promise<Snapshot | null>;
}
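
To make the contract concrete, here is a minimal sketch of an in-memory implementation of this interface, similar in spirit to the default Recorder described above. The file name, class name, and snapshot shape (an action plus the resulting state, as used later in this tutorial) are illustrative, not part of the library.

inMemoryRecorder.ts
// Sketch: an in-memory Recorder that keeps every snapshot in an array.
// Snapshots are assumed to be plain objects with `action` and `state` fields.
export class InMemoryRecorder {
  private snapshots: any[] = [];

  // Called after every dispatch; skip the empty initial state.
  async afterDispatch(snapshot: any): Promise<void> {
    if (!snapshot.action) {
      return;
    }
    this.snapshots.push(snapshot);
  }

  // Return everything recorded so far.
  async getSnapshots(): Promise<any[]> {
    return this.snapshots;
  }

  // Load the most recent snapshot, or null if nothing was recorded yet.
  async loadSnapshot(): Promise<any | null> {
    return this.snapshots[this.snapshots.length - 1] ?? null;
  }
}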

Below is the skeleton of the class you'll build in this example. Note that the constructor expects a file name for the snapshot you want to store and an S3 bucket name.

customRecorder.ts
import { S3Download, S3Upload } from './s3Helper';

export class S3Recorder {
  snapshot: any = { action: null, state: null };
  file: string = '';
  bucket: string = '';
  timer: any;

  constructor(file: string, bucket: string) {
    this.file = file;
    this.bucket = bucket;
  }

  async afterDispatch(snapshot: any) {
  }

  async createSnapshot(snapshot: any) {
  }

  async loadSnapshot() {
  }
}

Next, let's take a quick look at the S3 helper functions. You'll use the S3Download and S3Upload functions to implement the skeleton methods for the S3Recorder. The other functions exposed by this S3 helper file can be used to build more advanced S3 recording functionality (see the sketch after the helper file).

Make sure to set the AWS_ENDPOINT, AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and REGION environment variables to your bucket and access point details.
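
During local development you might keep these values in a .env file. The sketch below assumes the dotenv package, which is not required by the helper; if you use it, import it at the very top of your entry file, before anything that imports s3Helper.ts, because the S3 client is created at module load time.

loadEnv.ts
// Sketch (assumes the `dotenv` package): load credentials from a local .env
// file and fail fast if a required variable is missing. The variable names
// match the ones read by s3Helper.ts below; AWS_ENDPOINT stays optional.
import 'dotenv/config';

for (const name of ['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY', 'REGION']) {
  if (!process.env[name]) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
}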

s3Helper.ts
import {
  CreateBucketCommand,
  DeleteObjectCommand,
  GetObjectCommand,
  HeadBucketCommand,
  HeadObjectCommand,
  PutObjectCommand,
  S3Client,
} from '@aws-sdk/client-s3';

export const s3 = new S3Client({
  endpoint: process.env.AWS_ENDPOINT, // Only needed for a VPC access point
  region: process.env.REGION,
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID as string,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY as string,
  },
});

export async function S3CreateBucketIfNotExists(Bucket: string) {
  try {
    await s3.send(
      new CreateBucketCommand({
        Bucket: Bucket,
      }),
    );
    return true;
  } catch (err: any) {
    if (err.name === 'BucketAlreadyOwnedByYou') {
      return true;
    }
    if (err.name === 'BucketAlreadyExists') {
      return true;
    }
    throw err;
  }
}

export async function S3BucketExists(Bucket: string) {
  const options = {
    Bucket: Bucket,
  };
  try {
    await s3.send(new HeadBucketCommand(options));
    return true;
  } catch (error: any) {
    if (error.$metadata?.httpStatusCode === 404) {
      return false;
    }
    throw error;
  }
}

export async function S3Upload(Bucket: string, Key: string, Body: any): Promise<void> {
  await s3.send(
    new PutObjectCommand({
      Bucket: Bucket,
      Key: Key,
      Body: Body,
    }),
  );
}

export async function S3Download(Bucket: string, Key: string) {
  return s3.send(
    new GetObjectCommand({
      Bucket: Bucket,
      Key: Key,
    }),
  );
}

export async function S3Delete(Bucket: string, Key: string): Promise<any> {
  return s3.send(
    new DeleteObjectCommand({
      Bucket: Bucket,
      Key: Key,
    }),
  );
}

export async function S3Exists(Bucket: string, Key: string): Promise<boolean> {
  try {
    await s3.send(
      new HeadObjectCommand({
        Bucket: Bucket,
        Key: Key,
      }),
    );
    return true;
  } catch (err: any) {
    if (err.name === 'NotFound') {
      return false;
    }
    throw err;
  }
}
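
For example, the remaining helpers can be used to prepare the bucket before the recorder writes its first snapshot and to check whether an earlier snapshot is already stored. The sketch below is illustrative; the file and function names are placeholders and not part of the recorder you'll build in this tutorial.

setup.ts
// Sketch: prepare snapshot storage using the helpers defined above.
import { S3CreateBucketIfNotExists, S3Exists } from './s3Helper';

export async function prepareSnapshotStorage(bucket: string, file: string): Promise<boolean> {
  // Create the bucket if it does not exist yet (a no-op if you already own it).
  await S3CreateBucketIfNotExists(bucket);
  // Report whether a previous snapshot file is already present in the bucket.
  return S3Exists(bucket, file);
}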

Step 2: Adding the afterDispatch() logic

The afterDispatch function is triggered each time the MintblueReader dispatches an Action object. This allows you to create snapshots and store them to enable time-travel debugging.

The first snapshot it receives contains the initial state, which is empty; in that case you do nothing and return. The remaining lines keep the latest snapshot in memory and schedule a call to the createSnapshot function to store it in S3.

customRecorder.ts
  async afterDispatch(snapshot: any) {
    if (!snapshot.action) {
      // this is the first invocation, contains the empty initial state
      return;
    }
    this.snapshot = snapshot;

    // Save a permanent snapshot after 5 seconds of inactivity
    if (this.timer) {
      clearTimeout(this.timer);
    }
    this.timer = setTimeout(() => {
      this.createSnapshot(snapshot);
    }, 5000);
  }

We recommend uploading a snapshot only after five seconds of inactivity. Otherwise, while the Recorder synchronizes with the application state, you could push a new snapshot to your bucket for each of potentially thousands of entries.
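
Because the upload is deferred, a snapshot still waiting on the five-second timer is lost if the process exits first. If that matters for your application, you could add an explicit flush, for example as a small extension of the recorder. The sketch below is optional and not part of the class built in this tutorial.

flushableRecorder.ts
// Sketch: a variant of S3Recorder with an explicit flush() method that
// uploads the pending snapshot immediately instead of waiting for the timer.
import { S3Recorder } from './customRecorder.js';

export class FlushableS3Recorder extends S3Recorder {
  async flush(): Promise<void> {
    // Cancel the pending inactivity timer, if any.
    if (this.timer) {
      clearTimeout(this.timer);
      this.timer = undefined;
    }
    // Upload the last snapshot received, unless it is the empty initial state.
    if (this.snapshot.action) {
      await this.createSnapshot(this.snapshot);
    }
  }
}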

Step 3: Implementing createSnapshot()

Next, you want to upload a snapshot to your S3 bucket each time the createSnapshot() function is called.

The snapshot is stringified to JSON and uploaded with the S3Upload helper function, using the bucket and file name you passed to the constructor.

customRecorder.ts
  async createSnapshot(snapshot: any) {
    console.log('CREATING SNAPSHOT', snapshot.action.id);
    const data = JSON.stringify(snapshot, null, 2);
    await S3Upload(this.bucket, this.file, new TextEncoder().encode(data));
  }

Step 4: Implementing loadSnapshot()

After you've created a snapshot, you need a way to load it back to enable time-travel debugging. The loadSnapshot() function calls the S3Download helper to retrieve your snapshot file.

customRecorder.ts
  async loadSnapshot() {
    try {
      const res = await S3Download(this.bucket, this.file);
      if (res.Body) {
        const snap = JSON.parse(await res.Body.transformToString());
        return snap;
      } else {
        return null;
      }
    } catch (err) {
      console.error('Error reading the snapshot file:', err);
      process.exit(1);
    }
  }

How do you use a custom Recorder?

To use your Recorder, import it and create a new instance of the S3Recorder class. In the example below, we use the file name forest.json for our snapshots and an S3 bucket named mintblue.

main.ts
import { S3Recorder } from './customRecorder.js';

const recorder = new S3Recorder('forest.json', 'mintblue');

const machine = new Machine(
  Forest,
  await MintblueReader.create(sdkToken, project_id),
  await CustomMintblueWriter.create(sdkToken, project_id),
  recorder, // Custom recorder
);

When you interact with the machine, for example by planting trees and starting it (as covered in Getting Started), the afterDispatch function is triggered and the snapshot is stored in your S3 bucket.
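
To verify that the upload worked, you can also read the stored snapshot back through the same recorder instance, for example by appending the following to main.ts:

main.ts
// Sketch: read the last persisted snapshot back from S3 to verify the upload.
const stored = await recorder.loadSnapshot();
if (stored) {
  console.log('Last persisted action:', stored.action.id);
} else {
  console.log('No snapshot has been stored yet.');
}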

Code Check ✅

import { S3Download, S3Upload } from './s3Helper';

export class S3Recorder {
  snapshot: any = { action: null, state: null };
  file: string = '';
  bucket: string = '';
  timer: any;

  constructor(file: string, bucket: string) {
    this.file = file;
    this.bucket = bucket;
  }

  async afterDispatch(snapshot: any) {
    if (!snapshot.action) {
      // this is the first invocation, contains the empty initial state
      return;
    }
    this.snapshot = snapshot;

    // Save a permanent snapshot after 5 seconds of inactivity
    if (this.timer) {
      clearTimeout(this.timer);
    }
    this.timer = setTimeout(() => {
      this.createSnapshot(snapshot);
    }, 5000);
  }

  async getSnapshot() {
    return this.snapshot;
  }

  async createSnapshot(snapshot: any) {
    console.log('CREATING SNAPSHOT', snapshot.action.id);
    const data = JSON.stringify(snapshot, null, 2);
    await S3Upload(this.bucket, this.file, new TextEncoder().encode(data));
  }

  async loadSnapshot() {
    try {
      const res = await S3Download(this.bucket, this.file);
      if (res.Body) {
        const snap = JSON.parse(await res.Body.transformToString());
        return snap;
      } else {
        return null;
      }
    } catch (err) {
      console.error('Error reading the snapshot file:', err);
      process.exit(1);
    }
  }
}