Version: Firesquid

Create an EVM-processing Squid

Objective

This tutorial starts from the Squid EVM template and goes through all the steps necessary to customize the project: interacting with a different Squid Archive, synchronized with a different blockchain, and processing data from two different contracts (AstarDegens and AstarCats) instead of the one used in the template.

The business logic to process these contracts is deliberately basic: the tutorial aims to show a simple case, highlighting the changes a developer would typically apply to the template, without unnecessary complexity.

The blockchain used in this example is the Astar network, and the final objective is to show the tokens that are part of these smart contracts, who owns them, and every time they have been transferred.

Pre-requisites

The minimum requirements for this tutorial are basic knowledge of software development (handling a Git repository, a correctly set up development environment, basic command-line usage) and the concepts explained in this documentation.

Fork the template

The first thing to do, although it might sound trivial to GitHub experts, is to fork the repository into your own GitHub account, by visiting the repository page and clicking the Fork button:

How to fork a repository on GitHub

Next, clone the forked repository (be sure to replace <account> with your own account):

git clone git@github.com:<account>/squid-evm-template.git

For reference, you can find the complete project here.

Run the project

Next, follow the Quickstart to get the project up and running. Here's the list of commands to run in quick succession:

npm ci
npm run build
docker compose up -d
npx squid-typeorm-migration apply
node -r dotenv/config lib/processor.js
# open a separate terminal for this next command
npx squid-graphql-server

Bear in mind this step is not strictly necessary, but it is always useful to check that everything is in order. If you prefer to skip it, you should at least get the Postgres container running with docker compose up -d.

Define Entity Schema

The next step in customizing the project for our purpose is to make changes to the schema and define the Entities we want to keep track of.

Luckily, the EVM template already contains a schema that defines the exact entities we need for the purpose of this guide. For this reason, no changes are necessary, but it's still useful to explain what is going on.

To index ERC-721 token transfers, we will need to track:

  • Token transfers
  • Ownership of tokens
  • Contracts and their minted tokens

And the schema.graphql file defines them like this:

type Token @entity {
  id: ID!
  owner: Owner
  uri: String
  transfers: [Transfer!]! @derivedFrom(field: "token")
  contract: Contract
}

type Owner @entity {
  id: ID!
  ownedTokens: [Token!]! @derivedFrom(field: "owner")
  balance: BigInt
}

type Contract @entity {
  id: ID!
  name: String
  symbol: String
  totalSupply: BigInt
  mintedTokens: [Token!]! @derivedFrom(field: "contract")
}

type Transfer @entity {
  id: ID!
  token: Token!
  from: Owner
  to: Owner
  timestamp: BigInt!
  block: Int!
  transactionHash: String!
}

It's worth noting a couple of things in this schema definition:

  • @entity - signals that this type will be translated into an ORM model that is going to be persisted in the database
  • @derivedFrom - signals that the field will not be persisted in the database; it will be derived instead
  • type references (i.e. from: Owner) - establishes a relation between two entities

The template already has automatically generated TypeScript classes for this schema definition. They can be found under src/model/generated.

Whenever changes are made to the schema, new TypeScript entity classes have to be generated, and to do that you'll have to run the codegen tool:

npx squid-typeorm-codegen
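For reference, the generated classes follow a constructor-with-partial pattern. Here is a minimal sketch (illustrative only: the real files under src/model/generated also carry TypeORM decorators, which are omitted here for brevity):

```typescript
// Simplified sketch of a generated entity class. The actual generated
// code also includes TypeORM decorators mapping fields to columns.
class Owner {
  id!: string;
  balance?: bigint;

  constructor(props?: Partial<Owner>) {
    // Generated constructors accept a partial object and copy its fields
    Object.assign(this, props);
  }
}

// Entities are instantiated by passing fields as a single object:
const owner = new Owner({ id: "0xabc", balance: 0n });
```

This pattern is what allows handler code later in this tutorial to write `new Owner({ id: transferData.from, balance: 0n })`.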

ABI Definition and Wrapper

Subsquid offers support for automatically building TypeScript type-safe interfaces for Substrate data sources (events, extrinsics, storage items). Runtime changes are detected automatically.

This functionality has been extended to EVM indexing, with the release of an evm-typegen tool to generate TypeScript interfaces and decoding functions for EVM logs.

Once again, the template repository already includes interfaces for ERC-721 contracts, which is the subject of this guide. But it is still important to explain what needs to be done, in case, for example, one wants to index a different type of contract.

First, it is necessary to obtain the definition of the contract's Application Binary Interface (ABI). This can be obtained as a JSON file, which will be imported into the project.

  1. It is advisable to copy the JSON file into the src/abi subfolder.
  2. To automatically generate TypeScript interfaces from the ABI definition, and to decode event data, run this command from the project's root folder:
npx squid-evm-typegen --abi src/abi/ERC721.json --output src/abi/erc721.ts

The abi parameter points at the JSON file previously created, and the output parameter sets the name of the file that the command will generate.

This command generates a TypeScript file named erc721.ts under the src/abi subfolder. It defines data interfaces representing the output of the EVM events defined in the ABI, as well as a mapping of the functions necessary to decode these events (see the events dictionary in the generated file).
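To give an idea of its shape, the generated file roughly follows the pattern sketched below (illustrative only: the real file covers every event in the ABI and delegates decoding to ethers; the topic shown is the well-known keccak-256 hash of the Transfer signature):

```typescript
// Rough sketch of the structure of the generated src/abi/erc721.ts.
// The decode body is stubbed; the generated version uses ethers' ABI coder.
interface TransferEventArgs {
  from: string;
  to: string;
  tokenId: bigint;
}

const events = {
  "Transfer(address,address,uint256)": {
    // keccak256("Transfer(address,address,uint256)")
    topic:
      "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
    decode(evmLog: { topics: string[]; data: string }): TransferEventArgs {
      throw new Error("stub: the generated code decodes via ethers");
    },
  },
};

// Handlers look events up by signature and use the topic for filtering:
const transferTopic = events["Transfer(address,address,uint256)"].topic;
```

This is why, later in this tutorial, the processor configuration can filter logs with `erc721.events["Transfer(address,address,uint256)"].topic`.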

info

The ERC-721 ABI defines the signatures of all events in the contract. The Transfer event has three arguments, named: from, to, and tokenId. Their types are, respectively, address, address, and uint256. As such, the actual definition of the Transfer event looks like this: Transfer(address,address,uint256).

Define and Bind Event Handler(s)

The Subsquid SDK provides users with a processor class, named SubstrateProcessor or, in this specific case, SubstrateBatchProcessor. The processor connects to the Subsquid Archive to get chain data. It will index from the configured starting block until the configured end block, or keep indexing as new data is added to the chain.

The processor exposes methods to "attach" functions that will "handle" specific data such as Substrate events, extrinsics, storage items, or EVM logs. These methods can be configured by specifying the event or extrinsic name, or the EVM log contract address, for example. As the processor loops over the data, when it encounters one of the configured event names, it will execute the logic in the "handler" function.

Managing the EVM contract

It is worth pointing out that some important auxiliary code, such as constants and helper functions to manage the EVM contract, is defined in the src/contract.ts file. Here's a summary of what is in it:

  • Define the chain node endpoint (optional but useful)
  • Create a contract interface(s) to store information such as the address and ABI
  • Define functions to fetch or create contract entities from the database and the contract URI from the ethers instance
  • Define a couple of functions to prevent the connection created by the ethers instance from stalling

In order to adapt the template to the scope of this guide, we need to apply a couple of changes:

  1. edit the CHAIN_NODE constant to the endpoint URL of Astar network (e.g. wss://astar.api.onfinality.io/public-ws)
  2. define a map that relates the contract address with the contract model and the Contract instance defined by the ethers library
  3. edit the hexadecimal address used to create the contract constant (we are going to use this token for the purpose of this guide)
  4. change the name, symbol and totalSupply values used in the createContractEntity function to their correct values (see link in the previous point)
  5. create a second ethers Contract for the second ERC721 token contract we want to index and a second entry in the map

In case someone wants to index an EVM event other than Transfer, they would also have to implement a handler function different from processTransfer, especially the line where the "Transfer(address,address,uint256)" event is decoded.

// src/contract.ts
import { Store } from "@subsquid/typeorm-store";
import { ethers } from "ethers";
import * as erc721 from "./abi/erc721";
import { Contract } from "./model";

export const CHAIN_NODE = "wss://astar.public.blastapi.io";

interface ContractInfo {
  ethersContract: ethers.Contract;
  contractModel: Contract;
}

export const contractMapping: Map<string, ContractInfo> = new Map<
  string,
  ContractInfo
>();

export const astarDegenscontract = new ethers.Contract(
  "0xd59fC6Bfd9732AB19b03664a45dC29B8421BDA9a".toLowerCase(),
  erc721.abi,
  new ethers.providers.WebSocketProvider(CHAIN_NODE)
);

contractMapping.set(astarDegenscontract.address, {
  ethersContract: astarDegenscontract,
  contractModel: {
    id: astarDegenscontract.address,
    name: "AstarDegens",
    symbol: "DEGEN",
    totalSupply: 10000n,
    mintedTokens: [],
  },
});

export const astarCatsContract = new ethers.Contract(
  "0x8b5d62f396Ca3C6cF19803234685e693733f9779".toLowerCase(),
  erc721.abi,
  new ethers.providers.WebSocketProvider(CHAIN_NODE)
);

contractMapping.set(astarCatsContract.address, {
  ethersContract: astarCatsContract,
  contractModel: {
    id: astarCatsContract.address,
    name: "AstarCats",
    symbol: "CAT",
    totalSupply: 7777n,
    mintedTokens: [],
  },
});

export function createContractEntity(address: string): Contract {
  return new Contract(contractMapping.get(address)?.contractModel);
}

const contractAddresstoModel: Map<string, Contract> = new Map<
  string,
  Contract
>();

export async function getContractEntity(
  store: Store,
  address: string
): Promise<Contract | undefined> {
  if (contractAddresstoModel.get(address) == null) {
    let contractEntity = await store.get(Contract, address);
    if (contractEntity == null) {
      contractEntity = createContractEntity(address);
      await store.insert(contractEntity);
    }
    // cache the entity whether it was fetched or freshly created,
    // otherwise entities found in the store would never be returned
    contractAddresstoModel.set(address, contractEntity);
  }

  return contractAddresstoModel.get(address);
}

export async function getTokenURI(
  tokenId: string,
  address: string
): Promise<string> {
  return retry(async () =>
    timeout(contractMapping.get(address)?.ethersContract?.tokenURI(tokenId))
  );
}

async function timeout<T>(res: Promise<T>, seconds = 30): Promise<T> {
  return new Promise((resolve, reject) => {
    let timer: any = setTimeout(() => {
      timer = undefined;
      reject(new Error(`Request timed out in ${seconds} seconds`));
    }, seconds * 1000);

    res
      .finally(() => {
        if (timer != null) {
          clearTimeout(timer);
        }
      })
      .then(resolve, reject);
  });
}

async function retry<T>(promiseFn: () => Promise<T>, attempts = 3): Promise<T> {
  for (let i = 0; i < attempts; i += 1) {
    try {
      return await promiseFn();
    } catch (err) {
      console.log(err);
    }
  }
  throw new Error(`Error after ${attempts} attempts`);
}

Configure Processor and Attach Handler

The src/processor.ts file is where the template project instantiates the SubstrateBatchProcessor class, configures it for execution, and attaches the handler functions.

Luckily for us, most of the job is already done, but we still need to adapt the code to handle two contracts instead of only one. We need to change the addEvmLog function call to use the appropriate contract address for AstarDegens, and add a second call for AstarCats.

Furthermore, we need to adapt the logic to save the Token to avoid clashing.

info

It is also important to note that, since the template was built for the moonriver network, it is necessary to change the archive parameter of the setDataSource function to fetch the Archive URL for Astar. The lookupArchive function is used to consult the archive registry and yield the archive address, given a network name. Network names should be in lowercase.

Look at this code snippet for the end result:

// src/processor.ts
import { lookupArchive } from "@subsquid/archive-registry";
import { Store, TypeormDatabase } from "@subsquid/typeorm-store";
import {
  BatchContext,
  BatchProcessorItem,
  EvmLogEvent,
  SubstrateBatchProcessor,
  SubstrateBlock,
} from "@subsquid/substrate-processor";
import { In } from "typeorm";
import {
  CHAIN_NODE,
  astarDegenscontract,
  getContractEntity,
  getTokenURI,
  astarCatsContract,
  contractMapping,
} from "./contract";
import { Owner, Token, Transfer } from "./model";
import * as erc721 from "./abi/erc721";

const database = new TypeormDatabase();
const processor = new SubstrateBatchProcessor()
  .setBatchSize(500)
  .setBlockRange({ from: 442693 })
  .setDataSource({
    chain: CHAIN_NODE,
    archive: lookupArchive("astar", { release: "FireSquid" }),
  })
  .setTypesBundle("astar")
  .addEvmLog(astarDegenscontract.address, {
    range: { from: 442693 },
    filter: [erc721.events["Transfer(address,address,uint256)"].topic],
  })
  .addEvmLog(astarCatsContract.address, {
    range: { from: 800854 },
    filter: [erc721.events["Transfer(address,address,uint256)"].topic],
  });

type Item = BatchProcessorItem<typeof processor>;
type Context = BatchContext<Store, Item>;

processor.run(database, async (ctx) => {
  const transfersData: TransferData[] = [];

  for (const block of ctx.blocks) {
    for (const item of block.items) {
      if (item.name === "EVM.Log") {
        const transfer = handleTransfer(block.header, item.event);
        transfersData.push(transfer);
      }
    }
  }

  await saveTransfers(ctx, transfersData);
});

type TransferData = {
  id: string;
  from: string;
  to: string;
  token: string;
  timestamp: bigint;
  block: number;
  transactionHash: string;
  contractAddress: string;
};

function handleTransfer(
  block: SubstrateBlock,
  event: EvmLogEvent
): TransferData {
  const { from, to, tokenId } = erc721.events[
    "Transfer(address,address,uint256)"
  ].decode(event.args);

  const transfer: TransferData = {
    id: event.id,
    token: tokenId.toString(),
    from,
    to,
    timestamp: BigInt(block.timestamp),
    block: block.height,
    transactionHash: event.evmTxHash,
    contractAddress: event.args.address,
  };

  return transfer;
}

async function saveTransfers(ctx: Context, transfersData: TransferData[]) {
  const tokensIds: Set<string> = new Set();
  const ownersIds: Set<string> = new Set();

  for (const transferData of transfersData) {
    tokensIds.add(transferData.token);
    ownersIds.add(transferData.from);
    ownersIds.add(transferData.to);
  }

  const transfers: Set<Transfer> = new Set();

  const tokens: Map<string, Token> = new Map(
    (await ctx.store.findBy(Token, { id: In([...tokensIds]) })).map((token) => [
      token.id,
      token,
    ])
  );

  const owners: Map<string, Owner> = new Map(
    (await ctx.store.findBy(Owner, { id: In([...ownersIds]) })).map((owner) => [
      owner.id,
      owner,
    ])
  );

  for (const transferData of transfersData) {
    let from = owners.get(transferData.from);
    if (from == null) {
      from = new Owner({ id: transferData.from, balance: 0n });
      owners.set(from.id, from);
    }

    let to = owners.get(transferData.to);
    if (to == null) {
      to = new Owner({ id: transferData.to, balance: 0n });
      owners.set(to.id, to);
    }

    // compose the token ID from the contract symbol and the numeric ID
    const tokenId = `${
      contractMapping.get(transferData.contractAddress)?.contractModel.symbol ||
      ""
    }-${transferData.token}`;

    let token = tokens.get(tokenId);
    if (token == null) {
      token = new Token({
        id: tokenId,
        uri: await getTokenURI(transferData.token, transferData.contractAddress),
        contract: await getContractEntity(
          ctx.store,
          transferData.contractAddress
        ),
      });
      tokens.set(token.id, token);
    }
    token.owner = to;

    const { id, block, transactionHash, timestamp } = transferData;

    const transfer = new Transfer({
      id,
      block,
      timestamp,
      transactionHash,
      from,
      to,
      token,
    });

    transfers.add(transfer);
  }

  await ctx.store.save([...owners.values()]);
  await ctx.store.save([...tokens.values()]);
  await ctx.store.save([...transfers]);
}

info

Pay close attention to the line setting the id in the Token model, because this is how we avoid a clash between the two token collections. Both use cardinal numbers to identify their own tokens, but we are now adding them to the same table, so we need a way to identify them uniquely; in this case, we chose the contract symbol to do so.

Launch and Set Up the Database

When running the project locally, as it is the case for this guide, it is possible to use the docker-compose.yml file that comes with the template to launch a PostgreSQL container. To do so, run the following command in your terminal:

docker compose up -d

Launch database container

info

The -d parameter is optional; it launches the container in daemon mode, so the terminal will not be blocked and no further output will be visible.

Squid projects automatically manage the database connection and schema, via an ORM abstraction.

To set up the database, you can take the following steps:

  1. Build the code

    npm run build
  2. Make sure the Postgres Docker container, squid-evm-template_db_1, is running

    docker ps -a
  3. Apply the migration, so tables are created on the database

    npx squid-typeorm-migration apply

Launch the Project

To launch the processor (this will block the current terminal), you can run the following command:

node -r dotenv/config lib/processor.js

Launch processor

Finally, in a separate terminal window, launch the GraphQL server:

npx squid-graphql-server

Visit localhost:4350/graphql to access the GraphiQL console. From this window, you can perform queries such as this one, to find the account owners with the biggest balances:

query MyQuery {
  owners(limit: 10, where: {}, orderBy: balance_DESC) {
    balance
    id
  }
}

Or this other one, looking up the tokens owned by a given owner:

query MyQuery {
  tokens(where: { owner: { id_eq: "0x1210F3eA18Ef463c162FFF9084Cee5B6E5ccAb37" } }) {
    uri
    contract {
      id
      name
      symbol
      totalSupply
    }
  }
}

Have some fun playing around with queries, after all, it's a playground!