Create a simple Squid
This page explains how to take the Squid template and customize it to create a simple project.

Objective

This tutorial takes the Squid template and goes through all the necessary steps to customize the project so that it interacts with a different Squid Archive, synchronized with a different blockchain, and processes data from Events different from the ones in the template.
The business logic for processing these Events is deliberately basic: the goal of the tutorial is to show a simple case, highlighting the changes a developer would typically apply to the template while removing unnecessary complexity.
The blockchain used in this example is the Crust storage network, and the final objective is to observe which files have been added to and deleted from the chain, as well as the groups joined and storage orders placed by a given account.

Pre-requisites

The minimum requirements for following this tutorial are basic knowledge of software development, such as handling a Git repository, a correctly set up Development Environment, basic command line knowledge, and familiarity with the concepts explained in this documentation.

Fork the template

The first thing to do, although it might sound trivial to GitHub experts, is to fork the repository into your own GitHub account, by visiting the repository page and clicking the Fork button:
How to fork a repository on GitHub
Next, clone the newly created repository (be careful to replace <account> with your own account):
git clone git@github.com:<account>/squid-template.git
For reference on the complete work, you can find the entire project here.

Run the project

Next, follow the Quickstart to get the project up and running. Here's the list of commands to run in quick succession:
npm ci
npm run build
docker compose up -d
npx sqd db create
npx sqd db migrate
node -r dotenv/config lib/processor.js
# open a separate terminal for this next command
npx squid-graphql-server
Bear in mind that this is not strictly necessary, but it is always useful to check that everything is in order. If you want to skip it, you should at least get the Postgres container running with docker compose up -d.

Install new dependencies

For this specific project, we will need to install a new dependency, since the type definitions for the Crust blockchain are implemented in the @crustio/type-definitions package.
Open a command line console at the repository's root and install it.
npm i @crustio/type-definitions
Install Crust type definition

Define Entity Schema

The next thing to do, in order to customize the project for our purposes, is to make changes to the schema and define the Entities we want to keep track of.
We have established that we want to track:
  • files added and deleted from the chain
  • groups joined by a certain account
  • storage orders placed by a certain account
so we are going to make the following changes to our schema.graphql:
schema.graphql
type Account @entity {
  id: ID! # Account address
  workReports: [WorkReport] @derivedFrom(field: "account")
  joinGroups: [JoinGroup] @derivedFrom(field: "member")
  storageOrders: [StorageOrder] @derivedFrom(field: "account")
}

type WorkReport @entity {
  id: ID! # Event id
  account: Account!
  addedFiles: [[String]]
  deletedFiles: [[String]]
  extrinsicId: String
  createdAt: DateTime!
  blockHash: String!
  blockNum: Int!
}

type JoinGroup @entity {
  id: ID!
  member: Account!
  owner: String!
  extrinsicId: String
  createdAt: DateTime!
  blockHash: String!
  blockNum: Int!
}

type StorageOrder @entity {
  id: ID!
  account: Account!
  fileCid: String!
  extrinsicId: String
  createdAt: DateTime!
  blockHash: String!
  blockNum: Int!
}
It's worth noticing that the Account entity is almost completely derived: it is there to tie the other three entities together, since groups are joined by an Account, storage orders are placed by an Account and work reports show files added and deleted by, you guessed it, an Account!
This all requires some implicit knowledge of the blockchain itself (here's a tip on how to obtain this information).
To finalize this step, it is necessary to run the codegen tool to generate TypeScript Entity classes for our schema definition:
npx sqd codegen
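The generated classes end up under src/model/generated and can be used like ordinary TypeScript classes. As a quick illustration (a hypothetical snippet with made-up values, not part of the project code):

import { JoinGroup } from './model/generated'

// Each generated class mirrors the fields declared in schema.graphql
const joinGroup = new JoinGroup()
joinGroup.id = 'some-event-id'          // placeholder value
joinGroup.owner = 'some-owner-address'  // placeholder value
joinGroup.blockNum = 583000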

Generate TypeScript interfaces

The process of generating TypeScript wrappers around Events and Extrinsics has a dedicated page explaining it and a quick Recipe to guide you through it, so it is advised to consult them for more information.

Chain exploration

What matters in the context of this tutorial is to pay attention to the chain, archive and out parameters. They refer to, respectively, the WebSocket address of the Crust blockchain, the Squid Archive synchronized with it (this is optional, but it helps speed up the process), and the output file, whose name simply contains the chain name as a good naming convention (useful when multiple chains are handled in the same project or folder).
npx squid-substrate-metadata-explorer \
  --chain wss://rpc-crust-mainnet.decoo.io \
  --archive https://crust.indexer.gc.subsquid.io/v4/graphql \
  --out crustVersions.json
The output is written to the crustVersions.json file. Although the metadata field is not very intelligible on its own, it's worth noting that there are 13 different versions, meaning the Runtime has changed 13 times.
It remains to be seen whether this had any impact on the definitions of the Events we are interested in.

Types bundle

One peculiar thing about the Crust chain, and this example, is that at the time of writing this guide, its types had not yet been integrated into Squid's library.
This gives us a good opportunity to follow this mini-guide and create an example, extracting a types bundle from Crust's own library and converting it to the format Subsquid requires.
Update: the "crust" types bundle has since been added to the list of built-ins, but for learning purposes it is still useful to see how to create and use a types bundle JSON file.
Here is the end result; copy it and paste it into a file named crustTypesBundle.json.

Events wrappers generation

Next, we need to make a few changes to the typegen.json configuration file to adapt it to our purposes. We want to point it at the same JSON file we used as output in the previous command (in this case, crustVersions.json), and then list the events we are interested in for this project.
Similar to what was said in the previous chapter, this requires knowledge of the blockchain itself, and some research might be required, but in the case of this example the events are:
  • WorksReportSuccess from the swork pallet
  • JoinGroupSuccess from the same pallet
  • FileSuccess from the market pallet
typegen.json
{
  "outDir": "src/types",
  "chainVersions": "crustVersions.json",
  "typesBundle": "crustTypesBundle.json",
  "events": [
    "swork.WorksReportSuccess",
    "swork.JoinGroupSuccess",
    "market.FileSuccess"
  ],
  "calls": []
}
And finally, run the command to generate type-safe TypeScript wrappers around the metadata:
npx squid-substrate-typegen typegen.json
The end result is in the src/types/events.ts file (because we only defined Events in our typegen.json) and should look something like this.

Define and bind Event Handlers

Having obtained wrappers for the Events and for the metadata changes across different Runtime versions, it's finally time to define Handlers for these Events and attach them to our Processor. This is done in the src/processor.ts file in the project folder.
First of all, we need to import the generated Entity model classes in order to use them in our code, as well as the type definitions of the Crust events, so that we can use them to wrap the raw events. Let's add these two lines at the top of the file:
import {Account, WorkReport, JoinGroup, StorageOrder} from './model/generated'
import { MarketFileSuccessEvent, SworkJoinGroupSuccessEvent, SworkWorksReportSuccessEvent } from './types/events'
Then, we need to customize the processor by giving it the right name, connecting it to the right Squid Archive and setting the correct types bundle. This is done by taking the top part of the file, which looks similar to the code below, and replacing it with the following:
const processor = new SubstrateProcessor('crust_example')
processor.setDataSource({
    archive: 'https://crust.indexer.gc.subsquid.io/v4/graphql',
    chain: 'wss://rpc-crust-mainnet.decoo.io'
});
processor.setBlockRange({from: 583000}); // this is the starting block for exploring the chain, please don't mind it.
processor.setTypesBundle(crustTypes);
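One thing the snippet above does not show is where crustTypes comes from: it has to be defined before being passed to setTypesBundle. A minimal sketch, assuming we simply load the crustTypesBundle.json file created earlier (the reference project may obtain the bundle differently, for example from the @crustio/type-definitions package):

import { readFileSync } from 'fs'

// Assumption: read the types bundle JSON created in the "Types bundle" step.
// Adjust the path if the file lives elsewhere relative to the working directory.
const crustTypes = JSON.parse(readFileSync('crustTypesBundle.json', 'utf8'))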
Next, because the added and deleted files are matrices, we are going to declare a function to handle them, for our own convenience. Simply add this code to the src/processor.ts file, anywhere.
// Converts a nested array (matrix) of values into a matrix of strings
function stringifyArray(list: any[]): any[] {
    let listStr: any[] = [];
    // the payload is wrapped in an outer array, unwrap it first
    list = list[0]
    for (let vec of list) {
        for (let i = 0; i < vec.length; i++) {
            vec[i] = String(vec[i]);
        }
        listStr.push(vec);
    }
    return listStr
}
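To make the expected shape concrete, here is a hypothetical call with made-up values (not part of the project code):

// A payload like [[['cid1', 100], ['cid2', 200]]] (note the extra outer array)
// becomes [['cid1', '100'], ['cid2', '200']]
console.log(stringifyArray([[['cid1', 100], ['cid2', 200]]]))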
Now, we are going to take a different approach from the template and define the Event handlers as named functions, which we then bind to the processor via the processor.addEventHandler() function call.
Here are the declarations of the Event handler functions. Same as above, add this code somewhere in the file.
async function joinGroupSuccess(ctx: EventHandlerContext): Promise<void> {
    // decode the event to get the group member (index 0) and the owner (index 1)
    let event = new SworkJoinGroupSuccessEvent(ctx);
    const memberId = String(event.asLatest[0]);
    const account = await getOrCreate(ctx.store, Account, memberId);
    const joinGroup = new JoinGroup();

    joinGroup.id = ctx.event.id;
    joinGroup.member = account;
    joinGroup.owner = String(event.asLatest[1]);
    joinGroup.blockHash = ctx.block.hash;
    joinGroup.blockNum = ctx.block.height;
    joinGroup.createdAt = new Date(ctx.block.timestamp);
    joinGroup.extrinsicId = ctx.extrinsic?.id;

    await ctx.store.save(account);
    await ctx.store.save(joinGroup);
}

async function fileSuccess(ctx: EventHandlerContext): Promise<void> {
    // decode the event to get the account (index 0) and the file CID (index 1)
    let event = new MarketFileSuccessEvent(ctx);
    const accountId = String(event.asLatest[0]);
    const account = await getOrCreate(ctx.store, Account, accountId);
    const storageOrder = new StorageOrder();

    storageOrder.id = ctx.event.id;
    storageOrder.account = account;
    storageOrder.fileCid = String(event.asLatest[1]);
    storageOrder.blockHash = ctx.block.hash;
    storageOrder.blockNum = ctx.block.height;
    storageOrder.createdAt = new Date(ctx.block.timestamp);
    storageOrder.extrinsicId = ctx.extrinsic?.id;

    await ctx.store.save(account);
    await ctx.store.save(storageOrder);
}

async function workReportSuccess(ctx: EventHandlerContext): Promise<void> {
    let event = new SworkWorksReportSuccessEvent(ctx);
    const accountId = String(event.asLatest[0]);
    const accountPr = getOrCreate(ctx.store, Account, accountId);
    // the added/deleted files are passed as extrinsic arguments, not event parameters
    const addedFilesObjPr = ctx.extrinsic?.args.find(arg => arg.name === "addedFiles");
    const deletedFilesObjPr = ctx.extrinsic?.args.find(arg => arg.name === "deletedFiles");
    const [account, addFObj, delFObj] = await Promise.all([accountPr, addedFilesObjPr, deletedFilesObjPr]);

    const workReport = new WorkReport();

    workReport.addedFiles = stringifyArray(Array(addFObj?.value))
    workReport.deletedFiles = stringifyArray(Array(delFObj?.value))

    // only persist the report if it actually added or deleted files
    if ((workReport.addedFiles.length > 0) || (workReport.deletedFiles.length > 0)) {
        workReport.account = account;

        workReport.id = ctx.event.id;
        workReport.blockHash = ctx.block.hash;
        workReport.blockNum = ctx.block.height;
        workReport.createdAt = new Date(ctx.block.timestamp);
        workReport.extrinsicId = ctx.extrinsic?.id;

        await ctx.store.save(account);
        await ctx.store.save(workReport);
    }
}
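The handlers above also use a getOrCreate helper to fetch an Account or instantiate it if it does not exist yet. If your copy of the template does not already define it, a minimal sketch of such a helper could look like this (assuming the Store type imported from @subsquid/substrate-processor, as in the template; the exact store.get signature may differ across processor versions):

type EntityConstructor<T> = {
    new (...args: any[]): T
}

// Look the entity up by id; if it does not exist yet, instantiate it with that id.
// The caller is responsible for persisting it via ctx.store.save().
async function getOrCreate<T extends {id: string}>(
    store: Store,
    entityConstructor: EntityConstructor<T>,
    id: string
): Promise<T> {
    let entity = await store.get<T>(entityConstructor, { where: { id } })

    if (entity == null) {
        entity = new entityConstructor()
        entity.id = id
    }

    return entity
}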
Lastly, as mentioned earlier, we are going to get rid of the template's event handler by replacing the code that ties an anonymous function to the processor with our own bindings:
processor.addEventHandler('market.FileSuccess', fileSuccess);
processor.addEventHandler('swork.JoinGroupSuccess', joinGroupSuccess);
processor.addEventHandler('swork.WorksReportSuccess', workReportSuccess);
Here is the end result, in case you missed something.
A repository with the entire project is also available on GitHub. If you like it, please leave a star!

Apply changes to the Database

A Squid project automatically manages the database connection and schema via an ORM abstraction. As such, we need to use the provided automated tools to manage the database schema and migrations.

Remove default migration

First, we need to get rid of the template's default migration:
rm -rf db/migrations/*.js
Then, make sure the Postgres docker container is running, in order to have a database to connect to, and run the following commands:
npx sqd db drop
npx sqd db create
npx sqd db create-migration Init
npx sqd db migrate
These will, in order:
  1. drop the current database
  2. create a new database
  3. create the initial migration, by looking up the schema we defined in the previous chapter
  4. apply the migration
Drop the database, re-create it, generate a migration and apply it

Launch the project

It's finally time to run the project. First of all, let's build the code:
npm run build
And then launch the processor (this will block the current terminal):
node -r dotenv/config lib/processor.js
Launch the GraphQL server (in a separate command line console window):
npx squid-graphql-server
Finally, see the results of our hard work for ourselves by visiting the localhost:4350/graphql URL in a browser and accessing the GraphiQL console.
From this window, we can perform queries such as this one, to find which files have been added or deleted by an account:
query AccountFiles {
  accountById(id: "<accountID>") {
    workReports {
      addedFiles
      deletedFiles
    }
  }
}
It is advisable to search for an Account first and grab its ID.
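For example, a query along these lines lists a handful of Accounts so you can pick an ID (the plural accounts query and its limit argument follow the usual conventions of the Squid GraphQL server; adjust if your version differs):

query {
  accounts(limit: 10) {
    id
  }
}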

Credits

This sample project is actually a real integration, developed by our very own Mikhail Shulgin. Credits for building it and helping with the guide go to him.