Quickstart: generate from ABI

The abi template generates a ready-to-use squid from an EVM contract ABI. The squid decodes and indexes the EVM logs and transactions of the contract into a local Postgres database. Additionally, it serves the indexed data with a rich GraphQL API supporting pagination and filtering.

Pre-requisites

Before getting to work on your very first squid, verify that you have installed the following software:

  • Node v16.x or newer
  • Squid CLI v2.1.0 or newer
  • Docker
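
You can verify the installation from a terminal; each of the standard version checks below should print a version number:

node --version
sqd --version
docker --version
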
info

Earlier versions of the template were based on a Makefile. The new version uses @subsquid/commands scripts defined in commands.json, which are automatically recognized as sqd sub-commands.
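
For reference, an entry in commands.json roughly follows the shape sketched below. This is an illustrative fragment, not the full file shipped with the template, and the exact fields may differ:

{
  "commands": {
    "build": {
      "description": "Build the squid project",
      "deps": ["clean"],
      "cmd": ["tsc"]
    }
  }
}

Each command defined this way becomes available as sqd build, sqd up and so on.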

Please note:

  • The squid template is not compatible with yarn. Use npm instead.

Step 1: Scaffold from a template

Come up with a new memorable name for your squid and scaffold from squid-abi-template using sqd init:

sqd init my-awesome-squid --template abi
cd my-awesome-squid
# install the dependencies
npm ci

Step 2: Generate and build the squid

  • Consult the EVM configuration page and choose an archive endpoint from the list of supported EVM networks.
  • Prepare the contract ABI and save it into the assets folder, e.g. as assets/abi.json.
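
An ABI file is a JSON array of event and function descriptions. For illustration, a fragment describing the standard ERC20 Transfer event looks like this (your contract's ABI will contain its own entries):

[
  {
    "anonymous": false,
    "inputs": [
      {"indexed": true, "internalType": "address", "name": "from", "type": "address"},
      {"indexed": true, "internalType": "address", "name": "to", "type": "address"},
      {"indexed": false, "internalType": "uint256", "name": "value", "type": "uint256"}
    ],
    "name": "Transfer",
    "type": "event"
  }
]
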
info

For public contracts, the ABI can be fetched automatically using an Etherscan-like API. To do so, simply omit the --abi flag. Check out

sqd generate --help

for a full list of supported options.

Generate and build the squid with

sqd generate \
--address <address> \
--abi assets/abi.json \
--archive <network archive alias or endpoint URL> \
--event '*' \
--function '*' \
--from <starting block>

sqd build

Example

sqd generate \
--address 0x6B175474E89094C44Da98b954EedeAC495271d0F \
--abi assets/abi.json \
--archive eth-mainnet \
--event '*' \
--function '*' \
--from 1000000

sqd build

Step 3: Launch Postgres in a detached Docker container

sqd up
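
The up command is defined in commands.json and is roughly equivalent to running

docker compose up -d

in the project root. You can confirm that the Postgres container is running with docker ps.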

Step 4: Generate the schema migrations

sqd migration:generate

Step 5: Run the squid processor

Run the processor with

sqd process

The squid now ingests the contract's transaction and event log data, decodes it, and stores it in the database.
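
For orientation, the generated src/processor.ts is typically built around the Squid SDK's EvmBatchProcessor. The sketch below is a simplified illustration only: the real generated file is more elaborate, its exact shape depends on the SDK version, and the archive URL and address are placeholders taken from the example above.

import {EvmBatchProcessor} from '@subsquid/evm-processor'
import {TypeormDatabase} from '@subsquid/typeorm-store'

const processor = new EvmBatchProcessor()
  // placeholder archive URL; the generated code uses the value passed via --archive
  .setDataSource({archive: 'https://eth.archive.subsquid.io'})
  .setBlockRange({from: 1_000_000})
  // lowercased contract address from the example above
  .addLog('0x6b175474e89094c44da98b954eedeac495271d0f', {
    data: {evmLog: {topics: true, data: true}}
  })

processor.run(new TypeormDatabase(), async (ctx) => {
  for (const block of ctx.blocks) {
    for (const item of block.items) {
      if (item.kind === 'evmLog') {
        // decode the log with the generated ABI bindings and save entities via ctx.store
      }
    }
  }
})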

Step 6: Start the GraphQL server

In a separate terminal window, run

sqd serve
# in yet another window
sqd open http://localhost:4350/graphql

This starts a GraphQL server serving the indexed events and transactions from the local database. The GraphQL playground is available at http://localhost:4350/graphql. Open it in a browser and run sample queries by applying filters and data selections in the panel to the left.

query MyQuery {
  events(limit: 10) {
    id
    name
  }
}
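
Filters and ordering follow the standard squid GraphQL conventions: where arguments with suffixes such as _eq, and orderBy enums. For example, assuming the contract emits a Transfer event, the following query returns matching records in descending id order (adjust the event name to your contract):

query FilteredEvents {
  events(limit: 10, orderBy: id_DESC, where: {name_eq: "Transfer"}) {
    id
    name
  }
}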

Step 7: Customize

Hack the schema file schema.graphql and the processor src/processor.ts to index the data your way!
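
For example, to track per-account aggregates you could add a hypothetical entity to schema.graphql along these lines (the entity and its fields are illustrative; pick ones that fit your contract):

type Account @entity {
  id: ID!
  transfersCount: Int!
  lastSeenBlock: Int!
}

After changing the schema, regenerate the TypeORM model classes with sqd codegen, create a new migration with sqd migration:generate, and populate the new entity from the batch handler in src/processor.ts via ctx.store.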

What's next?