init
robert-zklink committed Apr 23, 2024
1 parent d7348fb commit 91f1219
Showing 270 changed files with 39,336 additions and 1 deletion.
3 changes: 3 additions & 0 deletions .gitignore
@@ -0,0 +1,3 @@
node_modules
.env
dist
43 changes: 43 additions & 0 deletions Dockerfile
@@ -0,0 +1,43 @@
FROM node:18.17.1-alpine AS base-stage
ENV NODE_ENV=production

WORKDIR /usr/src/app

RUN apk add --update python3 make g++ && rm -rf /var/cache/apk/*

COPY --chown=node:node .npmrc .npmrc
COPY --chown=node:node lerna.json ./
COPY --chown=node:node package*.json ./
COPY --chown=node:node ./packages/worker/package*.json ./packages/worker/
RUN npm ci --ignore-scripts --only=production && npm cache clean --force
COPY --chown=node:node ./packages/worker/. ./packages/worker
RUN rm -f .npmrc

FROM base-stage AS development-stage
ENV NODE_ENV=development
COPY --chown=node:node .npmrc .npmrc
RUN npm ci
RUN rm -f .npmrc

FROM development-stage AS build-stage
RUN npm run build

FROM base-stage AS production-stage

# HEALTHCHECK --interval=30s --timeout=3s --retries=5 \
# CMD curl -f http://localhost:${PORT}/health || exit 1

COPY --chown=node:node --from=build-stage /usr/src/app/packages/worker/dist ./packages/worker/dist

ARG NODE_ENV=production
ENV NODE_ENV=$NODE_ENV

ARG PORT=3001
ENV PORT=$PORT

EXPOSE $PORT 9229 9230

USER node
WORKDIR /usr/src/app/packages/worker

CMD [ "node", "dist/main.js" ]
93 changes: 92 additions & 1 deletion README.md
@@ -1 +1,92 @@
# nova-point-v2
# zkSync Era Block Explorer Worker
## Overview

`zkSync Era Block Explorer Worker` is an indexer service for zkSync Era blockchain data. It retrieves aggregated data from the [Data Fetcher](/packages/data-fetcher) via HTTP as well as directly from the blockchain using the [zkSync Era JSON-RPC API](https://era.zksync.io/docs/api/api.html), processes it, and saves it into the database in a way that is easy for the [Block Explorer API](/packages/api) to read.
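
For illustration only, here is a minimal sketch of reading a block and its zkSync-specific details over the JSON-RPC API with `zksync-web3` (the same client library the migration scripts in this commit use); the Worker's real ingestion logic lives in `packages/worker` and may differ:

```javascript
// Minimal sketch, not the Worker's actual code.
const zksync = require("zksync-web3");

const provider = new zksync.Provider(process.env.BLOCKCHAIN_RPC_URL);

const fetchBlock = async (blockNumber) => {
  // Standard block data (hash, timestamp, transaction hashes, ...).
  const block = await provider.getBlock(blockNumber);
  // zkSync-specific details (zks_getBlockDetails): L1 batch number, commit/prove/execute tx hashes, ...
  const details = await provider.getBlockDetails(blockNumber);
  return { block, details };
};

fetchBlock(1).then(({ block, details }) => console.log(block.hash, details.l1BatchNumber));
```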

## Installation

```bash
$ npm install
```

## Setting up env variables

- Create a `.env` file in the `worker` package folder and copy the `.env.example` content into it:
```
cp .env.example .env
```
- In order to tell the service where to get the blockchain data from, set the value of the `BLOCKCHAIN_RPC_URL` env var to your blockchain RPC API URL. For zkSync Era testnet it can be set to `https://zksync2-testnet.zksync.dev`; for zkSync Era mainnet, to `https://zksync2-mainnet.zksync.io`.
- To retrieve aggregated blockchain data for a certain block, the Worker service calls the [Data Fetcher](/packages/data-fetcher) service via HTTP. To specify the Data Fetcher URL, use the `DATA_FETCHER_URL` env variable. By default, it is set to `http://localhost:3040`, which is the default value for the local environment.
- Set up env variables for the Postgres database connection. By default the service points to `localhost:5432` and the database name is `block-explorer`.
  You need a running Postgres server; set the following env variables to point the service to your database (an example `.env` is sketched after this list):
- `DATABASE_HOST`
- `DATABASE_USER`
- `DATABASE_PASSWORD`
- `DATABASE_NAME`
- `DATABASE_CONNECTION_IDLE_TIMEOUT_MS`
- `DATABASE_CONNECTION_POOL_SIZE`
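
A hypothetical `.env` for local development might look like the following (the database user and password are placeholders, not defaults taken from the repo; the connection tuning variables are omitted here):
```
BLOCKCHAIN_RPC_URL=https://zksync2-testnet.zksync.dev
DATA_FETCHER_URL=http://localhost:3040
DATABASE_HOST=localhost
DATABASE_USER=postgres
DATABASE_PASSWORD=postgres
DATABASE_NAME=block-explorer
```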

The service doesn't create the database automatically; you can create it by running the following command:
```bash
$ npm run db:create
```

## Running the app

```bash
# development
$ npm run dev

# watch mode
$ npm run dev:watch

# debug mode
$ npm run dev:debug

# production mode
$ npm run start
```

## Test

```bash
# unit tests
$ npm run test

# unit tests debug mode
$ npm run test:debug

# e2e tests
$ npm run test:e2e

# test coverage
$ npm run test:cov
```

## Development

### Linter
Run `npm run lint` to make sure the code base follows the configured linter rules.

### DB changes
Changes to the DB are stored as migration scripts in the `src/migrations` folder and are automatically executed on application start.

We use a _code first_ approach for managing the DB schema, so desired schema changes should first be applied to the Entity classes, and then migration scripts can be generated by running the `migration:generate` command.

Example:

```
npm run migration:generate -name=AddStatusColumnToTxTable
```

A new migration with the specified name and all schema changes will be generated in the `src/migrations` folder. Always check generated migrations to confirm that they contain everything you intended.

Sometimes you need to write a manual migration script that is not generated from schema changes, for instance to update some existing records. In this case, use `migration:create` to create an empty migration.

Example:

```
npm run migration:create -name=UpdateTxsFee
```

This command simply creates an empty migration where the custom migration logic can be added.
68 changes: 68 additions & 0 deletions diagrams/add-block.sequence.md
@@ -0,0 +1,68 @@
# Add block sequence diagram

```mermaid
sequenceDiagram
participant Worker
participant zkSync RPC API
participant Database
Worker->>Database: Get last block
activate Database
Database-->>Worker: Last block number
deactivate Database
loop For each block in block range to add
Worker->>zkSync RPC API: Get block and block details for block
activate zkSync RPC API
zkSync RPC API-->>Worker: Block and block details
deactivate zkSync RPC API
Worker->>Database: Save block
activate Database
Database-->>Worker: Database response
deactivate Database
loop For each transaction in block
Worker->>zkSync RPC API: Get transaction, transaction details and receipt
activate zkSync RPC API
zkSync RPC API-->>Worker: Transaction, transaction details and receipt
deactivate zkSync RPC API
Worker->>Database: Save transaction, receipt, logs and transfers
activate Database
Database-->>Worker: Database response
deactivate Database
alt Transaction has a receipt
Worker->>Database: Save contract addresses
activate Database
Database-->>Worker: Database response
deactivate Database
loop For each contract address
Worker->>zkSync RPC API: Get ERC20 token data by contract address
activate zkSync RPC API
zkSync RPC API-->>Worker: ERC20 token data
deactivate zkSync RPC API
Worker->>Database: Save token
activate Database
Database-->>Worker: Database response
deactivate Database
end
end
end
alt Block has no transactions
Worker->>zkSync RPC API: Get logs for block
activate zkSync RPC API
zkSync RPC API-->>Worker: Logs
deactivate zkSync RPC API
Worker->>Database: Save block logs and transfers
activate Database
Database-->>Worker: Database response
deactivate Database
end
loop For each affected address - token pair
Worker->>zkSync RPC API: Get balance
activate zkSync RPC API
zkSync RPC API-->>Worker: Balance
deactivate zkSync RPC API
Worker->>Database: Add balance
activate Database
Database-->>Worker: Database response
deactivate Database
end
end
```
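
As a concrete illustration of the "Get transaction, transaction details and receipt" step above, here is a minimal hedged sketch using `zksync-web3`; the Worker's actual implementation in `packages/worker` may differ:

```javascript
// Minimal sketch, not the Worker's actual code.
const zksync = require("zksync-web3");

const provider = new zksync.Provider(process.env.BLOCKCHAIN_RPC_URL);

// Fetch everything the per-transaction step of the diagram needs for one tx hash.
const fetchTransactionData = async (txHash) => {
  const [transaction, details, receipt] = await Promise.all([
    provider.getTransaction(txHash), // eth_getTransactionByHash
    provider.getTransactionDetails(txHash), // zks_getTransactionDetails (status, fee, initiator, ...)
    provider.getTransactionReceipt(txHash), // eth_getTransactionReceipt (logs, contractAddress, ...)
  ]);
  return { transaction, details, receipt };
};
```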
33 changes: 33 additions & 0 deletions diagrams/batches-processing.flow.md
@@ -0,0 +1,33 @@
# Process batches flow

## The following process runs for each batch state (Executed, Proven, Committed, New):

```mermaid
flowchart
SetCurrentState("Define batches to process state (currentSate = one of executed/proven/committed/new)") --> DeclareLastProcessedBatchNumberVar(Declare lastProcessedBatchNumber variable = NULL)
DeclareLastProcessedBatchNumberVar --> CheckIfLastProcessedBatchNumberIsNull{lastProcessedBatchNumber == NULL ?}
CheckIfLastProcessedBatchNumberIsNull --> |Yes| GetLastBatchFromDB(Get last batch number with state == currentState from DB)
GetLastBatchFromDB --> SetLastDBBatch(Set lastProcessedBatchNumber variable with last batch number from DB)
CheckIfLastProcessedBatchNumberIsNull --> |No| GetNextBatchFromBlockchain
SetLastDBBatch --> GetNextBatchFromBlockchain("Get the next batch from blockchain (lastProcessedBatchNumber + 1)")
GetNextBatchFromBlockchain --> CheckIfRequestSuccessful{Is the request successful ?}
CheckIfRequestSuccessful --> |No| ResetLastDBBatch(Set lastProcessedBatchNumber = NULL)
CheckIfRequestSuccessful --> |Yes| CheckIfBatchExists{Does the next batch exist ?}
CheckIfBatchExists --> |No| ResetLastDBBatch(Set lastProcessedBatchNumber = NULL)
ResetLastDBBatch --> WaitFor1Minute(Wait for 1 minute)
WaitFor1Minute --> CheckIfLastProcessedBatchNumberIsNull
CheckIfBatchExists --> |Yes| CheckIfNextBatchHasTheSameState{Is the next batch state equal to currentState ?}
CheckIfNextBatchHasTheSameState --> |No| ResetLastDBBatch
CheckIfNextBatchHasTheSameState --> |Yes| UpsertBatchInDB(Insert or update the next batch in DB)
UpsertBatchInDB --> IncrementLastProcessedDBBatchNumber(Set lastProcessedBatchNumber = lastProcessedBatchNumber + 1)
IncrementLastProcessedDBBatchNumber --> CheckIfLastProcessedBatchNumberIsNull
```
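
A simplified sketch of the loop above, assuming hypothetical helpers `getLastBatchNumberFromDb`, `upsertBatch`, `hasState` and `sleep` (none of which are part of the repo) and using `zksync-web3` for the RPC call:

```javascript
// Simplified sketch of the flow above; not the Worker's actual code.
const zksync = require("zksync-web3");

const provider = new zksync.Provider(process.env.BLOCKCHAIN_RPC_URL);

const processBatchesForState = async (currentState, { getLastBatchNumberFromDb, upsertBatch, hasState, sleep }) => {
  let lastProcessedBatchNumber = null;
  while (true) {
    if (lastProcessedBatchNumber === null) {
      // Resume from the last batch with the current state stored in the DB.
      lastProcessedBatchNumber = await getLastBatchNumberFromDb(currentState);
    }
    let nextBatch = null;
    try {
      // zks_getL1BatchDetails for lastProcessedBatchNumber + 1.
      nextBatch = await provider.getL1BatchDetails(lastProcessedBatchNumber + 1);
    } catch (e) {
      nextBatch = null; // a failed request is treated the same as a missing batch
    }
    if (!nextBatch || !hasState(nextBatch, currentState)) {
      lastProcessedBatchNumber = null;
      await sleep(60000); // wait for 1 minute, then start over from the DB
      continue;
    }
    await upsertBatch(nextBatch); // insert or update the batch row
    lastProcessedBatchNumber += 1;
  }
};
```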

### Batch state definition
Batch state is defined and used only internally. There are 4 batch states: `Executed`, `Proven`, `Committed` and `New`.
- `Executed` - batch has `executeTxHash` and `executedAt`.
- `Proven` - batch has `proveTxHash` and `provenAt`.
- `Committed` - batch has `commitTxHash` and `committedAt`.
- `New` - batch doesn't have any of `executeTxHash`, `proveTxHash` or `commitTxHash`.

Note that each `Executed` batch is also `Proven` and `Committed`, and each `Proven` batch is also `Committed`.
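
As a minimal sketch, a batch record could be classified into these states from the fields listed above (field names come from the definitions; the Worker's internal representation may differ). Checking the most specific state first matters, since `Executed` batches also carry the prove and commit fields:

```javascript
// Minimal sketch; derives the internal batch state from the L1 tx hash / timestamp fields.
const getBatchState = (batch) => {
  if (batch.executeTxHash && batch.executedAt) return "Executed";
  if (batch.proveTxHash && batch.provenAt) return "Proven";
  if (batch.commitTxHash && batch.committedAt) return "Committed";
  return "New";
};
```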
30 changes: 30 additions & 0 deletions diagrams/worker.flow.md
@@ -0,0 +1,30 @@
# Worker flow

```mermaid
flowchart
GetLastBlock[Get last block from blockchain] --> GetNextBlockRangeToAdd(Get next block range to add)
GetNextBlockRangeToAdd --> CheckBlockRangeToAddIsNull{Block range to add == NULL}
CheckBlockRangeToAddIsNull --> |Yes| WaitForNewBlocks(Wait for new blocks)
WaitForNewBlocks --> GetNextBlockRangeToAdd
CheckBlockRangeToAddIsNull --> |No| ForEachBlock[For each block in block range]
ForEachBlock --> CheckIfNoBlocksLeft{No blocks left?}
CheckIfNoBlocksLeft --> |Yes| GetNextBlockRangeToAdd
CheckIfNoBlocksLeft --> |No| GetBlockToAddDetails(Fetch and add i-th block to DB)
GetBlockToAddDetails --> ForEachTransaction[For each transaction in block]
ForEachTransaction --> CheckIfNoTransactionsLeft{No transactions left?}
CheckIfNoTransactionsLeft --> |Yes| CheckIfBlockHasNoTransactions{Block has no transactions?}
CheckIfNoTransactionsLeft --> |No| FetchAndAddTransaction(Fetch and add i-th transaction to DB)
FetchAndAddTransaction --> FetchAndAddTransactionReceipt(Fetch and add i-th transaction receipt to DB)
FetchAndAddTransactionReceipt --> SaveTransactionLogs(Save i-th transaction logs to DB)
SaveTransactionLogs --> SaveTransactionTransfers(Save i-th transaction transfers to DB)
SaveTransactionTransfers --> CheckIfTransactionReceiptExists{i-th transaction receipt exists?}
CheckIfTransactionReceiptExists --> |No| CheckIfNoTransactionsLeft
CheckIfTransactionReceiptExists --> |Yes| SaveContractAddresses(Save contract addresses to DB)
SaveContractAddresses --> SaveERC20Tokens(Save ERC20 tokens)
SaveERC20Tokens --> CheckIfNoTransactionsLeft
CheckIfBlockHasNoTransactions --> |No| SaveBalances
CheckIfBlockHasNoTransactions --> |Yes| FetchBlockLogs(Fetch block logs)
FetchBlockLogs --> SaveTransfers(Save transfers to DB)
SaveTransfers --> SaveBalances(Save balances)
```
78 changes: 78 additions & 0 deletions migrationScripts/checkAddressBalances.js
@@ -0,0 +1,78 @@
const { Client } = require("pg");
const { Contract, BigNumber } = require("ethers");
const zksync = require("zksync-web3");

const connectionString = process.env.DATABASE_URL;
const provider = new zksync.Provider(process.env.BLOCKCHAIN_RPC_URL);
const batchSize = parseInt(process.env.BATCH_SIZE, 10) || 1000;

const fromHex = (buffer) => {
return `0x${buffer.toString("hex")}`;
}

const toDbHexStr = (value) => {
return value.startsWith("0x") ? value.substring(2) : value;
}

const getBalance = async (address, tokenAddress = zksync.utils.ETH_ADDRESS) => {
if (zksync.utils.isETH(tokenAddress)) {
return await provider.getBalance(address, "latest");
}

const erc20Contract = new Contract(tokenAddress, zksync.utils.IERC20, provider);
return await erc20Contract.balanceOf(address, { blockTag: "latest" });
};

const getUpdateBalanceScript = async (balanceRecord) => {
const address = fromHex(balanceRecord.address);
const tokenAddress = fromHex(balanceRecord.tokenAddress);
let balance = null;
try {
balance = await getBalance(address, tokenAddress);
} catch (e) {
if (!(e.code === 'CALL_EXCEPTION' && e.method === 'balanceOf(address)' && !!e.transaction &&
(e.message && e.message.startsWith("call revert exception")))) {
return "";
}
}
if (balance && balance.eq(BigNumber.from(0))) {
return `UPDATE "addressBalances" SET "toDelete" = TRUE, checked = TRUE, "latestBalance" = '${balance.toString()}' WHERE address = decode('${toDbHexStr(address)}', 'hex') AND "tokenAddress" = decode('${toDbHexStr(tokenAddress)}', 'hex');`;
} else {
return `UPDATE "addressBalances" SET checked = TRUE, "latestBalance" = '${balance ? balance.toString() : null}' WHERE address = decode('${toDbHexStr(address)}', 'hex') AND "tokenAddress" = decode('${toDbHexStr(tokenAddress)}', 'hex');`;
}
};

const getNextRecordsBatch = async (pgClient) => {
let balances = await pgClient.query(`SELECT * FROM "addressBalances" WHERE checked = FALSE LIMIT ${batchSize};`);
return balances;
};

const main = async () => {
const client = new Client(connectionString);
await client.connect()
let batchNum = 0;
let balances = await getNextRecordsBatch(client);
while (balances && balances.rows.length) {
console.log(`Processing items ${batchNum * batchSize} - ${(batchNum + 1) * batchSize - 1}`);
batchNum += 1;
console.log('Getting balances:')
console.log(new Date());
const updateScripts = await Promise.all(balances.rows.map(balanceRecord => getUpdateBalanceScript(balanceRecord)));
console.log(new Date());
console.log('Updating DB:')
console.log(new Date());
await client.query(updateScripts.join(""));
console.log(new Date());
balances = await getNextRecordsBatch(client);
}
};

main()
.then(() => {
console.log("Done");
process.exit(0);
})
.catch((e) => {
console.error(e);
process.exit(0);
});
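
Presumably this script is run ad hoc against an existing database, with `DATABASE_URL`, `BLOCKCHAIN_RPC_URL` and optionally `BATCH_SIZE` exported, e.g. `node migrationScripts/checkAddressBalances.js`; the exact invocation is not documented in this commit.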