How I Implemented a Debank-Style Feed API Without Extra Event Log Storage
Hey guys. I am a blockchain backend developer who worked for a Web3 API SaaS company. Today I am going to illustrate how I implemented a Debank-style feed API without using a database to store extra event log data, relying on nothing but an Erigon node.
Preparations
- An Erigon node with complete logs (it does not have to be an archive node) and no limit on the block range in request params.
- A database that stores each transaction's from address and to address.
Difficulties
Let’s start by analyzing the difficulties and the problems that need to be solved before we implement this API.
- API data needs to be presented in reverse order. Debank, Zerion, and Zapper always show the latest logs or transactions first, but a common Ethereum node implementation returns logs in increasing order, matching the order you requested.
- It’s difficult to union the from topic and the to topic in a single request. To implement the API, you need to find not only the tokens the address sent but also the tokens the address received. You can easily request the from topic and the to topic separately using the following request:
curl --request POST \
--url 'https://rpc.ankr.com/eth' \
--header 'accept: application/json' \
--header 'content-type: application/json' \
--data '{
"id":1,
"jsonrpc":"2.0",
"method":"eth_getLogs",
"params":[
{
"fromBlock":"0x10a4b01",
"toBlock":"0x10a4b01",
"topics":[
[],
[
"0x000000000000000000000000d8da6bf26964af9d7eed9e03e53415d37aa96045"
]
]
}
]
}'
But sometimes you do not care about the position of a specific topic; you only care whether the topic exists anywhere in the log's topics.
- It's hard to find the block number when you just want to fetch the latest activity. In most cases, when you look into an account address you do not know the block number at which a transaction happened, so you would search for it on Etherscan or another tool, which makes the process less efficient.
- Pagination. It's hard to fetch pages in a cursor-based way using the current API.
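Before introducing the new API, it helps to see why the second difficulty is awkward with plain eth_getLogs: you must issue two separate queries (one matching the from topic, one matching the to topic), then union, de-duplicate, and re-sort the results newest-first yourself. Here is a standalone sketch of that merge step; the Log struct is a deliberately minimal stand-in, not go-ethereum's types.Log.

```go
package main

import (
	"fmt"
	"sort"
)

// Log is a minimal stand-in for an event log; only the fields needed
// for ordering and de-duplication are included.
type Log struct {
	BlockNumber uint64
	TxHash      string
	Index       uint // position of the log within the block
}

// mergeDescending unions the results of the two separate eth_getLogs
// queries (one on the `from` topic, one on the `to` topic), drops
// duplicates (a self-transfer matches both queries), and sorts the
// result newest-first, which is the order Debank-style feeds display.
func mergeDescending(fromLogs, toLogs []Log) []Log {
	seen := make(map[string]bool)
	var merged []Log
	for _, l := range append(fromLogs, toLogs...) {
		key := fmt.Sprintf("%s-%d", l.TxHash, l.Index)
		if seen[key] {
			continue
		}
		seen[key] = true
		merged = append(merged, l)
	}
	sort.Slice(merged, func(i, j int) bool {
		if merged[i].BlockNumber != merged[j].BlockNumber {
			return merged[i].BlockNumber > merged[j].BlockNumber
		}
		return merged[i].Index > merged[j].Index
	})
	return merged
}

func main() {
	fromLogs := []Log{{BlockNumber: 100, TxHash: "0xaa", Index: 0}}
	toLogs := []Log{
		{BlockNumber: 120, TxHash: "0xbb", Index: 3},
		{BlockNumber: 100, TxHash: "0xaa", Index: 0}, // duplicate (self-transfer)
	}
	for _, l := range mergeDescending(fromLogs, toLogs) {
		fmt.Println(l.BlockNumber, l.TxHash)
	}
}
```

Doing this client-side works, but it doubles the request count and pushes the sorting and de-duplication burden onto every consumer, which is exactly what the new API avoids.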
To solve the problems above, I implemented a new API, erigon_getLatestLogs, in the Erigon node. I will elaborate on how this API is designed in another article; here I mainly describe how to use it. Simply put, the API does the following.
It returns a specific number of logs or blocks matching a given filter object in descending order, and it provides an option called IgnoreTopicsOrder. When the option is set to true, logs whose topics match are returned regardless of topic order. In a bloom filter, computing the OR of bits is more efficient than doing it with SQL OR, so I put this operation inside the API.
The API has the following parameters:
ethereum.FilterQuery : the same filter query as eth_getLogs .
IgnoreTopicsOrder : when this option is true, a log is returned once any of its topics matches, no matter which position the topic is in.
blockCount : the number of blocks that match the filter query.
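To make the IgnoreTopicsOrder semantics concrete, here is my reading of the matching rule as a hypothetical, self-contained matcher (the real check inside Erigon works against the bloom filter, not string slices):

```go
package main

import "fmt"

// matches reports whether a log's topics satisfy a single-topic filter.
// With ignoreOrder=false the wanted topic must sit at the given position,
// which is the standard eth_getLogs behaviour; with ignoreOrder=true the
// position is irrelevant and any matching topic is enough.
func matches(logTopics []string, wanted string, position int, ignoreOrder bool) bool {
	if ignoreOrder {
		for _, t := range logTopics {
			if t == wanted {
				return true
			}
		}
		return false
	}
	return position < len(logTopics) && logTopics[position] == wanted
}

func main() {
	// An ERC-20 Transfer log: [event signature, from, to]
	topics := []string{"0xddf2...", "0xsender...", "0xvitalik..."}
	fmt.Println(matches(topics, "0xvitalik...", 1, false)) // the address is at position 2, not 1
	fmt.Println(matches(topics, "0xvitalik...", 1, true))  // position ignored, so it matches
}
```

This is why a single filter containing just the account hash is enough in the client code below: with IgnoreTopicsOrder set, it matches the address whether it appears as the sender or the receiver topic.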
With this API, we can build the Debank feed easily.
Here's the core part of the implementation.
// DebankFeed calls the custom `erigon_getLatestLogs` method
func (c *Client) DebankFeed(ctx context.Context, endBlockCursor *big.Int, accountAddress common.Address) ([]types.Log, error) {
var logs []types.Log
query := ethereum.FilterQuery{
FromBlock: new(big.Int).SetUint64(0),
Topics: [][]common.Hash{
{
accountAddress.Hash(),
},
},
}
// fetch in a cursor-based way
if endBlockCursor != nil {
query.ToBlock = endBlockCursor
}
err := c.client.CallContext(ctx, &logs, "erigon_getLatestLogs", query, Query{
// the number of logs at which iteration stops
LogCount: 20,
// ignore the position of the topic
IgnoreTopicsOrder: true,
})
return logs, err
}
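The endBlockCursor parameter is also what makes cursor-style pagination work: after a page is returned, the next request simply sets ToBlock to one block below the oldest block seen. Here is a hypothetical helper sketching that cursor arithmetic; it uses a minimal local Log struct so the snippet stands alone rather than go-ethereum's types.Log.

```go
package main

import (
	"fmt"
	"math/big"
)

// Log is a minimal stand-in carrying only the block number.
type Log struct{ BlockNumber uint64 }

// nextCursor computes the ToBlock for the next page: one block below the
// oldest log in the current page. It returns nil when the page is empty
// (or the chain start is reached), meaning there is nothing left to fetch.
func nextCursor(page []Log) *big.Int {
	if len(page) == 0 {
		return nil
	}
	oldest := page[0].BlockNumber
	for _, l := range page {
		if l.BlockNumber < oldest {
			oldest = l.BlockNumber
		}
	}
	if oldest == 0 {
		return nil
	}
	return new(big.Int).SetUint64(oldest - 1)
}

func main() {
	page := []Log{{BlockNumber: 17_450_000}, {BlockNumber: 17_449_980}}
	fmt.Println(nextCursor(page)) // the next request would use ToBlock = 17449979
}
```

In a fetch loop, you would pass the result of nextCursor as the endBlockCursor argument of DebankFeed and stop when it returns nil.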
Once we have the raw log feed, we can start to resolve the log data.
// Resolve the log data and do some aggregation jobs
func main() {
cli, err := client.NewClient(context.Background(), "http://localhost:8545")
if err != nil {
fmt.Println(err.Error())
return
}
account := common.HexToAddress("0xd8da6bf26964af9d7eed9e03e53415d37aa96045")
vitalikLogs, err := cli.DebankFeed(context.Background(), nil, account)
if err != nil {
fmt.Println(err.Error())
return
}
fmt.Printf("length of vitalik logs: %d\n", len(vitalikLogs))
// token balance changes in the same transaction
tokenChanges := make(map[common.Hash]map[common.Address]*big.Int)
for _, l := range vitalikLogs {
switch {
case erc20.FilterERC20Transfer(l):
if tokenChanges[l.TxHash] == nil {
tokenChanges[l.TxHash] = make(map[common.Address]*big.Int)
}
event, err := cli.Erc20Filter.ParseTransfer(l)
if err != nil {
fmt.Println(err.Error())
return
}
if tokenChanges[l.TxHash][l.Address] == nil {
tokenChanges[l.TxHash][l.Address] = new(big.Int).SetUint64(0)
}
// calculate the aggregated result
if event.From == account {
tokenChanges[l.TxHash][l.Address].Sub(tokenChanges[l.TxHash][l.Address], event.Value)
}
if event.To == account {
tokenChanges[l.TxHash][l.Address].Add(tokenChanges[l.TxHash][l.Address], event.Value)
}
case erc721.FilterERC721Transfer(l):
if tokenChanges[l.TxHash] == nil {
tokenChanges[l.TxHash] = make(map[common.Address]*big.Int)
}
event, err := cli.Erc721Filter.ParseTransfer(l)
if err != nil {
fmt.Println(err.Error())
return
}
if tokenChanges[l.TxHash][l.Address] == nil {
tokenChanges[l.TxHash][l.Address] = new(big.Int).SetUint64(0)
}
// calculate the aggregated result
if event.From == account {
tokenChanges[l.TxHash][l.Address].Sub(tokenChanges[l.TxHash][l.Address], new(big.Int).SetUint64(1))
}
if event.To == account {
tokenChanges[l.TxHash][l.Address].Add(tokenChanges[l.TxHash][l.Address], new(big.Int).SetUint64(1))
}
// TODO: WETH aggregation logic
}
}
}
I hope you like this feature in Erigon. More features will be added in Erigon v3 to make working with the blockchain better.
All the code above is open source and can be found at https://github.com/fenghaojiang/debank-feed.
Any suggestions, comments (including criticism), and contributions are welcome.
Please contact me via GitHub or email: fenghaojiang97@gmail.com.