Tuning in graph node for speeding up subgraph's synchronization #3756

Open
Jacob273 opened this issue Jul 21, 2022 · 8 comments

Jacob273 (Contributor) commented Jul 21, 2022

What is the expected behavior?

I'd like to speed up the subgraph synchronization process.

What is the current behavior?

At the moment I am syncing two subgraphs, and both are syncing slowly:

  • uniswapV3
  • uniswapV2

Both are syncing at around 50 blocks per minute, which is about 3,000 blocks per hour, or roughly 72,000 blocks per day.
(I measured this by querying the transaction table: I fetched the latest transaction row and its block_number, waited 60 seconds, then fetched the latest row and its block_number again.)

I have approximately:

  • 4,703,996 blocks left to sync for uniswapV2 (ETA ~65 days at 72,000 blocks/day)
  • 2,671,345 blocks left to sync for uniswapV3 (ETA ~37 days)

I have a dedicated server running on an SSD with plenty of RAM available (~30 GB unused).
I'm using the Erigon client as a local Ethereum node via JSON-RPC on port 8545, and graph-node:v0.25.2.
Both Postgres and graph-node are hosted in the same docker-compose setup.

Are there any ways to speed up the synchronization process?

So far I've tried increasing ETHEREUM_BLOCK_BATCH_SIZE from its default of 10 to 500, but that did not speed anything up.
I've also seen that docker-compose sets GRAPH_LOG: info by default, which I assume could be switched off (at least until the subgraphs are fully synced)?
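
For reference, here is a minimal sketch of how those two settings look in the graph-node service of the example docker-compose file (the service name, image tag, and values below are illustrative, not a recommendation):

  services:
    graph-node:
      image: graphprotocol/graph-node:v0.25.2
      environment:
        GRAPH_LOG: error                # the example compose file defaults to "info"
        ETHEREUM_BLOCK_BATCH_SIZE: 500  # default is 10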

I've observed that graph-node consumes only a little RAM, around 300 MB.

Jacob273 (Contributor, Author) commented Jul 21, 2022

I've tried the following (I can't see any difference):

  GRAPH_LOG: error
  ETHEREUM_RPC_MAX_PARALLEL_REQUESTS: 256
  ETHEREUM_BLOCK_BATCH_SIZE: 500
  ETHEREUM_POLLING_INTERVAL: 100
  ETHEREUM_TRACE_STREAM_STEP_SIZE: 500
  GRAPH_ETHEREUM_TARGET_TRIGGERS_PER_BLOCK_RANGE: 1000
  GRAPH_ETHEREUM_MAX_BLOCK_RANGE_SIZE: 8000
  GRAPH_ETHEREUM_MAX_EVENT_ONLY_RANGE: 5000

Is there any article about speeding up the sync process so that the server is pushed to its limits?

@oliver-g-alexander
Also interested

@wangqiang-h
Hi, have you found any way to speed up graph-node? I have the same problem. Thanks.

rdonmez commented Nov 17, 2022

Also interested

Jacob273 (Contributor, Author) commented Nov 21, 2022

I haven't found any other solution to speed up the process, apart from running more Erigon/Geth nodes and pointing each subgraph at its own node (e.g. uniswapV2 pointing to erigon1, uniswapV3 pointing to erigon2).
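
Roughly, that layout looks like this in docker-compose (a sketch only: service names, ports, image tag and database names are illustrative, and the environment variable names follow the standard graph-node docker-compose example; Postgres credentials, the IPFS service and the Erigon nodes themselves are omitted for brevity):

  services:
    graph-node-univ2:
      image: graphprotocol/graph-node:v0.27.0
      environment:
        ethereum: 'mainnet:http://erigon1:8545'   # dedicated Erigon node for uniswapV2
        ipfs: 'ipfs:5001'
        postgres_host: postgres
        postgres_db: graph_node_univ2
      ports:
        - '8000:8000'   # GraphQL
        - '8020:8020'   # deployments

    graph-node-univ3:
      image: graphprotocol/graph-node:v0.27.0
      environment:
        ethereum: 'mainnet:http://erigon2:8545'   # dedicated Erigon node for uniswapV3
        ipfs: 'ipfs:5001'
        postgres_host: postgres
        postgres_db: graph_node_univ3
      ports:
        - '8100:8000'
        - '8120:8020'

Each instance then deploys and indexes only its own subgraph, so the two don't compete for the same Ethereum endpoint.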

azf20 (Contributor) commented Nov 21, 2022

Hey @Jacob273, are you still using 0.25.2, or the more recent 0.28.2? (There are some enhancements in more recent versions.)

Jacob273 (Contributor, Author) commented Nov 21, 2022

Yes, I've seen several enhancements in the release notes, e.g. in 0.27:

  • Store writes are now carried out in parallel to the rest of the subgraph process
  • GRAPH_STORE_WRITE_QUEUE env variable.

but I couldn't observe any blocks/second improvement after updating from 0.25.2 to 0.27 (though that may be due to improper configuration).
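
For reference, the new setting is just another entry in the graph-node environment (a sketch only; if I read the docs correctly, the queue size defaults to 5 and setting it to 0 disables the parallel writes):

  services:
    graph-node:
      image: graphprotocol/graph-node:v0.27.0
      environment:
        GRAPH_STORE_WRITE_QUEUE: 5   # reportedly the default; 0 should fall back to synchronous writes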

In our case some of the subgraphs are running on graph-node 0.25.2 and some on 0.27 (I don't really remember why some are still running on 0.25.2).

In our case, the satisfactory speeds we've reached were:

  • Sushiswap: [1.9-2.5] blocks per second.
  • UniswapV2: [1-1.5] blocks per second.
  • UniswapV3: [1.2-1.8] blocks per second.

@tw7613781
Also interested
