Maximum concurrent instance limit of 32 reached #2509

Open
blakelukas opened this issue Oct 4, 2023 · 5 comments

@blakelukas

Hello! We are seeing the following error on our Moonriver node. Could you please help us resolve it?
When we send this request:

curl --data '{"method":"trace_filter","params":[{"fromBlock":"0x4f6a8d","toBlock":"0x4f867f"}],"id":1,"jsonrpc":"2.0"}' -H "Content-Type: application/json" -X POST http://localhost:8545

We receive the following error:

{"jsonrpc":"2.0","error":{"code":-32603,"message":"Failed to replay block. Error : \"Runtime api access error: Application(Execution(RuntimeConstruction(Other(\\\"failed to instantiate a new WASM module instance: maximum concurrent instance limit of 32 reached\\\"))))\""},"id":1}

Logs at the moment the request was sent:

2023-10-04 06:40:19 Ran out of free WASM instances
2023-10-04 06:40:19 [🌗] Ran out of free WASM instances
(the "Ran out of free WASM instances" line repeats roughly 70 times at this timestamp)
2023-10-04 06:40:19 [🌗] ✨ Imported #5248594 (0xd72f…3a8e)

Node image: purestake/moonbeam-tracing:v0.33.0-2501-1998

Node configuration:

"--chain=moonriver",
              "--base-path=/data",
              "--name=test",
              "--unsafe-rpc-external",
              "--rpc-cors=*",
              "--rpc-port=8545",
              "--rpc-max-response-size=100",
              "--rpc-max-connections=100000",
              "--ethapi-max-permits=1000",
              "--execution=wasm",
              "--wasm-execution=compiled",
              "--state-pruning=archive",
              "--wasm-runtime-overrides=/moonbeam/moonriver-substitutes-tracing",
              "--trie-cache-size=0",
              "--ethapi=debug,trace,txpool",
              "--ethapi-trace-max-count=10000",
              "--runtime-cache-size=32",
              "--db-cache=66000"

We have already tried resyncing the node and looking for a parameter that adjusts the concurrent instance limit, but nothing has worked.
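One possible client-side stopgap (not a fix for the node itself) is to split the requested range into smaller windows so that each trace_filter call replays fewer blocks at a time. A minimal bash sketch, assuming an arbitrary 250-block chunk size:

# Split the trace_filter range into smaller windows (the chunk size is an arbitrary assumption)
FROM=$((0x4f6a8d))
TO=$((0x4f867f))
STEP=250
start=$FROM
while [ "$start" -le "$TO" ]; do
  end=$((start + STEP - 1))
  [ "$end" -gt "$TO" ] && end=$TO
  # Each call now replays at most STEP blocks instead of the whole ~7,000-block range
  curl -s -H "Content-Type: application/json" -X POST http://localhost:8545 \
    --data "{\"method\":\"trace_filter\",\"params\":[{\"fromBlock\":\"$(printf '0x%x' "$start")\",\"toBlock\":\"$(printf '0x%x' "$end")\"}],\"id\":1,\"jsonrpc\":\"2.0\"}"
  echo
  start=$((end + 1))
done

Whether this avoids exhausting the instance pool depends on how many such requests run concurrently, so it is only a workaround while the underlying limit issue is investigated.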

@blakelukas (Author) commented Oct 11, 2023

We have also attempted to adjust the max-runtime-instances parameter, but this did not help.

Current configuration:

              "--chain=moonriver",
              "--base-path=/data",
              "--name=test",
              "--unsafe-rpc-external",
              "--rpc-cors=*",
              "--rpc-port=8545",
              "--rpc-max-response-size=100",
              "--rpc-max-connections=100000",
              "--ethapi-max-permits=1000",
              "--execution=wasm",
              "--wasm-execution=compiled",
              "--state-pruning=archive",
              "--max-runtime-instances=64",
              "--wasm-runtime-overrides=/moonbeam/moonriver-substitutes-tracing",
              "--trie-cache-size=0",
              "--ethapi=debug,trace,txpool",
              "--ethapi-trace-max-count=10000",
              "--runtime-cache-size=64",
              "--db-cache=30000"

Logs:

2023-10-11 07:38:05 [Relaychain] 💤 Idle (13 peers), best: #20061188 (0x412f…fe15), finalized #20061186 (0x2f37…3dc2), ⬇ 288.4kiB/s ⬆ 698.0kiB/s    
2023-10-11 07:38:05 [🌗] 💤 Idle (21 peers), best: #5297435 (0x53a0…a51a), finalized #5297434 (0xb4a6…346f), ⬇ 0.1kiB/s ⬆ 0.1kiB/s    
2023-10-11 07:38:06 [Relaychain] ✨ Imported #20061189 (0xc070…84fa)    
2023-10-11 07:38:06 [🌗] ✨ Imported #5297437 (0x0c18…82c7)    
2023-10-11 07:38:07 Accepting new connection 48/100000
2023-10-11 07:38:08 Accepting new connection 48/100000
2023-10-11 07:38:10 [Relaychain] 💤 Idle (14 peers), best: #20061189 (0xc070…84fa), finalized #20061186 (0x2f37…3dc2), ⬇ 157.7kiB/s ⬆ 229.9kiB/s    
2023-10-11 07:38:10 [🌗] 💤 Idle (21 peers), best: #5297436 (0xdbe5…c093), finalized #5297434 (0xb4a6…346f), ⬇ 10.0kiB/s ⬆ 33.9kiB/s    
2023-10-11 07:38:12 [Relaychain] ✨ Imported #20061190 (0x0a9f…2e03)    
2023-10-11 07:38:13 Accepting new connection 48/100000
2023-10-11 07:38:15 [Relaychain] 💤 Idle (14 peers), best: #20061190 (0x0a9f…2e03), finalized #20061187 (0xdf92…f79f), ⬇ 375.5kiB/s ⬆ 967.7kiB/s    
2023-10-11 07:38:15 [🌗] 💤 Idle (21 peers), best: #5297436 (0xdbe5…c093), finalized #5297435 (0x53a0…a51a), ⬇ 7.0kiB/s ⬆ 3.3kiB/s    
2023-10-11 07:38:16 [🌗] A request asked a pooled block (0x4e9a…6a8e), adding it to the list of waiting requests.
2023-10-11 07:38:17 Accepting new connection 48/100000
2023-10-11 07:38:17 Accepting new connection 39/100000
2023-10-11 07:38:17 Accepting new connection 24/100000
2023-10-11 07:38:17 Accepting new connection 17/100000
2023-10-11 07:38:17 Accepting new connection 18/100000

As you can see, there is a new log line ("A request asked a pooled block (0x4e9a…6a8e), adding it to the list of waiting requests."), but the error remains the same.

@crystalin (Collaborator)

It seems the runtime execution is stuck or the instances are never freed.
Do you have an idea what the reason could be, @bkchr? These nodes are using WASM substitutes (to enable tracing), but this was never a problem before.

@bkchr commented Oct 11, 2023

Should probably be fixed by: paritytech/polkadot-sdk#1856

@HarukiMcCree

Hey!

We've upgraded to v0.34.0. We send the same request and still get the same error mentioned in the first comment by @blakelukas.
The only thing that changed after the upgrade is the number in the message (32 -> 64):

{
  "jsonrpc": "2.0",
  "error": {
    "code": -32603,
    "message": "Failed to replay block. Error : \"Blockchain error when replaying block 5204621 : Application(Execution(RuntimeConstruction(Other(\\\"failed to instantiate a new WASM module instance: maximum concurrent instance limit of 64 reached\\\"))))\""
  },
  "id": 1
}

The logs are much the same; no new messages noticed:

2023-11-25 22:55:25 [Relaychain] 💤 Idle (9 peers), best: #20717100 (0x3140…1154), finalized #20717097 (0x6aad…2647), ⬇ 353.0kiB/s ⬆ 438.7kiB/s    
2023-11-25 22:55:25 [Relaychain] ♻️  Reorg on #20717100,0x3140…1154 to #20717100,0x7ae1…1f8f, common ancestor #20717099,0x408f…569b    
2023-11-25 22:55:25 [Relaychain] ✨ Imported #20717100 (0x7ae1…1f8f)    
2023-11-25 22:55:25 Accepting new connection 39/100000
2023-11-25 22:55:25 Accepting new connection 40/100000
2023-11-25 22:55:26 [🌗] A request asked a cached block (0x4e9a…6a8e), sending the traces directly.
2023-11-25 22:55:26 Ran out of free WASM instances
2023-11-25 22:55:26 Ran out of free WASM instances
2023-11-25 22:55:26 Ran out of free WASM instances
2023-11-25 22:55:26 Ran out of free WASM instances
2023-11-25 22:55:26 Ran out of free WASM instances
2023-11-25 22:55:26 Ran out of free WASM instances

Do you have any ideas on this?

@albertov19 (Contributor)

Hey @HarukiMcCree - the upstream fix was not included in Moonbeam client v0.34.0; it should be included in the next client release. @crystalin, do you believe the next client will fix the issue @HarukiMcCree reported?
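For anyone following along: a quick way to confirm which client build a node is actually running is the standard Substrate system_version RPC. A minimal sketch, assuming the RPC port from the configuration above:

# Query the node's reported client version over the same RPC endpoint
curl -s -H "Content-Type: application/json" -X POST http://localhost:8545 \
  --data '{"jsonrpc":"2.0","id":1,"method":"system_version","params":[]}'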
