[Bug]: [benchmark][cluster] Milvus sets up two replicas, and the concurrent test raises an error: StatusCode.DEADLINE_EXCEEDED, Deadline Exceeded #17289
Comments
This seems to be an old issue of pulsar: apache/pulsar#5284
/unassign
If it is not 100% reproducible and is recoverable, degrade the priority.
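Since the error is considered recoverable, a benchmark client could treat a transient DEADLINE_EXCEEDED as retryable rather than failing the run outright. Below is a hypothetical sketch (not from the actual test code); `TimeoutError` stands in for gRPC's DEADLINE_EXCEEDED status, and `retry_on_deadline` is an illustrative helper name.

```python
import time


def retry_on_deadline(call, attempts=3, base_delay=0.1):
    """Run `call`; on timeout, back off exponentially and retry.

    Re-raises the timeout if all attempts are exhausted.
    `TimeoutError` stands in for grpc.StatusCode.DEADLINE_EXCEEDED.
    """
    for attempt in range(attempts):
        try:
            return call()
        except TimeoutError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))


if __name__ == "__main__":
    # Simulate a search that times out twice, then succeeds.
    state = {"calls": 0}

    def flaky_search():
        state["calls"] += 1
        if state["calls"] < 3:
            raise TimeoutError("Deadline Exceeded")
        return "ok"

    print(retry_on_deadline(flaky_search, base_delay=0.0))
```

In a real Locust-based client the same pattern would wrap the Milvus SDK call and catch `grpc.RpcError` with code `DEADLINE_EXCEEDED` instead of `TimeoutError`.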
An FAQ entry for this rare case will be added to the documentation. Users are advised to restart the Pulsar cluster when encountering this fence issue. One more thing worth mentioning: we could increase the number of topics in the configuration; a larger pool of physical channels is expected to lower the probability of this issue.
We shouldn't increase the dmlChannel number to a very large value, because topics/partitions are a limited resource in Kafka/Pulsar. Currently, we don't create any physical channel once the cluster is up. If that turns out not to be the case, changing the consumer subscription to shared might be a solution, but it's a little tricky and dangerous.
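For reference, the channel-count knob discussed above lives in the server configuration. This is a hypothetical sketch assuming the `rootCoord.dmlChannelNum` key in `milvus.yaml`; the exact key path and a sensible value may differ across Milvus versions, so check the shipped config for your release.

```yaml
rootCoord:
  # Number of physical DML channels (backing Pulsar/Kafka topics).
  # Raising this enlarges the channel pool, lowering the chance that
  # two collections contend on the same fenced topic - but topics are
  # a limited resource on the message-queue side, so keep it modest.
  dmlChannelNum: 64
```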
@wangting0128 what Pulsar version are we using? 2.8.2 should fix exactly this issue.
Not reproduced recently; degrading the priority.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Is there an existing issue for this?
Environment
Current Behavior
argo task: benchmark-backup-ds5f4
test yaml:
client-configmap:client-random-locust-search-filter-100m-ddl-r8-w2-replica2
server-configmap:server-cluster-8c64m-querynode2
server:
client pod: benchmark-backup-ds5f4-3449886322
client log:
Expected Behavior
No response
Steps To Reproduce
Milvus Log
No response
Anything else?
client-random-locust-search-filter-100m-ddl-r8-w2-replica2: