
Export :tensorflow:serving:... metrics by signature names #1959

Open
jeongukjae opened this issue Jan 3, 2022 · 4 comments · May be fixed by #2152

Comments

jeongukjae commented Jan 3, 2022

Feature Request


Describe the problem the feature is intended to solve

Currently, TensorFlow Serving exports metrics per model, as shown below:

...
:tensorflow:serving:request_count{model_name="test_model",status="OK"} 6
...
:tensorflow:serving:request_latency_bucket{model_name="test_model",API="predict",entrypoint="REST",le="10"} 0
:tensorflow:serving:request_latency_bucket{model_name="test_model",API="predict",entrypoint="REST",le="18"} 0
...
:tensorflow:serving:runtime_latency_bucket{model_name="test_model",API="Predict",runtime="TF1",le="10"} 0
:tensorflow:serving:runtime_latency_bucket{model_name="test_model",API="Predict",runtime="TF1",le="18"} 0
:tensorflow:serving:runtime_latency_bucket{model_name="test_model",API="Predict",runtime="TF1",le="32.4"} 0
...

We cannot collect metrics per signature, even when the latencies of different signatures vary widely.

Related code:
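To make the gap concrete, here is a small stdlib-only sketch (illustrative, not TF Serving's actual metrics code) that parses one of the exposition-format lines above and lists the labels TF Serving currently attaches. Note that `model_name` is the only identifying label; there is no signature-level label.

```python
import re

def parse_labels(sample_line):
    """Return the label dict from a Prometheus text-format sample line."""
    match = re.search(r"\{(.*)\}", sample_line)
    if not match:
        return {}
    return dict(re.findall(r'(\w+)="([^"]*)"', match.group(1)))

line = ':tensorflow:serving:request_latency_bucket{model_name="test_model",API="predict",entrypoint="REST",le="10"} 0'
print(parse_labels(line))
# No signature-level label is present, only model_name.
```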

Describe the solution

It would be better if runtime latency and request latency were recorded with signature names.
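A minimal sketch of what a per-signature bucket line might look like, assuming a hypothetical `signature_name` label (the label name and helper function below are illustrative, not what PR #2152 necessarily implements):

```python
def format_latency_sample(model_name, signature_name, api, runtime, le, value):
    """Render one histogram bucket line in Prometheus text format,
    with a hypothetical per-signature label added."""
    labels = (
        f'model_name="{model_name}",signature_name="{signature_name}",'
        f'API="{api}",runtime="{runtime}",le="{le}"'
    )
    return f":tensorflow:serving:runtime_latency_bucket{{{labels}}} {value}"

print(format_latency_sample("test_model", "serving_default", "Predict", "TF1", "10", 0))
```

With such a label, Prometheus queries could aggregate or compare latency by signature, e.g. `sum by (signature_name) (...)`.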

Describe alternatives you've considered

Additional context

@sanatmpa1 sanatmpa1 self-assigned this Jan 3, 2022
@sanatmpa1 sanatmpa1 assigned bmzhao and unassigned sanatmpa1 Jan 3, 2022
@singhniraj08 singhniraj08 assigned nniuzft and unassigned bmzhao Feb 17, 2023
@singhniraj08 singhniraj08 self-assigned this Jun 8, 2023
@singhniraj08

@jeongukjae,

Are you still looking for a resolution? We are planning to prioritise issues based on community interest. Please let us know whether this issue still persists with the latest TF Serving 1.12.1 release so that we can work on fixing it. Thank you for your contributions.

jeongukjae (Author)

@singhniraj08 I opened PR #2152 for this issue.
I think that patch is sufficient. Could you review it?


singhniraj08 commented Jun 16, 2023

@jeongukjae, thank you for your contributions. We will discuss this internally and update this thread.

jeongukjae (Author)

@singhniraj08 Thank you.

I also opened a similar issue: #2157.
Could you discuss that one internally as well?

6 participants