
Display full message in GRPC exception log #2164

Open
Prabha-Veerubhotla opened this issue Jul 27, 2023 · 13 comments


Prabha-Veerubhotla commented Jul 27, 2023

Feature Request

If this is a feature request, please fill out the following form in full:

Describe the problem the feature is intended to solve

While using TensorFlow Serving, the exception message in the log is truncated. For example:

Caused by: io.grpc.StatusRuntimeException: INVALID_ARGUMENT: xxxx...TRUNCATED

Describe the solution

Full log messages should be displayed without any truncation.

Additional context

The TensorFlow Serving client-side maximum-inbound-message-size setting, .maxInboundMessageSize(), is set to int32max to match the server-side configuration:

builder.SetMaxMessageSize(tensorflow::kint32max);
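For reference, a minimal sketch of the matching client-side configuration in grpc-java; the host name and port here are placeholders, not values from this issue:

```java
import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;

// Hypothetical client setup; "serving-host" and 8500 are placeholders.
// Integer.MAX_VALUE mirrors the server-side tensorflow::kint32max setting.
ManagedChannel channel = ManagedChannelBuilder
        .forAddress("serving-host", 8500)
        .usePlaintext()
        .maxInboundMessageSize(Integer.MAX_VALUE)
        .build();
```

Note that this setting only governs how large a gRPC message the client will accept; it does not affect how much of an exception message the server's logging writes out, which is the truncation discussed below.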

Based on the stack trace, the exception appears to originate here:
https://github.com/grpc/grpc-java/blob/master/stub/src/main/java/io/grpc/stub/ClientCalls.java#L275

System information

  • TensorFlow Serving version: 2.11.0.5

Source code / logs

Prediction in Tensorflow serving:
https://github.com/tensorflow/serving/blob/master/tensorflow_serving/apis/prediction_service.proto#L23

@singhniraj08

@Prabha-Veerubhotla,

The TF Serving binary has C++ dependencies, and the C++ logging has a hard-coded 15K limit on log size, which causes this truncation of log messages. A typical workaround is to output every line as a separate record, but that approach can rapidly increase memory usage, which can result in a crash due to a memory-exceeded error.
Please let us know if this is completely blocking you; we can try looking for alternatives. Thank you!
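The per-record workaround mentioned above can be sketched in plain Java; ChunkedLogger, splitForLogging, and MAX_LOG_CHUNK are illustrative names, not TF Serving or gRPC APIs, and the 15K limit is taken from the comment above:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the workaround: emit one long message as multiple records,
// each below the stated 15K logging limit, so no single record is truncated.
final class ChunkedLogger {
    static final int MAX_LOG_CHUNK = 15 * 1024;

    // Split a message into chunks of at most MAX_LOG_CHUNK characters.
    static List<String> splitForLogging(String message) {
        List<String> chunks = new ArrayList<>();
        for (int i = 0; i < message.length(); i += MAX_LOG_CHUNK) {
            chunks.add(message.substring(i, Math.min(message.length(), i + MAX_LOG_CHUNK)));
        }
        return chunks;
    }

    // Log each chunk as its own record, tagged with its part number.
    static void logChunked(String message) {
        int part = 1;
        for (String chunk : splitForLogging(message)) {
            System.err.println("[part " + part++ + "] " + chunk);
        }
    }
}
```

The memory concern raised above follows from this pattern: each chunk becomes a separate record with its own overhead, so a very large message multiplies allocations rather than producing one truncated line.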

@Prabha-Veerubhotla

Thank you for looking into this, @singhniraj08. I found that this is indeed the reason for the log truncation.

Is there a command-line parameter to set a custom error message limit instead of the default 1024?

@singhniraj08

@Prabha-Veerubhotla,

Going through the complete list of available command-line flags, I couldn't find any flag or parameter for setting a custom error message limit. Thanks.

@Prabha-Veerubhotla

@singhniraj08, how about adding a new command-line argument to set a custom error message limit here?

I am currently blocked by this, as I cannot see the complete error message, and some features are failing in TF Serving.

@singhniraj08

@Prabha-Veerubhotla,

Let us discuss this feature implementation internally, and we will update this thread. Thanks.

@Prabha-Veerubhotla

Thank you, @singhniraj08.

@Prabha-Veerubhotla

Hi @bmzhao, is there an update on this?

@Prabha-Veerubhotla

@singhniraj08, were you able to follow up on this?

@Prabha-Veerubhotla

Happy to contribute a PR if there is agreement. We want this change to be part of the 2.11 release.

@Prabha-Veerubhotla

@singhniraj08, @bmzhao, any update on this issue?

@ndeepesh

ndeepesh commented Sep 2, 2023

@singhniraj08 @bmzhao Is there an update to this issue?

@asamadiya

@ndeepesh @Prabha-Veerubhotla @singhniraj08 @bmzhao
Can't we just do this? #2185

@Prabha-Veerubhotla

@asamadiya, this should work. I'm not sure whether printing a large message would raise any concerns.
The original PR was part of an internal change from the TensorFlow team: c49fd96
