
feature(frameworks): bentoml.onnx runner accept kwargs #3561

Open · wants to merge 3 commits into main
Conversation

@larme larme commented Feb 14, 2023

What does this PR address?

Sometimes it's more natural to call an ONNX model with keyword arguments. For example, BERT's tokenizer outputs a dictionary, and the original model is called with it as model(**input). Our runner should simulate this behavior.
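An illustrative sketch (not code from this PR) of the call pattern being proposed: a tokenizer emits a dict of named inputs, and the runner accepts them as keyword arguments. The stub tokenizer and runner class here are assumptions for demonstration only; only the `use_kwargs_inputs` option name comes from the diff discussed in this thread.

```python
# Illustrative stub, NOT the PR's code: shows why kwargs calling is
# convenient for NLP models whose tokenizers return dicts of named inputs.

def fake_tokenizer(text: str) -> dict:
    # A real BERT tokenizer returns tensors; fixed lists keep this runnable.
    return {"input_ids": [101, 2023, 102], "attention_mask": [1, 1, 1]}

class FakeKwargsRunner:
    """Stand-in for a runner created with use_kwargs_inputs=True."""

    def run(self, **kwargs):
        # A real ONNX session would map each name to a model input;
        # here we just echo the received input names, sorted.
        return sorted(kwargs)

inputs = fake_tokenizer("some text")
# Mirrors model(**input): the tokenizer's dict is splatted into the call.
print(FakeKwargsRunner().run(**inputs))  # prints ['attention_mask', 'input_ids']
```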

Fixes #(issue)

Before submitting:

@larme larme requested a review from a team as a code owner February 14, 2023 00:02
@larme larme requested review from parano and removed request for a team February 14, 2023 00:02

codecov bot commented Feb 14, 2023

Codecov Report

Merging #3561 (607ec46) into main (afe9660) will not change coverage.
The diff coverage is 0.00%.

Impacted file tree graph

@@          Coverage Diff          @@
##            main   #3561   +/-   ##
=====================================
  Coverage   0.00%   0.00%           
=====================================
  Files        154     154           
  Lines      12620   12632   +12     
=====================================
- Misses     12620   12632   +12     
Impacted Files Coverage Δ
src/bentoml/_internal/frameworks/onnx.py 0.00% <0.00%> (ø)

aarnphm previously approved these changes Feb 14, 2023

aarnphm commented Feb 14, 2023

cc @larme, I ran the test locally and it passed, but it still didn't run on CI. Can you fix that as well?

        return _process_output(raw_outs)

if options.use_kwargs_inputs:

    def _run(self: ONNXRunnable, **kwargs: ONNXArgType) -> t.Any:
@bojiang (Member) commented Feb 16, 2023


Does this mean we cannot call onnx_runner.run(arg1, arg2, kwarg1=xxx), using args and kwargs at the same time?

Member:

No, I don't think so.

Contributor:

Yes, it does.

Member (Author):

Every ONNX model has a fixed list of inputs, each with its own name. Calling by args refers to each argument by its position, while calling by kwargs refers to each argument by its name. If we mix these two methods, I feel it's very easy for users to miss some arguments.

Calling by kwargs is useful when dealing with NLP models, where the tokenizer outputs a dictionary that is passed to the model as kwargs. In that use case there is no additional positional argument to be passed.
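The mutual exclusivity under discussion can be sketched as follows. This is an assumption-laden stand-in, not the PR's actual `onnx.py` code: only the `use_kwargs_inputs` flag name comes from the diff, while `make_run` and `input_names` are hypothetical. The runner binds one calling convention at creation time, so a single call cannot mix positional and keyword inputs.

```python
def make_run(input_names, use_kwargs_inputs=False):
    """Return a run function bound to exactly one calling convention."""
    if use_kwargs_inputs:

        def _run(**kwargs):
            # Map names onto the model's fixed input order; every
            # declared input must be supplied by name.
            missing = [n for n in input_names if n not in kwargs]
            if missing:
                raise TypeError(f"missing inputs: {missing}")
            return [kwargs[n] for n in input_names]

    else:

        def _run(*args):
            # Positional calling: arguments are matched by position.
            if len(args) != len(input_names):
                raise TypeError("wrong number of positional inputs")
            return list(args)

    return _run

run = make_run(["input_ids", "attention_mask"], use_kwargs_inputs=True)
print(run(attention_mask=[1, 1], input_ids=[101, 102]))
# prints [[101, 102], [1, 1]]
```

Because each `_run` accepts only `**kwargs` or only `*args`, a mixed call like `run(x, mask=y)` raises a `TypeError` instead of silently dropping an input, which is the failure mode larme describes above.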

Member:

Do we need more discussion about this?

Member (Author):

Let's discuss this at the engineering meeting.


larme commented Feb 21, 2023

@aarnphm Now the tests run on CI. The problem is that the new BERT tests are missing the PyTorch dependency (I thought installing transformers would install torch by default; I was wrong).

aarnphm previously approved these changes Feb 21, 2023

sauyon commented Aug 30, 2023

@larme @aarnphm what's the status of this one?


aarnphm commented Aug 30, 2023

This has been on the list for a while now. @larme, we should circle back and get this one merged.

Labels: none yet · Projects: none yet · Linked issues: none yet · 4 participants