
feat: torch 2.0 #3682

Open · wants to merge 1 commit into base: main
Conversation

@aarnphm (Member) commented Mar 17, 2023

This PR aims to bring Torch 2.0 support to save_model via torch.compile.
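A hedged sketch of the idea: torch.compile only exists on torch>=2.0, so a save-path helper needs a graceful fallback. `compile_or_passthrough` is a hypothetical name for illustration, not BentoML's actual API.

```python
import typing as t


def compile_or_passthrough(model: t.Any, **compile_kwargs: t.Any) -> t.Any:
    """Hypothetical helper: wrap `model` with torch.compile when available.

    torch.compile exists only on torch>=2.0; on older torch (or when torch
    is not installed at all) the eager model is returned unchanged.
    """
    try:
        import torch
    except ImportError:
        return model
    compile_fn = getattr(torch, "compile", None)
    if compile_fn is None:
        return model
    return compile_fn(model, **compile_kwargs)
```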

codecov bot commented Mar 17, 2023

Codecov Report

Merging #3682 (7ab3e55) into main (bcc10ac) will decrease coverage by 0.04%.
The diff coverage is 0.00%.


@@            Coverage Diff             @@
##             main    #3682      +/-   ##
==========================================
- Coverage   31.74%   31.71%   -0.04%     
==========================================
  Files         149      149              
  Lines       12149    12162      +13     
  Branches     2001     2003       +2     
==========================================
  Hits         3857     3857              
- Misses       8008     8021      +13     
  Partials      284      284              
Impacted Files                                 Coverage Δ
src/bentoml/_internal/frameworks/pytorch.py    0.00% <0.00%> (ø)

Signed-off-by: Aaron <29749331+aarnphm@users.noreply.github.com>
@aarnphm aarnphm marked this pull request as ready for review March 17, 2023 14:08
@aarnphm aarnphm requested a review from a team as a code owner March 17, 2023 14:08
@aarnphm aarnphm requested review from larme and removed request for a team March 17, 2023 14:08
Comment on lines +47 to +55
class ModelOptions(PartialKwargsModelOptions):
    fullgraph: bool = False
    dynamic: bool = False
    backend: t.Union[str, t.Callable[..., t.Any]] = "inductor"
    mode: t.Optional[str] = None
    options: t.Optional[t.Dict[str, t.Union[str, int, bool]]] = None
    disable: bool = False


Member

First, let’s call this PytorchOptions, and only when importing it into bentoml.pytorch do we rename it to ModelOptions.

Second, I think it may be better to have a single compile_kwargs dict in PytorchOptions instead of polluting PytorchOptions’s namespace. Maybe we will have only 2 entries:

  • enable_compile
  • compile_kwargs
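The two-field proposal above might look like this minimal sketch; the field names follow the bullets, but the defaults and the plain-dataclass shape are assumptions, not the merged API.

```python
import typing as t
from dataclasses import dataclass, field


@dataclass
class PytorchOptions:
    # Sketch of the proposed two-entry options class; defaults are assumed.
    enable_compile: bool = False
    compile_kwargs: t.Dict[str, t.Any] = field(default_factory=dict)
```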

def load_model(
    bentoml_model: str | Tag | Model,
    device_id: t.Optional[str] = "cpu",
    **compile_kwargs: t.Any,
Member

I don’t think we need to compile at the load_model level. Let’s just save the original model and run torch.compile when initializing the runner (if the user sets enable_compile=True).

Comment on lines +211 to +223
opts = t.cast(ModelOptions, bento_model.info.options)
if get_pkg_version("torch") >= "2.0.0":
    _load_model = partial(
        load_model,
        fullgraph=opts.fullgraph,
        dynamic=opts.dynamic,
        backend=opts.backend,
        mode=opts.mode,
        options=opts.options,
        disable=opts.disable,
    )
else:
    _load_model = load_model
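One caveat in the snippet above (not raised in the review): get_pkg_version returns a string, and comparing version strings lexicographically misorders them, e.g. "10.0.0" < "2.0.0" as strings. A stdlib-only numeric comparison, assuming plain X.Y.Z version strings, could look like:

```python
def torch_is_at_least(version: str, minimum: str = "2.0.0") -> bool:
    """Compare dotted version strings numerically, not lexicographically."""
    def parse(v: str) -> tuple:
        # Drop local-version suffixes such as "+cu118" before splitting.
        return tuple(int(part) for part in v.split("+")[0].split("."))
    return parse(version) >= parse(minimum)
```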
Member
@larme larme Mar 19, 2023


I think we can just do model = load_model(…) and then torch.compile(model, **compile_kwargs) if enable_compile is True.
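That runner-init flow could be sketched as below; init_model and the option names are illustrative (following the enable_compile / compile_kwargs proposal above), not the merged implementation.

```python
import typing as t


def init_model(
    load_model_fn: t.Callable[[], t.Any],
    enable_compile: bool = False,
    compile_kwargs: t.Optional[t.Dict[str, t.Any]] = None,
) -> t.Any:
    """Load the eager model, then optionally wrap it with torch.compile."""
    model = load_model_fn()
    if enable_compile:
        # Deferred import: users on torch<2.0 never reach torch.compile.
        import torch

        model = torch.compile(model, **(compile_kwargs or {}))
    return model
```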

@bojiang bojiang self-requested a review March 23, 2023 06:43
Labels: None yet
Projects: None yet
2 participants