
Expose the ONNX Runtime option for setting the number of threads #5962

Merged: 5 commits into dotnet:main on Oct 12, 2021

Conversation

@yaeldekel commented on Oct 7, 2021

When this option is not specified, ONNX Runtime determines the value from the OS. When running in a Docker container, however, the number of threads is derived from the number of cores on the machine rather than the number of cores allocated to the container, which slows down performance.
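For illustration, a minimal usage sketch of the new thread parameters, assuming the simple ApplyOnnxModel overload with named arguments; the model file and column names are placeholders, and the exact overload shape may differ from what was merged:

```csharp
using Microsoft.ML;

var mlContext = new MLContext();

// "model.onnx", "input", and "output" are placeholder names for illustration only.
var pipeline = mlContext.Transforms.ApplyOnnxModel(
    outputColumnName: "output",
    inputColumnName: "input",
    modelFile: "model.onnx",
    gpuDeviceId: null,
    fallbackToCpu: false,
    interOpNumThreads: 1,   // threads used to parallelize execution across operators
    intraOpNumThreads: 4);  // threads used to parallelize execution within an operator
```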

bool fallbackToCpu = false,
int? interOpNumThreads = default,
int? intraOpNumThreads = default)
=> new OnnxScoringEstimator(CatalogUtils.GetEnvironment(catalog), new[] { outputColumnName }, new[] { inputColumnName },
Review comment (Member):
This is where the API compat check is failing. You will either need to make a new method here, or make an Onnx options class that encapsulates these.

@eerhardt do you think an onnx options class would be better?

Reply (Member):

See #1798. The established pattern is:

  1. A "simple" public API that takes all the required arguments.
  2. An "advanced" overload that takes the required arguments plus an Options class.

And if (1) is sufficient, we can opt out of (2).

So, yes, I believe the new options should force us to create the Options class here.
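A rough sketch of that simple/advanced pattern, assuming an OnnxOptions shape that may not match the class actually added by this PR:

```csharp
using Microsoft.ML;
using Microsoft.ML.Transforms.Onnx;

// Hypothetical sketch of the simple + advanced pattern from #1798; the field names here
// are illustrative and may differ from the merged OnnxOptions class.
public sealed class OnnxOptions
{
    public string OutputColumnName;
    public string InputColumnName;
    public string ModelFile;
    public int? GpuDeviceId;
    public bool FallbackToCpu;
    public int? InterOpNumThreads;
    public int? IntraOpNumThreads;
}

public static class OnnxCatalogExtensions
{
    // "Advanced" overload: the required catalog argument plus the options bag,
    // delegating to the simple overload shown in the diff above.
    public static OnnxScoringEstimator ApplyOnnxModel(this TransformsCatalog catalog, OnnxOptions options)
        => catalog.ApplyOnnxModel(
            outputColumnName: options.OutputColumnName,
            inputColumnName: options.InputColumnName,
            modelFile: options.ModelFile,
            gpuDeviceId: options.GpuDeviceId,
            fallbackToCpu: options.FallbackToCpu,
            interOpNumThreads: options.InterOpNumThreads,
            intraOpNumThreads: options.IntraOpNumThreads);
}
```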

/// ]]>
/// </format>
/// </example>
public static OnnxScoringEstimator ApplyOnnxModel(this TransformsCatalog catalog,
Review comment (Member):

Can we get a test added for the new API?
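A hypothetical shape for such a test (the model file, column names, and test class below are placeholders; the test actually added to OnnxTransformTests.cs may differ):

```csharp
using System.Collections.Generic;
using Microsoft.ML;
using Microsoft.ML.Data;
using Xunit;

public class OnnxThreadOptionsTests
{
    // Placeholder input schema; a real test would match the schema of an ONNX model
    // shipped with the test assets.
    private class ModelInput
    {
        [VectorType(3)]
        public float[] data_0 { get; set; }
    }

    [Fact]
    public void ApplyOnnxModelAcceptsThreadOptions()
    {
        var mlContext = new MLContext(seed: 1);
        var data = mlContext.Data.LoadFromEnumerable(new List<ModelInput>());

        // "model.onnx" and the column names are placeholders for a real test model.
        var pipeline = mlContext.Transforms.ApplyOnnxModel(
            outputColumnName: "output",
            inputColumnName: "data_0",
            modelFile: "model.onnx",
            gpuDeviceId: null,
            fallbackToCpu: false,
            interOpNumThreads: 1,
            intraOpNumThreads: 1);

        // Fitting should succeed and forward the thread settings to the
        // ONNX Runtime session options without throwing.
        var model = pipeline.Fit(data);
        Assert.NotNull(model);
    }
}
```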


@codecov (codecov bot) commented on Oct 8, 2021

Codecov Report

Merging #5962 (f684cc6) into main (54a3cb9) will increase coverage by 0.01%.
The diff coverage is 84.48%.

@@            Coverage Diff             @@
##             main    #5962      +/-   ##
==========================================
+ Coverage   68.29%   68.30%   +0.01%     
==========================================
  Files        1142     1143       +1     
  Lines      242803   242840      +37     
  Branches    25385    25386       +1     
==========================================
+ Hits       165825   165878      +53     
+ Misses      70296    70290       -6     
+ Partials     6682     6672      -10     
| Flag | Coverage Δ |
|---|---|
| Debug | 68.30% <84.48%> (+0.01%) ⬆️ |
| production | 62.98% <75.75%> (+0.01%) ⬆️ |
| test | 88.63% <96.00%> (+<0.01%) ⬆️ |

Flags with carried forward coverage won't be shown.

| Impacted Files | Coverage Δ |
|---|---|
| src/Microsoft.ML.OnnxTransformer/OnnxUtils.cs | 84.48% <50.00%> (-2.15%) ⬇️ |
| ...osoft.ML.OnnxTransformerTest/OnnxTransformTests.cs | 95.53% <96.00%> (+0.08%) ⬆️ |
| src/Microsoft.ML.OnnxTransformer/OnnxCatalog.cs | 100.00% <100.00%> (ø) |
| src/Microsoft.ML.OnnxTransformer/OnnxOptions.cs | 100.00% <100.00%> (ø) |
| src/Microsoft.ML.OnnxTransformer/OnnxTransform.cs | 90.92% <100.00%> (+0.10%) ⬆️ |
| test/Microsoft.ML.AutoML.Tests/AutoFitTests.cs | 89.04% <0.00%> (ø) |
| ...crosoft.ML.StandardTrainers/Optimizer/Optimizer.cs | 73.12% <0.00%> (ø) |
| ...crosoft.ML.StandardTrainers/Standard/SdcaBinary.cs | 88.13% <0.00%> (ø) |
| src/Microsoft.ML.AutoML/Sweepers/Parameters.cs | 85.59% <0.00%> (+0.84%) ⬆️ |
| ...osoft.ML.KMeansClustering/KMeansPlusPlusTrainer.cs | 92.01% <0.00%> (+1.24%) ⬆️ |

... and 2 more

@michaelgsharp (Member) commented:
/azp run

@azure-pipelines commented:
Azure Pipelines successfully started running 2 pipeline(s).

@michaelgsharp (Member) left a comment:

LGTM. I'll merge as soon as the CI is done.

@michaelgsharp merged commit 15eeef7 into dotnet:main on Oct 12, 2021
@dotnet locked as resolved and limited the conversation to collaborators on Mar 17, 2022