Support "mask" parameter passed as 4th input of DeformableConv2D node. #24347
Conversation
deformable_groups)};
} else {
    FRONT_END_GENERAL_CHECK(false, "Invalid number of inputs");
}
}
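The branch above dispatches on the ONNX node's input count; with this change, a 4th input is accepted as the mask. A minimal Python sketch of that dispatch logic (hypothetical names, illustrative only; the real logic lives in the C++ ONNX frontend and uses FRONT_END_GENERAL_CHECK for validation):

```python
def build_deformable_conv(inputs):
    """Sketch of input-count dispatch for a DeformableConv2D node.

    Hypothetical illustration, not the actual OpenVINO frontend code.
    """
    if len(inputs) == 3:
        data, offsets, filters = inputs
        return {"op": "DeformableConv2D", "data": data,
                "offsets": offsets, "filters": filters}
    elif len(inputs) == 4:
        # New in this PR: the 4th input is forwarded as the mask.
        data, offsets, filters, mask = inputs
        return {"op": "DeformableConv2D", "data": data,
                "offsets": offsets, "filters": filters, "mask": mask}
    else:
        # Mirrors FRONT_END_GENERAL_CHECK(false, "Invalid number of inputs")
        raise ValueError("Invalid number of inputs")
```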
please add test(s) for new operation support: https://github.com/openvinotoolkit/openvino/tree/master/src/frontends/onnx/tests
I pushed a changeset adding a test for DeformableConv2D with a mask input. It includes the new test case deformable_conv_2d_with_mask.prototxt. Let me share my verification results.
Results of ov_onnx_frontend_tests
After building OpenVINO with the -DENABLE_TESTS=ON CMake flag, I verified that ov_onnx_frontend_tests passed as follows:
$ ../bin/intel64/Release/ov_onnx_frontend_tests --gtest_filter=*CPU*deform*
Running main() from src/frontends/tests/frontend/shared/gtest_main_manifest/main.cpp:20
Note: Google Test filter = *CPU*deform*:-:ONNXLoadTest/FrontEndLoadFromTest.testLoadFromTwoFiles/onnx:ONNXLoadTest/FrontEndLoadFromTest.testLoadFromTwoStreams/onnx
[==========] Running 2 tests from 1 test suite.
[----------] Global test environment set-up.
[----------] 2 tests from IE_CPU
[ RUN ] IE_CPU.onnx_model_deformable_conv_2d
[ INFO ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[ OK ] IE_CPU.onnx_model_deformable_conv_2d (34 ms)
[ RUN ] IE_CPU.onnx_model_deformable_conv_2d_with_mask
[ INFO ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[ OK ] IE_CPU.onnx_model_deformable_conv_2d_with_mask (10 ms)
[----------] 2 tests from IE_CPU (44 ms total)
[----------] Global test environment tear-down
[==========] 2 tests from 1 test suite ran. (44 ms total)
[ PASSED ] 2 tests.
As the above shows, the newly added test case onnx_model_deformable_conv_2d_with_mask passed.
Just to be sure, I also verified that the original implementation, without my pull request, failed as follows:
$ ../bin/intel64/Release/ov_onnx_frontend_tests --gtest_filter=*CPU*deform*
Running main() from src/frontends/tests/frontend/shared/gtest_main_manifest/main.cpp:20
Note: Google Test filter = *CPU*deform*:-:ONNXLoadTest/FrontEndLoadFromTest.testLoadFromTwoFiles/onnx:ONNXLoadTest/FrontEndLoadFromTest.testLoadFromTwoStreams/onnx
[==========] Running 2 tests from 1 test suite.
[----------] Global test environment set-up.
[----------] 2 tests from IE_CPU
[ RUN ] IE_CPU.onnx_model_deformable_conv_2d
[ INFO ] Verifying match of <= 22 mantissa bits (24 bits precision - 2 tolerance). 0 value(s) below min_signal: 0 Loosest match found is 24 mantissa bits.
[ OK ] IE_CPU.onnx_model_deformable_conv_2d (34 ms)
[ RUN ] IE_CPU.onnx_model_deformable_conv_2d_with_mask
unknown file: Failure
C++ exception with description "vector::_M_range_check: __n (which is 2) >= this->size() (which is 2)" thrown in the test body.
[ FAILED ] IE_CPU.onnx_model_deformable_conv_2d_with_mask (10 ms)
[----------] 2 tests from IE_CPU (44 ms total)
[----------] Global test environment tear-down
[==========] 2 tests from 1 test suite ran. (44 ms total)
[ PASSED ] 1 test.
[ FAILED ] 1 test, listed below:
[ FAILED ] IE_CPU.onnx_model_deformable_conv_2d_with_mask
1 FAILED TEST
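The exception above is a std::vector bounds-check failure: without the fix, the converter reads an input slot that does not exist. A rough Python analogue (illustrative only; the container contents here are assumptions, not the actual frontend internals):

```python
# Pre-fix behavior, sketched: the converter prepared a fixed number of input
# slots and then indexed past them, the way std::vector::at() raises the
# "__n >= this->size()" range-check error in C++.
prepared_inputs = ["offsets_result", "filters_result"]  # size 2 (assumed)

try:
    extra = prepared_inputs[2]  # index 2 >= size 2 -> IndexError
except IndexError:
    extra = None  # in C++ this surfaces as the uncaught exception above
```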
Results of pytest ./src/frontends/onnx/tests/tests_python
I also verified that pytest ./src/frontends/onnx/tests/tests_python passed, although some tests were reported as xfailed.
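For context on the xfailed status: pytest counts a test marked as an expected failure separately from real failures. A stdlib-only sketch of that semantics (not pytest's actual implementation):

```python
def run_expected_failure(test_fn):
    """Mimic pytest's xfail outcome classification (illustrative sketch)."""
    try:
        test_fn()
    except Exception:
        return "xfailed"   # failed as expected: reported, but not an error
    return "xpassed"       # unexpectedly passed: flagged for attention


def unsupported_op_test():
    # Stands in for a test exercising an operator the backend lacks.
    raise NotImplementedError("operator not supported")
```

Under this semantics, the xfailed entries in the summary line below are expected failures and do not indicate a regression.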
Hi @murase-masamitsu, great job! Btw, did you capture the tests_python output? Why is it still xfailed?
Dear @gkrivor,
I'm sorry for the confusion.

> Why is it still xfailed?

These xfailed tests are not related to this pull request; they also occur without it. The test added in this pull request is part of ov_onnx_frontend_tests and does not affect the pytest results. I verified pytest ./src/frontends/onnx/tests/tests_python just in case.
> Btw, did you capture the tests_python output?

Let me share the result of pytest ./src/frontends/onnx/tests/tests_python.
$ pytest ./src/frontends/onnx/tests/tests_python
============================================= test session starts ==============================================
platform linux -- Python 3.9.2, pytest-8.0.2, pluggy-1.5.0
rootdir: /app
collected 3224 items
src/frontends/onnx/tests/tests_python/test_backend.py .s.s.s.s.sxsxsxsxs.s.s.sxsxsxsxsxsxsxsxsxsxsxsxsxs [ 1%]
xs.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s [ 4%]
xs.s.s.s.s.s.s.s.s.s.sxsxsxsxsxs.s.sxsxs.sxs.sxsssssssssssss.s.s.s.s.s.s.s.s.s.sss.s.sxs.s.s.sss.s.s.sss [ 8%]
.s.s.s.s.s.s.s.s.sxsxsxsxs.sxsxsxsxsxsxsxsxs.s.s.sxsxsxsxsxsxsxsxsxsxsxsxsxsxs.s.s.s.s.s.s.s.s.s.sxsxsxs [ 11%]
xsxsxsxsxs.s.s.s.s.s.sxsxsxsxsxsxsxsxsxsxsxsxs.s.s.s.s.s.sxsxsxsxs.s.sxsss.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s [ 14%]
.s.s.s.s.s.s.s.s.sxsxsxsxsxs.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.sxs.sss.s.s.s.s.s.s.s.s.s.s.sxs.s.s.s.s.s.s [ 17%]
.s.s.s.s.s.s.s.s.s.s.sxsxs.s.s.s.sxsxsxsxsxsxsxsxsxsxsxsxs.s.s.s.s.s.s.s.s.s.sxsxs.s.s.s.s.sxs.s.s.s.s.s [ 20%]
.s.s.s.s.s.s.sxsxs.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.sxs.s.s.s.s.s.s.sss.s.s.s.s.s.s.s.s.s.s [ 24%]
.s.s.s.s.s.s.s.s.s.s.s.s.s.s.sssssssssssssss.sss.s.s.s.sssssssss.s.sxs.sxsxs.s.s.s.s.s.s.s.s.s.s.s.s.s.s [ 27%]
.s.s.s.s.s.s.s.s.s.s.s.s.sxsxs.sxsxsxsxsxsxsxsxsxsxsxs.s.s.sss.s.s.s.sss.s.sss.sxsss.sxsss.s.sss.s.sss.s [ 30%]
xsss.sxsss.sxsss.sxsss.s.sss.s.sss.sxsss.sxsss.sxsss.sxsss.sxsss.sxsss.s.sss.sxs.s.s.s.s.s.s.s.s.s.s.s.s [ 33%]
.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.sxsxsxsxsxsxsxsxsxsxs.s.sxs.s.sxs.s.s.s.s.s.s.s.s.s.s.s.s [ 37%]
.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.sxsxs.s.s.sxs.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s [ 40%]
.s.s.s.s.s.s.sxsxs.s.s.s.s.s.sxs.s.sxs.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s [ 43%]
.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.sxs.s.s.sxsxsxsxsxsxsxsxsxsxsxs.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s [ 46%]
.s.s.s.s.s.s.s.sxsxs.s.s.s.s.s.s.s.s.s.sxs.sxs.sxs.sxs.sxs.sxs.sxs.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s [ 49%]
xsxs.s.sxsxs.s.s.s.s.s.sxs.sxs.sxsssxs.sxs.sxs.sxs.s.s.s.s.s.s.s.s.s.s.s.s.s.sxsxsxsxsxsxs.s.s.s.s.s.s.s [ 53%]
.s.s.s.s.sxsxsxsxsxsxsxs.s.s.s.s.s.s.s.s.s.s.s.s.s.s.sxs.sxs.sxs.sxs.sxs.sxs.sxs.s.sxsxsxs.s.sxs.s.s.s.s [ 56%]
.s.s.s.s.sxsxsxs.sxsxs.sxs.sxs.sxs.s.sxsxsxsxsxsxs.s.s.s.s.sxsxsxs.s.sxsxs.s.s.sxs.s.s.s.s.s.sxs.s.s.s.s [ 59%]
.s.s.s.s.s.s.sxs.sxsxsxs.s.s.s.s.s.s.s.s.s.s.s.sss.sss.sss.sss.sxs.sxs.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s [ 62%]
.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.sxsxsxsxsxsxsxsxsxsxsxsxsxsxs.s.s.s.s [ 66%]
.s.s.s.s.s.s.s.s.s.s.s.s.sxs.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s [ 69%]
.s.s.s.s.s.s.s.s.sxsxs.s.s.s.s.s.sxsxsxs.sss.sssssss.s.s.s.sssssxsxsxsxsxsxsxsxsxsxsxsxsxsxsxsxsxsxsxs.s [ 72%]
.s.s.s.s.s.s.s.s.s.sxsxsxsxsxsxsxs.s.s.s.s.s.sss.s.s.s.sxsxsxsxsxsxs.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s [ 75%]
ss.s.s.s.s.s.s.s.s.s.s.sxsxsxsxsxsxsxs.s.s.sxs.s.s.s.s.s.s.s.s.s.s.s.sxsxsxsxsxsxsxsxsxsxs.s.s.sxsxsxsxs [ 78%]
xsxsxs.sxsxsxs.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s [ 82%]
.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s.s [ 85%]
.s.s.s.s.s.s.s.s.s.s.s.s.s.s.sssssssssssssssssss [ 86%]
src/frontends/onnx/tests/tests_python/test_frontend_extension.py sss [ 87%]
src/frontends/onnx/tests/tests_python/test_frontend_lib_close.py .. [ 87%]
src/frontends/onnx/tests/tests_python/test_frontend_onnx.py ........................ss. [ 87%]
src/frontends/onnx/tests/tests_python/test_frontend_onnx_editor.py ..................................... [ 89%]
............................... [ 90%]
src/frontends/onnx/tests/tests_python/test_frontendmanager.py ..ssssssssssssssssssssssssssssssssssssssss [ 91%]
ssssssss [ 91%]
src/frontends/onnx/tests/tests_python/test_onnx_external_data.py .. [ 91%]
src/frontends/onnx/tests/tests_python/test_onnx_import.py .. [ 91%]
src/frontends/onnx/tests/tests_python/test_ops_batchnorm.py . [ 91%]
src/frontends/onnx/tests/tests_python/test_ops_binary.py ............ [ 92%]
src/frontends/onnx/tests/tests_python/test_ops_convpool.py ............. [ 92%]
src/frontends/onnx/tests/tests_python/test_ops_logical.py ....... [ 92%]
src/frontends/onnx/tests/tests_python/test_ops_matmul.py ................... [ 93%]
src/frontends/onnx/tests/tests_python/test_ops_nonlinear.py ............... [ 93%]
src/frontends/onnx/tests/tests_python/test_ops_random.py .. [ 93%]
src/frontends/onnx/tests/tests_python/test_ops_reduction.py .....ssssssssssssssssssssssssssss.......ssss [ 95%]
ssssssssssssssssssssssssssss........sssssssssssssssss.. [ 96%]
src/frontends/onnx/tests/tests_python/test_ops_reshape.py ..x..........xx....... [ 97%]
src/frontends/onnx/tests/tests_python/test_ops_unary.py ................................................ [ 99%]
.......................... [ 99%]
src/frontends/onnx/tests/tests_python/test_ops_variadic.py .... [100%]
=============================== 1306 passed, 1597 skipped, 321 xfailed in 29.08s ===============================
Could you add "-v" to the command line to get additional messages? I need to find out why it still fails.
I'm sorry for the inconvenience.

> Unfortunately it isn't a verbose log... Are you sure you've added -v to cmdline?

I ran pytest -v src/frontends/onnx/tests/tests_python and got the following output, as attached in the previous comment:
======================================================================================================= test session starts =======================================================================================================
platform linux -- Python 3.9.2, pytest-8.0.2, pluggy-1.5.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /app
collected 3224 items
src/frontends/onnx/tests/tests_python/test_backend.py::OnnxBackendNodeModelTest::test_abs_cpu PASSED [ 0%]
src/frontends/onnx/tests/tests_python/test_backend.py::OnnxBackendNodeModelTest::test_abs_cuda SKIPPED (Backend doesn't support device CUDA) [ 0%]
src/frontends/onnx/tests/tests_python/test_backend.py::OnnxBackendNodeModelTest::test_acos_cpu PASSED [ 0%]
src/frontends/onnx/tests/tests_python/test_backend.py::OnnxBackendNodeModelTest::test_acos_cuda SKIPPED (Backend doesn't support device CUDA) [ 0%]
src/frontends/onnx/tests/tests_python/test_backend.py::OnnxBackendNodeModelTest::test_acos_example_cpu PASSED [ 0%]
... snip ...
Isn't this a verbose log?
I also saved the output of pytest -vv src/frontends/onnx/tests/tests_python to get a more detailed log. I attach it as tests_python_vv.log.
I hope this helps!
Please, try "-v --log-cli-level=10"
Thank you for the reply.
I ran pytest -v --log-cli-level=10 src/frontends/onnx/tests/tests_python
and got tests_python_log_cli_level_10.log.
Try this, please:
pytest src\frontends\onnx\tests\tests_python -k "deform" --runxfail
This is the output of pytest src\frontends\onnx\tests\tests_python -k "deform" --runxfail:
tests_python_deform_xfail.log
It seems that these failures are caused by an ONNX operator, DeformConv, used in the test cases. In my understanding, OpenVINO currently supports DeformableConv2D, which is a custom operator provided by OpenVINO, but does not support the native ONNX operator DeformConv. Therefore these failures are expected.
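That distinction can be summarized in a small sketch; the set below is an assumption for illustration, not the frontend's real operator registry:

```python
# Hypothetical sketch: OpenVINO's ONNX frontend maps DeformableConv2D
# (a custom, non-standard ONNX op), while the native ONNX DeformConv
# (introduced in ONNX opset 19) is a different operator it does not yet map.
DEFORMABLE_OPS_HANDLED = {"DeformableConv2D"}

def is_handled(op_type: str) -> bool:
    return op_type in DEFORMABLE_OPS_HANDLED
```

Under this assumption, tests built around DeformConv are expected to fail (hence the xfail markers), while the DeformableConv2D tests pass.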
openvino.convert_model in Python, so that DeformableConv2D can handle the 4th input as the mask parameter, which is described in DeformableConvolution-8.
Tickets:
mask input for DeformableConv2D is ignored when an ONNX model is converted into OpenVINO IR format using Python openvino.convert_model. #24346