model: pytorch: pretrained: Add support for additional layers Python API #1148
Merged
Conversation
Codecov Report
@@            Coverage Diff             @@
##           master    #1148      +/-   ##
==========================================
- Coverage   84.63%   84.60%   -0.04%
==========================================
  Files         156      156
  Lines       10180    10184       +4
  Branches     1677     1679       +2
==========================================
  Hits         8616     8616
- Misses       1215     1218       +3
- Partials      349      350       +1
Continue to review full report at Codecov.
Pinging @sakshamarora1 and/or @yashlamba for review
pdxjohnny added a commit to mhash1m/dffml that referenced this pull request on Jul 6, 2021:
Array compatibility fix with numpy
Related: intel#1147
Related: intel#1148
Co-authored-by: Hashim <hashimchaudry23@gmail.com>
Signed-off-by: John Andersen <johnandersenpdx@gmail.com>
pdxjohnny force-pushed the u5_transferlearning branch from 9da27e9 to b27c172 on July 6, 2021 17:47
This was referenced Jul 6, 2021
pdxjohnny added a commit to mhash1m/dffml that referenced this pull request on Jul 6, 2021:
Array compatibility fix with numpy
Fixes: intel#1152
Related: intel#1147
Related: intel#1148
Co-authored-by: Hashim <hashimchaudry23@gmail.com>
Signed-off-by: John Andersen <johnandersenpdx@gmail.com>
pdxjohnny force-pushed the u5_transferlearning branch 2 times, most recently from 6e6b2fd to 049a85b on July 6, 2021 17:57
Array compatibility fix with numpy Fixes: intel#1152 Related: intel#1147 Related: intel#1148 Co-authored-by: Hashim <hashimchaudry23@gmail.com> Signed-off-by: John Andersen <johnandersenpdx@gmail.com>
mhash1m force-pushed the u5_transferlearning branch from 049a85b to e90f1e7 on July 7, 2021 00:57
Previously only supplying a dict which would then be converted to PyTorch objects was supported. Now PyTorch objects can be supplied directly. Fixes: intel#1147 Related: intel#840 Related: intel#1151
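The commit message above describes the core change: a config field that previously accepted only a dict (converted to PyTorch objects internally) can now accept already-constructed objects directly. A minimal sketch of that idea follows. The names (`FakeLayer`, `resolve_layers`) are illustrative stand-ins, not dffml's actual API, and a plain class stands in for `torch.nn.Module` so the sketch needs no PyTorch install:

```python
from typing import Any

class FakeLayer:
    """Stand-in for a torch.nn.Module in this sketch."""
    def __init__(self, name: str):
        self.name = name

def resolve_layers(layers: Any) -> list:
    # If the caller supplied a dict spec, convert each entry into a
    # layer object; otherwise assume pre-built objects were passed in
    # and use them directly, untouched.
    if isinstance(layers, dict):
        return [FakeLayer(name) for name in layers]
    return list(layers)

# Both call styles yield a list of layer objects:
from_dict = resolve_layers({"linear_1": {}, "relu_1": {}})
direct = resolve_layers([FakeLayer("linear_1"), FakeLayer("relu_1")])
```

The point of the change is that both `from_dict` and `direct` end up equivalent, so users who already built layers in Python need not serialize them back into a dict.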
mhash1m force-pushed the u5_transferlearning branch from e90f1e7 to 024941d on July 7, 2021 10:17
mhash1m changed the title from "WIP: model: pytorch: Add support for additional layers Python API" to "model: pytorch: pretrained: Add support for additional layers Python API" on Jul 7, 2021
pdxjohnny added a commit to pdxjohnny/dffml that referenced this pull request on Mar 11, 2022:
Array compatibility fix with numpy
Fixes: intel#1152
Related: intel#1147
Related: intel#1148
Co-authored-by: Hashim <hashimchaudry23@gmail.com>
Signed-off-by: John Andersen <johnandersenpdx@gmail.com>
fixes: #1147

This PR is a dependency for the use case example notebook I'm currently working on, i.e. "Transfer learning".

I ended up setting `layers: Any` in the `PyTorchPreTrainedModelConfig`, and also skipping `convert_value()` in `base.py` when the field type is `Any`, since `convert_value()` tries to create an instance of the type. This seems to be working because I handle the formatting of the parameter in `pytorch_pretrained` itself, but what could be the implications of this?

It appears we don't currently have any other `Any`-typed fields. Even if we add some in the future and want to bypass the checks in `base.py`, we can simply provide multiple arguments to `isinstance()` to check against, along with `dict`, to allow the `convert_value()` process.

Alternatively, I could also create a `.yaml` inside the notebook (for the time being) if this approach doesn't seem right.

Using `Pillow==8.2.0` to temporarily fix the PyTorch errors. Seems like there's a pending fix (Make Image.__array__ take optional dtype argument, python-pillow/Pillow#5572), so this should be okay soon.
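The `convert_value()` bypass described above can be sketched as follows. This is a simplified, hypothetical version for illustration; dffml's actual `base.py` is more involved, and the function signature here is an assumption, not the real one. The key point is that `typing.Any` cannot be instantiated, so `Any`-typed fields must be passed through unchanged, while other fields can still be checked (possibly against several allowed types, by passing a tuple to `isinstance()`) before conversion:

```python
from typing import Any

def convert_value(field_type, value):
    # Any-typed fields are returned as-is: typing.Any cannot be
    # instantiated, so conversion must be skipped entirely.
    if field_type is Any:
        return value
    # Value already has the declared type (a tuple of types could
    # be checked here instead, as suggested above): pass through.
    if isinstance(value, field_type):
        return value
    # Otherwise try to construct an instance of the declared type.
    return field_type(value)

converted = convert_value(int, "3")             # normal conversion path
passthrough = convert_value(Any, {"k": "v"})    # returned unchanged
```

This mirrors the trade-off discussed above: the `Any` branch places responsibility for validating and formatting the value on the model itself (here, `pytorch_pretrained`), rather than on the shared config machinery.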