Problem with loss on an output layer in a subnet, which is in a recursive layer #556
Comments
The test case has … The log is incomplete. Here is the complete log from Slack:
Can confirm it works without …
The original bug is not solved. It is just a coincidence that you do not run into it with proper optimization: in your case, it happens that all layers are moved out of the loop, and then the problem does not occur.
Ok. I see.
Do you get this error with the latest RETURNN version?
Defining a loss on an output layer in a subnet, which itself is in a recursive net, throws an error. See config and error log below.
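To make the reported setup concrete, here is a minimal RETURNN-style network config sketch of the situation described: a rec layer whose unit contains a subnetwork, with a loss defined on the subnet's output layer. All layer names and the choice of `softmax`/`ce` are hypothetical illustrations, not the reporter's actual config.

```python
# Hypothetical minimal sketch: a loss on an output layer in a subnet,
# which itself is inside a rec layer. Names and layer choices are
# illustrative only; the reporter's real config is not shown here.
network = {
    "rec_layer": {
        "class": "rec",
        "from": "data",
        "unit": {
            "subnet": {
                "class": "subnetwork",
                "from": "data:source",
                "subnetwork": {
                    "output": {
                        "class": "softmax",
                        "from": "data",
                        "loss": "ce",         # loss on the subnet's output layer
                        "target": "classes",  # 'classes' is the failing target below
                    },
                },
            },
            "output": {"class": "copy", "from": "subnet"},
        },
    },
}
```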
@albertz Points discussed in Slack:
In `_SubnetworkRecCell.get_output`, there is some logic to fill targets with data; see `for key in sorted(used_keys)` and the related code around `used_keys`.
If you use `'source'` in the output layer, it works, because `source` has a valid placeholder when the layer is initialized, but `classes` does not.
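The logic referenced above can be sketched roughly as follows. This is a simplified, hypothetical illustration of the described behavior, not RETURNN's actual `_SubnetworkRecCell.get_output` code; the function name `fill_used_targets` and its arguments are made up.

```python
# Rough, hypothetical sketch of the described logic: for each data key
# used inside the rec unit, fill the target with its data, but only if
# a valid placeholder exists at layer-initialization time.
def fill_used_targets(used_keys, available_placeholders):
    filled = {}
    for key in sorted(used_keys):  # deterministic iteration, as in the referenced loop
        if key in available_placeholders:
            # e.g. 'source' already has a valid placeholder at init time
            filled[key] = available_placeholders[key]
        # e.g. 'classes' may have no valid placeholder yet at init time,
        # which matches the failure mode described in this issue
    return filled
```

With only `'source'` available, `'classes'` stays unfilled, mirroring why a loss targeting `classes` fails while one using `source` works.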