return_attention_weights
when set to False returns attention weights in GATv2Conv
#9319
🐛 Describe the bug
When setting the optional argument `return_attention_weights` in `GATv2Conv`'s `forward` to `False`, it still returns the attention weights. I believe the code only checks that the parameter is not `None`, not its actual value. My suggestion is to change

`return_attention_weights: Optional[bool] = None`

to just

`return_attention_weights: bool = False`

with an appropriate `if`. By running

`rg "isinstance(.*, bool)" -C 5`

in the repo, it looks like other files might have a similar issue. Let me know if this is a bug, or if I'm missing something.

Thanks
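To illustrate why passing `False` still returns the weights, here is a minimal, self-contained sketch of the suspected dispatch pattern (an assumption about the library's internals, not the actual PyG code; the string return values are placeholders for the real tensors):

```python
# Sketch of the suspected pattern: the branch tests the parameter's
# *type*, so any bool (True or False) takes the "return weights" path.
def forward(return_attention_weights=None):
    out = "node_embeddings"       # placeholder for the output tensor
    alpha = "attention_weights"   # placeholder for the attention coefficients

    # isinstance(False, bool) is True, so False does not skip this branch:
    if isinstance(return_attention_weights, bool):
        return out, alpha
    return out

print(forward())                                 # embeddings only
print(forward(return_attention_weights=True))    # embeddings + weights
print(forward(return_attention_weights=False))   # also embeddings + weights
```

Under this pattern, only `None` suppresses the attention weights, which matches the behavior described above.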
Versions
I'm using `torch-geometric 2.4.0`, but the issue is still present in the newest version of the repo.