Fix torch.clamp() issue microsoft#237
CryptoSalamander committed Apr 11, 2023
Parent: b720b41 · Commit: 38eba56
Showing 1 changed file with 1 addition and 1 deletion.
models/swin_transformer_v2.py (1 addition, 1 deletion)
@@ -153,7 +153,7 @@ def forward(self, x, mask=None):
 
         # cosine attention
         attn = (F.normalize(q, dim=-1) @ F.normalize(k, dim=-1).transpose(-2, -1))
-        logit_scale = torch.clamp(self.logit_scale, max=torch.log(torch.tensor(1. / 0.01))).exp()
+        logit_scale = torch.clamp(self.logit_scale, max=torch.log(torch.tensor(1. / 0.01)).item()).exp()
         attn = attn * logit_scale
 
         relative_position_bias_table = self.cpb_mlp(self.relative_coords_table).view(-1, self.num_heads)
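For context: the change works around a limitation in older PyTorch releases, where torch.clamp() only accepts Python numbers (not tensors) for its min/max bounds and otherwise fails with an error along the lines of "TypeError: clamp(): argument 'max' must be Number, not Tensor". Calling .item() on the 0-dim log tensor extracts a plain float that every version accepts. Below is a minimal standalone sketch of the before/after behavior; it uses a simplified scalar logit_scale parameter rather than the per-head (num_heads, 1, 1) parameter in the model.

import torch

# Simplified stand-in for the model's self.logit_scale parameter
# (the real one has shape (num_heads, 1, 1)).
logit_scale = torch.nn.Parameter(torch.log(10 * torch.ones(1)))

# Before the fix: the bound is a 0-dim tensor, which older torch.clamp()
# builds reject with a TypeError.
# scale = torch.clamp(logit_scale, max=torch.log(torch.tensor(1. / 0.01))).exp()

# After the fix: .item() turns the bound into a Python float, which is
# accepted by all PyTorch versions.
scale = torch.clamp(logit_scale, max=torch.log(torch.tensor(1. / 0.01)).item()).exp()
print(scale)  # after exp(), the scale is capped at 1 / 0.01 = 100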
