Commit
revert 49a96b9 due to conflicts during training
lstein committed Sep 12, 2022
1 parent 7dee9ef commit bf1beaa
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions ldm/modules/attention.py
@@ -297,9 +297,9 @@ def forward(self, x, context=None):

     def _forward(self, x, context=None):
         x = x.contiguous() if x.device.type == 'mps' else x
-        x += self.attn1(self.norm1(x))
-        x += self.attn2(self.norm2(x), context=context)
-        x += self.ff(self.norm3(x))
+        x = self.attn1(self.norm1(x)) + x
+        x = self.attn2(self.norm2(x), context=context) + x
+        x = self.ff(self.norm3(x)) + x
         return x
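The revert replaces the in-place `x += ...` form with the out-of-place `x = ... + x` form. The commit message says only "conflicts during training", but a plausible reading is the usual autograd hazard: mutating a tensor in place can corrupt a value that an earlier operation saved for the backward pass, whereas out-of-place addition allocates a fresh tensor and leaves the saved one untouched. A minimal plain-Python sketch (using lists rather than PyTorch tensors, purely to illustrate the aliasing difference between `+=` and rebinding):

```python
# In-place `+=` mutates the existing object, so anything else holding
# a reference to it (e.g. a value "saved for backward") sees the change.
a = [1, 2]
saved = a          # pretend this reference was saved by an earlier op
a += [3]           # in-place: mutates the object `saved` also points to
print(saved)       # -> [1, 2, 3]: the saved value has been clobbered

# Out-of-place `b = b + [3]` rebinds `b` to a brand-new object,
# leaving the previously saved reference intact.
b = [1, 2]
saved_b = b
b = b + [3]        # out-of-place: allocates a new list
print(saved_b)     # -> [1, 2]: untouched
```

The same distinction holds for PyTorch tensors, where an in-place update of a tensor needed for gradient computation typically raises a `RuntimeError` during `backward()`.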


