Replies: 8 comments
-
Indeed, we have those 3 representations, and I guess it's not very clear to newcomers. A few points:
What would you like to support in this wrapper? All of what is in …? Maybe @anjali411 or @boeddeker would like to chime in on this?
-
The wrapper would have a similar or identical interface to Torch's complex numbers, maybe with a few more operations for performance, and with functions to convert between the representations:

```python
class AnyComplex:
    def real(self): ...
    def imag(self): ...
    def abs(self): ...
    def angle(self): ...
    def conj(self): ...
    ...

    # Maybe these could also live outside the class hierarchy.
    def as_torchaudio_complex(self) -> Tensor: ...  # shape [..., 2]
    def as_asteroid_complex(self) -> Tensor: ...    # shape [..., 2 * n, ...]
    def as_torch_complex(self) -> Tensor: ...       # native complex dtype

    @classmethod
    def from_torchaudio_complex(cls, t: Tensor): ...
    @classmethod
    def from_asteroid_complex(cls, t: Tensor): ...
    @classmethod
    def from_torch_complex(cls, t: Tensor): ...


class TorchComplex(AnyComplex):
    def real(self):
        return self.tensor.real
    ...


class AsteroidComplex(AnyComplex):
    def real(self):
        # First half of the stacked real/imag dimension (pseudocode).
        return self.tensor[..., :n // 2, ...]
    ...


class TorchaudioComplex(AnyComplex):
    def real(self):
        return self.tensor[..., 0]
    ...
```
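For the Torchaudio-style `[..., 2]` layout and the native dtype, the conversion pair already exists in PyTorch as `torch.view_as_complex` / `torch.view_as_real`; a minimal sketch (free-function names hypothetical) of what the corresponding `as_` / `from_` methods could wrap:

```python
import torch

def from_torchaudio_complex(t: torch.Tensor) -> torch.Tensor:
    """Real tensor of shape [..., 2] -> native complex tensor."""
    return torch.view_as_complex(t.contiguous())

def as_torchaudio_complex(c: torch.Tensor) -> torch.Tensor:
    """Native complex tensor -> real tensor of shape [..., 2]."""
    return torch.view_as_real(c)

x = torch.randn(4, 257, 2)      # torchaudio-style complex data
c = from_torchaudio_complex(x)  # shape (4, 257), dtype complex64
assert c.dtype == torch.complex64 and c.shape == (4, 257)
assert torch.equal(as_torchaudio_complex(c), x)  # round trip is lossless
```

Both calls are views, so no data is copied; the `contiguous()` is needed because `view_as_complex` requires the last dimension to have stride 1.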
-
Yes, it would be practical to have that.
-
@jonashaag Complex tensors are in the beta stage right now, but a lot of linear algebra and autograd support has been added in the last few months, and there's more support incoming. There's native support for only …
Also, please get involved in the complex …
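A small illustration of the native API and the autograd support mentioned here, assuming a recent PyTorch build with complex autograd enabled:

```python
import torch

z = torch.tensor([1.0 + 2.0j, 3.0 - 4.0j], requires_grad=True)

mag = z.abs()    # real-valued magnitudes: [sqrt(5), 5]
ang = z.angle()  # real-valued phases

# Autograd works through complex tensors (Wirtinger calculus):
mag.sum().backward()
assert z.grad is not None and z.grad.shape == z.shape
assert torch.allclose(mag, torch.tensor([5.0 ** 0.5, 5.0]))
```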
-
Thanks for your answer and your efforts on complex support in PyTorch! I think Jonas has already spent time with the native complex support in complex_nn.py and in DCCRN and DCUNet (see …). Thanks for the links to the discussions, and also don't hesitate to ping me if you need feedback on specific things.
-
Thanks for the details @anjali411! I think BF16 and TF32 support alone would make it worth switching to a real-valued backend, at least for these models. (FP16 does not work well with some of the Asteroid models; it probably needs hand tuning for gradient stability in some places.) At the moment, none of these can be used with PyTorch anyway, so there's no point switching, but let's see how well they work once we can test them.
-
Should we do this or drop this idea?
-
I don’t know... maybe I will revisit this when/if TF32 or BF16 support has landed in PyTorch and I get access to a GPU that supports it, for faster training. (FP16 training wasn’t successful in any of my attempts.)
-
We have 3 different types of complex numbers in Asteroid: Torch "native" complex, Torchaudio complex (tensors of shape [..., 2]), and Asteroid complex (tensors of shape [..., 2 * n, ...]).
The native complex tensors have a very nice API with things like `.angle()`, but judging from my personal experience and from the number of bug reports, they are still much less stable for training than the other two representations (which are FP32 based). Another drawback I realised recently is that they don't allow for FP16, BF16, TF32, etc. training. I am not sure if this is a limitation of the way PyTorch represents these numbers in memory, or simply that it's not yet implemented.
I think there are three ways to deal with the situation: …
edit: patience until we are there.
Interested to hear your thoughts!
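To make the three layouts concrete, here is a sketch using an invented spectrogram-like shape (the exact dimension Asteroid stacks along may differ per model):

```python
import torch

n_freq = 257

# Torch "native" complex: one tensor with a complex dtype.
native = torch.randn(4, n_freq, dtype=torch.complex64)

# Torchaudio complex: real tensor with a trailing (real, imag) dim of size 2.
torchaudio_style = torch.view_as_real(native)                   # (4, 257, 2)

# Asteroid complex: real and imaginary parts concatenated along one dim.
asteroid_style = torch.cat([native.real, native.imag], dim=-1)  # (4, 2 * 257)

assert torchaudio_style.shape == (4, n_freq, 2)
assert asteroid_style.shape == (4, 2 * n_freq)
```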