Funny behaviour of Tikhonov reconstruction dtype (complex vs real)
Setting the dtype explicitly, e.g. `rec = Tikhonov(raw0.shape, fresnel_number, alpha=alpha, betadelta=betadelta, dtype=torch.float32)`, with the idea of getting a single-precision reconstruction, yields wrong results. The `dtype` parameter is passed on to `FresnelTikhonovSMO` and therein to `FresnelTFPropagator`, which results in a real convolution kernel, although it needs to be complex. Using `dtype=torch.complex64` yields more realistic reconstructions, but the output then still has dtype `float32`. This behaviour differs from the CTF reconstruction, where `dtype` determines the output dtype. Maybe removing the `dtype` parameter for `Tikhonov` would make sense (since `complex32` is not supported and I don't see where `complex128` would be useful). Otherwise a clarifying comment in the docstring would be helpful.
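
For reference, a minimal sketch of how the two dtype choices might be compared side by side. The import path, the placeholder inputs (`raw0`, `fresnel_number`, `alpha`, `betadelta`), and the call convention `rec(raw0)` are assumptions about the setup, not taken from the package itself:

```python
import torch
# Assumed import path; adjust to wherever Tikhonov lives in this package.
from hotopy.holo import Tikhonov

# Placeholder inputs standing in for the actual hologram and parameters.
raw0 = torch.rand(512, 512)      # normalized hologram intensity
fresnel_number = 1e-3
alpha, betadelta = 1e-3, 0.0

# Requesting float32 propagates a real dtype down to FresnelTFPropagator,
# producing a real convolution kernel -> wrong reconstruction (reported bug).
rec_f32 = Tikhonov(raw0.shape, fresnel_number, alpha=alpha,
                   betadelta=betadelta, dtype=torch.float32)

# Requesting complex64 keeps the kernel complex -> plausible reconstruction,
# but the result still comes back as float32 (reported behaviour).
rec_c64 = Tikhonov(raw0.shape, fresnel_number, alpha=alpha,
                   betadelta=betadelta, dtype=torch.complex64)

# Assumed call convention: the reconstructor is applied to the hologram.
print(rec_f32(raw0).dtype)   # wrong result, dtype float32
print(rec_c64(raw0).dtype)   # realistic result, still dtype float32
```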