Reimplemented proximal gradient method (PGM).
Improved usability and switched to explicit x-dependence in the functional F.
Furthermore:
- switched to out-of-place instead of in-place tensor operations,
- implemented two different strategies for the Barzilai-Borwein step sizes,
- corrected the stopping condition for the case where the proximal operator G is not the identity,
- moved setup, backtracking initialization, and gradient clearing into the algorithm code; applications now only need to call the `__call__` method with the desired number of iterations or tolerance.
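A minimal NumPy sketch of the ingredients listed above: a proximal gradient loop with one Barzilai-Borwein step-size strategy (the "BB1" rule) and a fixed-point-residual stopping test that remains valid when the proximal operator G is not the identity. All names (`pgm`, `grad_f`, `prox_g`) are hypothetical and only illustrate the idea, not the actual API of this codebase.

```python
import numpy as np

def pgm(grad_f, prox_g, x0, n_iter=100, tol=1e-8):
    """Illustrative proximal gradient loop (not the project's real API).

    grad_f : gradient of the smooth part F, evaluated at x (explicit x-dependence)
    prox_g : proximal operator of G, called as prox_g(v, step)
    """
    x = np.asarray(x0, dtype=float).copy()
    step = 1.0
    g = grad_f(x)
    for _ in range(n_iter):
        # out-of-place update: build x_new instead of mutating x
        x_new = prox_g(x - step * g, step)
        # stopping test on the scaled fixed-point residual; this is the
        # correct criterion even when prox_g is not the identity
        if np.linalg.norm(x_new - x) <= tol * step:
            return x_new
        g_new = grad_f(x_new)
        # Barzilai-Borwein "BB1" step size: <s, s> / <s, y>
        s, y = x_new - x, g_new - g
        sy = float(s @ y)
        if sy > 0.0:
            step = float(s @ s) / sy
        x, g = x_new, g_new
    return x
```

For example, with `grad_f(x) = x - c` (i.e. F(x) = ||x - c||^2 / 2) and the identity as proximal operator, the loop converges to `c` in a few iterations.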