nam / ProxPython · Commits · aab45f79

Commit aab45f79 authored Apr 21, 2020 by jansen31

first try for 3d compatibility, ignoring the case where the product space is higher-dimensional

parent 98cad1a1

Changes 1

proxtoolbox/Algorithms/SimpleAlgortihm.py
...
...
@@ -80,7 +80,10 @@ class SimpleAlgorithm:
 norm_data = self.norm_data
 iter = self.iter
-if u.ndim < 3:
+# TODO: select the right case also for a 3d problem.
+# This could involve using the parameter self.product_space_dimension
+if u.ndim < 3:
+    # temp note: as used in (e.g.) 2d CDI, or orbital tomography
     p = 1
     q = 1
+elif u.ndim == 3:
...
...
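The new branch dispatches on the iterate's dimensionality, with the TODO noting that `self.product_space_dimension` could disambiguate a genuine 3d iterate from a stack of 2d blocks. A minimal sketch of that dispatch as a standalone helper; the function name `select_case` and the `ndim == 3` resolution are assumptions, only the `ndim < 3` case is confirmed by the diff:

```python
def select_case(u_ndim, product_space_dimension=1):
    """Pick (p, q) block counts from the iterate's dimensionality.

    Hypothetical helper: ndim < 3 is the plain 2d case (e.g. 2d CDI,
    orbital tomography); for ndim == 3 the TODO in the diff suggests
    consulting product_space_dimension, which is what we assume here.
    """
    if u_ndim < 3:
        return 1, 1  # single 2d iterate
    if u_ndim == 3 and product_space_dimension > 1:
        return product_space_dimension, 1  # stack of 2d blocks (assumption)
    return 1, 1  # genuine 3d iterate (assumption)
```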
@@ -128,20 +131,15 @@ class SimpleAlgorithm:
 tmp_gap = 0
 tmp_shadow = 0
-if p == 1 and q == 1:
-    # tmp_change = (norm(u - u_new, 'fro') / norm_data) ** 2
+# the simple case, where everything can be calculated in one difference
+if self.product_space_dimension == 1 or (p == 1 and q == 1):
     tmp_change = phase_offset_compensated_norm(u, u_new, norm_type='fro', norm_factor=norm_data) ** 2
     if self.diagnostic:
-        # For Douglas-Rachford,in general it is appropriate to monitor the
-        # SHADOWS of the iterates, since in the convex case these converge
-        # even for beta=1.
-        # (see Bauschke-Combettes-Luke, J. Approx. Theory, 2004)
-        # tmp_shadow = (norm(u2 - shadow, 'fro') / norm_data) ** 2
-        # tmp_gap = (norm(u1 - u2, 'fro') / norm_data) ** 2
+        # For Douglas-Rachford, in general it is appropriate to monitor the SHADOWS of the iterates,
+        # since in the convex case these converge even for beta=1. (see Bauschke-Combettes-Luke,
+        # J. Approx. Theory, 2004)
         tmp_shadow = phase_offset_compensated_norm(u2, shadow, norm_factor=norm_data, norm_type='fro') ** 2
         tmp_gap = phase_offset_compensated_norm(u1, u2, norm_factor=norm_data, norm_type='fro') ** 2
     if hasattr(self, 'truth'):
         if self.truth_dim[0] == 1:
             z = u1[0, :]
...
...
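The `phase_offset_compensated_norm` calls replace the commented-out expressions of the form `norm(truth - exp(-1j * angle(trace(truth.T * z))) * z, 'fro') / norm_truth`, i.e. a normalized Frobenius distance after removing the best global phase between the two arrays. A hedged sketch of what the helper plausibly computes; the name and keyword arguments come from the diff, the body is an assumption and only the `'fro'` case is handled:

```python
import numpy as np

def phase_offset_compensated_norm(a, b, norm_type='fro', norm_factor=1.0):
    # Frobenius distance between a and b after aligning b's global phase
    # with a, i.e. min over theta of ||a - exp(-1j*theta) * b||_F.
    inner = np.vdot(a, b)                  # <a, b> = sum(conj(a) * b)
    phase = np.exp(-1j * np.angle(inner))  # optimal global phase factor
    return np.linalg.norm(a - phase * b) / norm_factor
```

With this definition the norm is zero whenever the arrays differ only by a global phase, which is exactly the invariance the commented-out `Relerrs` formula encodes.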
@@ -149,12 +147,10 @@ class SimpleAlgorithm:
             z = u1[:, 0]
         else:
             z = u1
-        # Relerrs[iter] = norm(self.truth - exp(-1j * angle(trace(self.truth.T * z))) * z,
-        #                      'fro') / self.norm_truth
         Relerrs[iter] = phase_offset_compensated_norm(self.truth, z, norm_factor=self.norm_truth, norm_type='fro')
 elif q == 1:
     # more complex: loop over product space dimension
     for j in range(self.product_space_dimension):
         tmp_change = tmp_change + (norm(u[:, :, j] - u_new[:, :, j], 'fro') / norm_data) ** 2
         if self.diagnostic:
...
...
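The `elif q == 1` branch accumulates the change block-by-block over the product-space axis instead of in one difference. A minimal sketch of that accumulation, factored into a hypothetical standalone function (the name `accumulated_change` is illustrative; the loop body mirrors the diff):

```python
import numpy as np

def accumulated_change(u, u_new, norm_data):
    # Sum squared, normalized Frobenius distances over the product-space
    # blocks stored along the last axis, as in the diff's loop over
    # range(self.product_space_dimension).
    tmp_change = 0.0
    for j in range(u.shape[2]):
        tmp_change += (np.linalg.norm(u[:, :, j] - u_new[:, :, j]) / norm_data) ** 2
    return np.sqrt(tmp_change)  # change[iter] = sqrt(tmp_change)
```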
@@ -164,10 +160,10 @@ class SimpleAlgorithm:
 if self.diagnostic:
     if hasattr(self, 'truth'):
         z = u1[:, :, 0]
         Relerrs[iter] = norm(self.truth - exp(-1j * angle(trace(self.truth.T.transpose() * z))) * z,
                              'fro') / self.norm_truth
 else:
     # loop over product space dimension as well as additional z dimension
     if self.diagnostic:
         if hasattr(self, 'truth'):
             Relerrs[iter] = 0
...
...
@@ -184,22 +180,20 @@ class SimpleAlgorithm:
                 self.truth - exp(-1j * angle(trace(self.truth.T * u1[:, :, k, 1]))) * u1[:, :, k, 1],
                 'fro') / self.norm_truth
 # Add values to the list of change, gap and shadow_change
 change[iter] = sqrt(tmp_change)
 if self.diagnostic:
     gap[iter] = sqrt(tmp_gap)
     shadow_change[iter] = sqrt(tmp_shadow)
-    # this is the Euclidean norm of the gap to
-    # the unregularized set. To monitor the Euclidean norm of the gap to the
-    # regularized set is expensive to calculate, so we use this surrogate.
-    # Since the stopping criteria is on the change in the iterates, this
-    # does not matter.
-# graphics
+    # this is the Euclidean norm of the gap to the unregularized set. To monitor the Euclidean
+    # norm of the gap to the regularized set is expensive to calculate, so we use this surrogate.
+    # Since the stopping criterion is on the change in the iterates, this does not matter.
 # update iterate
 u = u_new
 if self.diagnostic:
-    # For Douglas-Rachford,in general it is appropriate to monitor the
-    # SHADOWS of the iterates, since in the convex case these converge
-    # even for beta=1.
+    # For Douglas-Rachford, in general it is appropriate to monitor the SHADOWS of the iterates,
+    # since in the convex case these converge even for beta=1.
     # (see Bauschke-Combettes-Luke, J. Approx. Theory, 2004)
     shadow = u2
...
...
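The tail of the diff records one diagnostic step for the shadow sequence: for Douglas-Rachford the shadows of the iterates are monitored (Bauschke-Combettes-Luke, J. Approx. Theory, 2004), and after the iterate update the shadow advances with `shadow = u2`. A hedged sketch of that bookkeeping as a standalone function; the name `monitor_step` and the packaging are illustrative only:

```python
import numpy as np

def monitor_step(u2, shadow_prev, norm_data):
    # Measure how much the shadow sequence moved this iteration
    # (normalized Frobenius distance), then advance the shadow.
    tmp_shadow = (np.linalg.norm(u2 - shadow_prev) / norm_data) ** 2
    shadow = u2  # shadow = u2, as at the end of the diff
    return np.sqrt(tmp_shadow), shadow
```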