Hi all, thanks for such a still-useful project.

The convention adopted in autograd for complex numbers is generally that parameters can be updated in the direction of conj(g) to minimize a function. linalg.norm follows the opposite convention (its gradient comes back as the complex conjugate of what that convention expects), which prevents it from being used consistently in code that combines it with other differentiated functions. This was mentioned in #393 alongside some potential fixes.
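For context, here's a toy sketch (my own example, using anp.abs and anp.sum, which do follow the convention) of the conj(g) update behaving as expected:

```python
import numpy as np
import autograd.numpy as anp
from autograd import grad

def loss(z):
    # real-valued loss of a complex parameter, built only from
    # primitives that follow the conj(g) convention
    return anp.sum(anp.abs(z) ** 2)

z = np.array([1.0 + 2.0j, -0.5 + 0.25j])
for _ in range(5):
    z -= 0.1 * np.conj(grad(loss)(z))  # step along -conj(g)
    print(loss(z))  # decreases monotonically
```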
Minimal reproducer:
```python
import numpy as np
import autograd.numpy as anp
from autograd import value_and_grad

rng = np.random.default_rng(42)
x = rng.normal(size=10) + 1j * rng.normal(size=10)
y = rng.normal(size=10) + 1j * rng.normal(size=10)

def foo(x, y):
    # simple loss function
    return anp.linalg.norm(x - y)

def foo2(x, y):
    # alternative implementation of the same loss
    d = x - y
    return anp.sum(anp.abs(d) ** 2) ** 0.5

foo_vag = value_and_grad(foo)  # use foo2 for correct behavior
for _ in range(10):
    v, g = foo_vag(x, y)
    # gradient step; or remove the conj here to make foo work
    x -= 0.05 * np.conj(g)
    # value should initially decrease, but it increases instead
    print(v)

# Output:
# 5.761837589211455
# 5.785341294273737
# 5.8095148223718915
# 5.834344369254469
# 5.8598161062729055
# 5.885916205539433
# 5.912630863722038
# 5.939946324450408
# 5.967848899318491
# 5.996324987479975
```
Observed with autograd v1.7.0.
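A possible workaround until this is resolved: wrap the norm in a new primitive whose VJP follows the usual convention. A minimal sketch, assuming the 2-norm of a 1-D array (norm2 is a hypothetical helper of my own, not autograd's API):

```python
import numpy as np
import autograd.numpy as anp
from autograd.extend import primitive, defvjp

@primitive
def norm2(x):
    # hypothetical helper: 2-norm of a 1-D array; autograd does not
    # trace inside a primitive, so plain NumPy is fine here
    return np.sqrt(np.sum(np.abs(x) ** 2))

# VJP written to follow the conj(g) convention (the same gradient
# foo2 produces); assumes ans != 0
defvjp(norm2, lambda ans, x: lambda g: g * anp.conj(x) / ans)
```

With this, value_and_grad(lambda x, y: norm2(x - y)) behaves like foo2 above.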