
linalg.norm has mismatched convention for complex numbers #666

@jcmgray

Description


Hi all, thanks for such a still-useful project.

The convention adopted in autograd for complex numbers is generally that parameters can be updated along -conj(g) to minimize a real-valued function. linalg.norm follows the opposite convention (its gradient must be used without the conjugate), which prevents it from being combined with other functions in the same code.

This was mentioned in #393 alongside some potential fixes.
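For concreteness, here is a minimal numpy-only sketch of the convention being described. The analytic gradient g = conj(x - y) / ||x - y|| is my own hand derivation for this particular loss (not autograd output): under autograd's convention, stepping along -conj(g) should monotonically decrease the loss.

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(size=10) + 1j * rng.normal(size=10)
y = rng.normal(size=10) + 1j * rng.normal(size=10)

def loss(x):
    return np.linalg.norm(x - y)

def grad(x):
    # hand-derived gradient of ||x - y|| in autograd's convention:
    # g = conj(x - y) / ||x - y||, so that -conj(g) is a descent direction
    d = x - y
    return np.conj(d) / np.linalg.norm(d)

vals = []
for _ in range(10):
    vals.append(loss(x))
    x = x - 0.05 * np.conj(grad(x))
```

Each step here shortens x - y along itself, so the recorded values decrease strictly; this is the behavior foo2 below reproduces and foo does not.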

Minimal reproducer:

import numpy as np
import autograd.numpy as anp
from autograd import value_and_grad

rng = np.random.default_rng(42)

x = rng.normal(size=10) + 1j * rng.normal(size=10)
y = rng.normal(size=10) + 1j * rng.normal(size=10)

def foo(x, y):
    # simple loss function
    return anp.linalg.norm(x - y)

def foo2(x, y):
    # alt impl
    d = x - y
    return anp.sum(anp.abs(d)**2)**0.5

# use foo2 for correct behavior
foo_vag = value_and_grad(foo)

for _ in range(10):
    v, g = foo_vag(x, y)
    # or remove conj here to make foo work
    x -= 0.05 * np.conj(g)
    # value should initially decrease
    print(v)
# 5.761837589211455
# 5.785341294273737
# 5.8095148223718915
# 5.834344369254469
# 5.8598161062729055
# 5.885916205539433
# 5.912630863722038
# 5.939946324450408
# 5.967848899318491
# 5.996324987479975
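As a cross-check of which convention foo2's working update corresponds to, a numpy-only finite-difference comparison can be used. The identity conj(g)_i = ∂f/∂Re(x_i) + i ∂f/∂Im(x_i) is my framing of autograd's convention (not something autograd prints), with g = conj(x - y) / ||x - y|| again hand-derived:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=4) + 1j * rng.normal(size=4)
y = rng.normal(size=4) + 1j * rng.normal(size=4)

def f(x):
    return np.linalg.norm(x - y)

# hand-derived gradient in autograd's convention
g = np.conj(x - y) / f(x)

# central finite differences over the real coordinates (Re x_i, Im x_i)
eps = 1e-6
fd = np.empty_like(x)
for i in range(x.size):
    e = np.zeros_like(x)
    e[i] = eps
    d_re = (f(x + e) - f(x - e)) / (2 * eps)
    d_im = (f(x + 1j * e) - f(x - 1j * e)) / (2 * eps)
    fd[i] = d_re + 1j * d_im

# conj(g) matches the real-coordinates gradient,
# so -conj(g) is the steepest-descent direction
assert np.allclose(np.conj(g), fd, atol=1e-6)
```

Under the mismatched convention reported here, linalg.norm's gradient would instead match fd directly, without the conjugate.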

Tested with autograd v1.7.0.
