
@bewantbe bewantbe commented May 3, 2018

Fixes issue #370

Now the gradient of np.linalg.norm() at the zero (origin) point is the same as for np.abs: it is zero, which is one of its subgradients.
For second-order gradients, mathematically they should be +infinity, but here when ord >= 2 it returns 0 (same as np.abs()), and when 1 < ord < 2 it is NaN with plenty of warnings, which should be enough to discourage users from doing that.
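To illustrate the convention described above, here is a minimal pure-Python sketch (not the autograd implementation; the names `pnorm` and `pnorm_grad` are made up for this example) of a p-norm gradient that returns 0 at the origin, a valid subgradient, matching the choice made for np.abs:

```python
import math

def pnorm(x, p):
    # ||x||_p = (sum_i |x_i|^p)^(1/p)
    return sum(abs(v) ** p for v in x) ** (1.0 / p)

def pnorm_grad(x, p):
    # Away from the origin: d||x||_p / dx_i = sign(x_i) * |x_i|^(p-1) / ||x||_p^(p-1)
    # At the origin the gradient is undefined; we pick 0, one of the
    # subgradients, mirroring the convention used for np.abs at 0.
    n = pnorm(x, p)
    if n == 0.0:
        return [0.0] * len(x)
    return [math.copysign(abs(v) ** (p - 1), v) / n ** (p - 1) for v in x]
```

For example, `pnorm_grad([3.0, 4.0], 2)` gives `[0.6, 0.8]` (i.e. x / ||x||_2), while `pnorm_grad([0.0, 0.0], 2)` gives `[0.0, 0.0]` instead of raising a division-by-zero error.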

Also note that when x is a complex number, the gradient of the norm seems wrong according to your documentation: there should be a conj(x) in the expression. See also the comments in the committed code.
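A quick sketch of why the conj(x) appears (using Wirtinger calculus; this is my reasoning, not a quote from the autograd docs): writing the 2-norm of a complex vector as

$$\|x\|_2 = \Big(\sum_i x_i \,\overline{x_i}\Big)^{1/2},$$

and treating $x_i$ and $\overline{x_i}$ as independent variables, the Wirtinger derivative is

$$\frac{\partial \|x\|_2}{\partial x_i} = \frac{\overline{x_i}}{2\,\|x\|_2},$$

so the conjugate of the input, not the input itself, shows up in the gradient expression.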

