Refactor: switch GradScaler import from torch.cuda.amp to torch.amp #3437


Open · wants to merge 1 commit into master

Conversation

Aaraviitkgp

Fixes #3435

Description:
I changed the import from `from torch.cuda.amp import GradScaler` to `from torch.amp import GradScaler`.
• torch.amp provides a unified AMP interface across device types.
• torch.cuda.amp restricts AMP usage to CUDA-only environments, and its GradScaler now emits a FutureWarning.
• This change helps Ignite support PyTorch's full AMP ecosystem in a clean, future-proof way.

No new functionality is added; this is a safe refactor with no effect on runtime behavior. A minimal before/after sketch follows.
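Below is a minimal sketch of the refactor, assuming PyTorch 2.4 or later, where `torch.amp.GradScaler` takes the device type as its first argument. The model, optimizer, and tensors here are illustrative placeholders, not code from the Ignite repository.

```python
import torch

# Before (CUDA-only namespace; emits a FutureWarning on recent PyTorch):
# from torch.cuda.amp import GradScaler
# scaler = GradScaler()

# After (unified AMP namespace; the device type is passed explicitly):
from torch.amp import GradScaler

use_cuda = torch.cuda.is_available()
device = "cuda" if use_cuda else "cpu"

# With enabled=False the scaler is a transparent no-op, so the same
# training loop runs unchanged on CPU-only machines.
scaler = GradScaler(device, enabled=use_cuda)

model = torch.nn.Linear(4, 2).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.randn(8, 4, device=device)

with torch.autocast(device_type=device, enabled=use_cuda):
    loss = model(x).sum()

scaler.scale(loss).backward()  # scale the loss to avoid fp16 gradient underflow
scaler.step(optimizer)         # unscale gradients, then call optimizer.step()
scaler.update()                # adjust the scale factor for the next iteration
```

Since `GradScaler("cuda")` is equivalent to the legacy `torch.cuda.amp.GradScaler()`, swapping the import does not change runtime behavior, which is what makes this a safe refactor.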

github-actions bot added the "module: engine" and "examples" labels on Jul 26, 2025
Development

Successfully merging this pull request may close these issues.

Fix FutureWarning: torch.cuda.amp.GradScaler(args...) is deprecated. Please use torch.amp.GradScaler('cuda', args...) instead.