
Conversation

@sanketpurandare (Contributor) commented Nov 14, 2025

sanketpurandare added a commit that referenced this pull request Nov 14, 2025
ghstack-source-id: 54166f0
Pull Request resolved: #250
@meta-cla meta-cla bot added the CLA Signed This label is managed by the Meta Open Source bot. label Nov 14, 2025
Comment on lines 109 to 111
assert not any(
grad is None for grad in grads_to_accumulate
), "All grads are None"
Member suggested a change:

    -assert not any(
    -    grad is None for grad in grads_to_accumulate
    -), "All grads are None"
    +assert not all(
    +    grad is None for grad in grads_to_accumulate
    +), "All grads are None"
Contributor: Actually, shouldn't we keep it as 'any' and change the string to match? Or do we not care if some grads are None?

Contributor Author: Yeah, we should keep it as 'any' and change the string.

Member (@xmfan) commented Nov 14, 2025:
I think some grads can be None, say forward outputs that don't require gradient, the usual torch.compile fw/bw would have None in the backward graph outputs as required by the custom autograd function API.

It's up to the partitioner/graph pass splitters and the runtime wrapper implementation though. As long as we know which grad belongs to which param.
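The convention @xmfan refers to can be seen in plain PyTorch: a custom autograd function's `backward` must return one gradient per `forward` input, with `None` in the slots for inputs that don't require (or can't have) a gradient. A minimal runnable sketch (the `Scale` function is illustrative, not code from this PR):

```python
import torch

class Scale(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, factor):
        ctx.factor = factor
        return x * factor

    @staticmethod
    def backward(ctx, grad_out):
        # One grad per forward input; `factor` is a plain float,
        # so its gradient slot is None by convention.
        return grad_out * ctx.factor, None

x = torch.ones(3, requires_grad=True)
Scale.apply(x, 2.0).sum().backward()
print(x.grad)  # tensor([2., 2., 2.])
```

Compiled fw/bw graphs follow the same contract, which is why `None` entries in backward outputs are expected rather than an error, as long as the runtime knows which grad belongs to which param.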

Contributor Author: Switched it back to 'any'.
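The agreed outcome can be sketched as follows (a hypothetical reconstruction: `accumulate` and `accumulated_grads` are illustrative names, not the PR's exact code). The assert keeps `any`, with the message changed to match what it actually checks:

```python
import torch

def accumulate(accumulated_grads, grads_to_accumulate):
    # Keep 'any', with a message that matches the condition.
    assert not any(
        grad is None for grad in grads_to_accumulate
    ), "Some grads are None"
    for i, grad in enumerate(grads_to_accumulate):
        if accumulated_grads[i] is None:
            accumulated_grads[i] = grad.clone()
        else:
            accumulated_grads[i] += grad
    return accumulated_grads

acc = accumulate([None, None], [torch.ones(2), torch.ones(2)])
acc = accumulate(acc, [torch.ones(2), torch.ones(2)])
print(acc[0])  # tensor([2., 2.])
```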

Suggested change on the input-count assert:

    assert num_placeholders == len(
        fw_args
    -), f"Mismatched number of inputs to fwd, {len([n for n in fw_module.graph.nodes if n.op == 'placeholder'])}, {len(fw_args)}"
    +), f"Mismatched number of inputs to fwd: expected {num_placeholders}, got {len(fw_args)}"
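For context, a self-contained sketch of what this check does (assumes a `torch.fx.GraphModule`; the `fw_module`/`fw_args` names follow the snippet above, and the toy traced function is made up):

```python
import torch
import torch.fx

def check_fw_inputs(fw_module: torch.fx.GraphModule, fw_args):
    # Each placeholder node in the traced graph is one expected runtime input.
    num_placeholders = sum(
        1 for n in fw_module.graph.nodes if n.op == "placeholder"
    )
    assert num_placeholders == len(
        fw_args
    ), f"Mismatched number of inputs to fwd: expected {num_placeholders}, got {len(fw_args)}"

def f(x, y):
    return x + y

gm = torch.fx.symbolic_trace(f)
check_fw_inputs(gm, (torch.ones(2), torch.ones(2)))  # 2 placeholders, 2 args: passes
```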
Member: No longer needed; we made the change upstream.

Contributor Author: Code pointer?


Contributor Author: Removed the args len check.

sanketpurandare added a commit that referenced this pull request Nov 14, 2025
ghstack-source-id: f24398c
Pull Request resolved: #250
@sanketpurandare sanketpurandare changed the base branch from gh/sanketpurandare/2/base to main November 14, 2025 23:20
@sanketpurandare sanketpurandare changed the base branch from main to gh/sanketpurandare/2/base November 14, 2025 23:22
@sanketpurandare sanketpurandare changed the base branch from gh/sanketpurandare/2/base to main November 14, 2025 23:29
@sanketpurandare sanketpurandare changed the base branch from main to gh/sanketpurandare/2/base November 14, 2025 23:30
@sanketpurandare sanketpurandare changed the title Enabling ZeroBubbleV schedule in GraphPP Enabling ZeroBubbleV schedule in Graph PP Nov 14, 2025