TransformReplay::selfReplay replays contiguity #5316
base: main
Conversation
!test
Review updated until commit 23ee554

Description
Changes walkthrough 📝

PR Reviewer Guide 🔍
Here are some key observations to aid the review process:
This reverts commit a41c05e.
```cpp
fusion->addInput(in);
fusion->addInput(i);
fusion->addOutput(acc_out);
fusion->aliasOutputToInput(acc_out, acc_in, AllocationType::ReuseBuffer);
```
This calls TransformReplay::selfReplay to replay transformations from concrete dimensions to symbolic dimensions.
```diff
 auto out_tensors = executor_cache.runFusionWithInputs({in_tensor});
 ASSERT_EQ(out_tensors.size(), 1);
-at::Tensor out_tensor = out_tensors[0].as<at::Tensor>();
+auto out_tensor = out_tensors[0].as<at::Tensor>();
```
Use `auto` because `as<at::Tensor>` already specifies the type.
It would be really helpful to have a quick PR introduction as a reminder of the context. My mental capacity is not big enough to remember everything currently going on.
Thanks @wujingyue for the clarifications.

Done
```cpp
  }
  new_self->setAllocationDomain(new_allocation, new_contiguities);
} else {
```
This feels a little unexpected to me because, even though there's nothing to replay for the allocation, the contiguity could be modified.
What would happen if self doesn't have an allocation domain but new_self does? Would it work?
!test --diff
Changes in contiguity may not cause test failures, but they could result in, e.g., different vectorization decisions. Started the diff check just in case.
Fixes bugs like #5356
We should probably add some knobs so callers can decide what to replay; that can happen in a separate PR. So far, this function tries to replay everything (namely loop, allocation, and contiguity), which seems to work fine.