use raw ops #1914


Merged
merged 1 commit into from
Jun 11, 2020

Conversation

fsx950223
Member

@fsx950223 fsx950223 commented Jun 7, 2020

Fix #1779.
cc @WindQAQ
Related google/automl#480

@bot-of-gabrieldemarmiesse

@mels630

You are the owner of some files modified in this pull request.
Would you kindly review the changes whenever you have the time?
Thank you very much.

@fsx950223
Member Author

cc @gabrieldemarmiesse.

@seanpmorgan
Member

Related: #1779

@WindQAQ WindQAQ self-requested a review June 10, 2020 06:28
Member

@WindQAQ WindQAQ left a comment


Thank you @fsx950223! I believe this is correct, but the serialization/deserialization might break saved models (for those that also save the input pipeline) due to the different registration name.

Hi @seanpmorgan @gabrieldemarmiesse, should we emit a warning to tell users that the implementation has changed? Thank you!

@fsx950223
Member Author

It won't affect saved models, since a saved model needs to load the custom kernel separately.

@gabrieldemarmiesse
Member

We have a function to load all shared objects (.so files), so as long as we keep them somewhere, we won't break existing models. This PR doesn't break existing saved models, but it does remove the gradient for the op.

@fsx950223
Member Author

Because this PR fixes AA on TPUs, I believe the above issues shouldn't block it.
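For context, the op this PR switches to, ImageProjectiveTransformV2, warps an image with an 8-parameter projective transform that maps each output pixel back to an input location. The sketch below is a minimal NumPy illustration of that sampling rule (nearest-neighbor only); the function name and fill behavior are illustrative and not the real TensorFlow kernel.

```python
import numpy as np

def projective_transform(image, transform, fill_value=0.0):
    """Nearest-neighbor projective warp (illustrative sketch).

    `transform` is the 8-vector [a0..a7]; each output pixel (x, y)
    samples the input at:
        x_in = (a0*x + a1*y + a2) / (a6*x + a7*y + 1)
        y_in = (a3*x + a4*y + a5) / (a6*x + a7*y + 1)
    Out-of-bounds samples are filled with `fill_value`.
    """
    a0, a1, a2, a3, a4, a5, a6, a7 = transform
    h, w = image.shape
    out = np.full((h, w), fill_value, dtype=image.dtype)
    for y in range(h):
        for x in range(w):
            k = a6 * x + a7 * y + 1.0
            xi = int(round((a0 * x + a1 * y + a2) / k))
            yi = int(round((a3 * x + a4 * y + a5) / k))
            if 0 <= xi < w and 0 <= yi < h:
                out[y, x] = image[yi, xi]
    return out

img = np.arange(16, dtype=np.float32).reshape(4, 4)
identity = [1, 0, 0, 0, 1, 0, 0, 0]       # output samples input at the same pixel
shift_right = [1, 0, -1, 0, 1, 0, 0, 0]   # output (x, y) samples input (x - 1, y)
print(np.array_equal(projective_transform(img, identity), img))  # True
```

Note that the transform maps output coordinates to input coordinates, which is why shifting the image to the right uses a2 = -1 rather than +1.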

@WindQAQ WindQAQ self-requested a review June 11, 2020 17:30
Member

@WindQAQ WindQAQ left a comment


Thanks :-)

@WindQAQ WindQAQ self-requested a review June 11, 2020 17:38
@WindQAQ WindQAQ merged commit 8586698 into tensorflow:master Jun 11, 2020
ashutosh1919 pushed a commit to ashutosh1919/addons that referenced this pull request Jul 12, 2020
jrruijli pushed a commit to jrruijli/addons that referenced this pull request Dec 23, 2020
Development

Successfully merging this pull request may close these issues.

Deprecate ImageProjectiveTransformV2
6 participants