
Conversation

@thaink (Member) commented Dec 7, 2020

Description

Add MaxUnpooling2D layer
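For context, a minimal usage sketch of the layer paired with tf.nn.max_pool_with_argmax; the argument names below are assumptions based on the usual max-unpool pattern and may differ from the final API:

    import tensorflow as tf
    import tensorflow_addons as tfa

    # Max-pool an input and keep the argmax indices of the selected elements.
    x = tf.random.normal([1, 4, 4, 3])
    pooled, argmax = tf.nn.max_pool_with_argmax(
        x, ksize=(2, 2), strides=(2, 2), padding="SAME"
    )

    # MaxUnpooling2D scatters the pooled values back to the positions recorded
    # in the argmax tensor; every other output location is zero.
    unpool = tfa.layers.MaxUnpooling2D(pool_size=(2, 2), strides=(2, 2), padding="SAME")
    y = unpool(pooled, argmax)  # shape (1, 4, 4, 3)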

Type of change

Checklist:

  • I've properly formatted my code according to the guidelines
    • By running Black + Flake8
    • By running pre-commit hooks
  • This PR addresses an already submitted issue for TensorFlow Addons
  • I have made corresponding changes to the documentation
  • I have added tests that prove my fix is effective or that my feature works
  • This PR contains modifications to C++ custom-ops

How Has This Been Tested?

I ran the layer_test.

boring-cyborg bot added the layers label Dec 7, 2020
google-cla bot added the cla: yes label Dec 7, 2020
@bhack (Contributor) commented Dec 7, 2020

#632

@WindQAQ (Member) left a comment

Nice to see something coming from the TF team! LGTM. Remember to import it in
https://github.com/tensorflow/addons/blob/master/tensorflow_addons/layers/__init__.py
We are currently strict about experimental APIs, so as-is this won't pass the source code tests.

cc @seanpmorgan and @bhack. I do think we should relax the constraints for experimental APIs. Given that we only pin to stable TF versions, I believe there are more pros than cons.

A side note: this feature has been requested for a long time. As the PR comes from TF, we can bypass the ecosystem review IMO.
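For reference, the export mentioned above is a one-line import in tensorflow_addons/layers/__init__.py; presumably something along these lines (module and class names assume the layout used in this PR):

    # tensorflow_addons/layers/__init__.py
    from tensorflow_addons.layers.max_unpooling_2d import MaxUnpooling2D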

boring-cyborg bot added the test-cases label (Related to Addons tests) Dec 7, 2020
@bhack (Contributor) commented Dec 7, 2020

/cc @sdmonov What do you think?

@thaink (Member, Author) commented Dec 9, 2020

PTAL

WindQAQ previously approved these changes Dec 9, 2020
# This allowlist should not grow. Do not add elements to this list.
allowlist = [
    "tensorflow_addons/optimizers/weight_decay_optimizers.py",
    "tensorflow_addons/layers/max_unpooling_2d.py",
Member commented:

Gentle ping to @seanpmorgan. I think we should relax the constraints on experimental APIs, as we only pin to stable TensorFlow versions. More pros than cons, I believe.

Member commented:

Can we confirm that experimental APIs are aliased to their final API once they move out of experimental? If so, then I think it's okay, but I'd still like to track the usage in this source code test, since experimental APIs should be discouraged when possible. API parameters can still change when an API graduates to stable.

Member commented:

cc @thaink for API questions. If so, I think we should also add the test files to the source code tests. Like the mixed precision API in our tests: some parameters and the underlying implementation do change between the experimental and stable versions, but we simply do not use the parts that changed, so there is no regression issue.

Member Author commented:

@WindQAQ Can you point me to that test file?
The experimental_implements here is only used for TFLite conversion, so it is not possible to add tests for it at this point, since the conversion code depends on it. Besides, since a change would only affect TFLite users, I think it might be better to add the tests on our side.
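For readers following along: experimental_implements is an argument of tf.function that tags a traced function with a named signature so downstream tooling, here the TFLite converter, can recognize the whole function and replace it with a fused kernel. A minimal sketch of the pattern as it would appear in this layer (the signature string and function name are illustrative):

    import tensorflow as tf

    # The "implements" tag lets the TFLite converter match this entire function
    # and swap in a fused MaxUnpooling2D op instead of converting the
    # individual gather/scatter ops in its body.
    @tf.function(experimental_implements="addons:MaxUnpooling2D")
    def _max_unpooling_2d(updates, mask, pool_size=(2, 2), strides=(2, 2), padding="SAME"):
        ...  # actual unpooling implementation omitted in this sketch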

Member commented:

Sorry for the confusion. It's unrelated to this PR. We have used some experimental APIs here: run_functions_eagerly since 2.2 and mixed_precision since 2.3, but because they are only used in unit tests, we are not strict about them.

https://github.com/tensorflow/addons/blob/master/tensorflow_addons/utils/test_utils.py
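For concreteness, the experimental-to-stable moves being referred to look roughly like this (version boundaries are approximate):

    import tensorflow as tf

    # Running tf.functions eagerly in unit tests:
    #   experimental name: tf.config.experimental_run_functions_eagerly(True)
    #   stable name:       tf.config.run_functions_eagerly(True)
    tf.config.run_functions_eagerly(True)

    # Mixed precision policy:
    #   experimental name: tf.keras.mixed_precision.experimental.set_policy("mixed_float16")
    #   stable name:       tf.keras.mixed_precision.set_global_policy("mixed_float16")
    tf.keras.mixed_precision.set_global_policy("mixed_float16")

In both cases the experimental name continued to work for some releases after the stable one appeared, which is the aliasing behaviour discussed above.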

Member Author commented:

Thanks @WindQAQ.
Is there anything more I should do before this PR can be merged?

Member commented:

No, let's wait for @seanpmorgan's response. Thanks again for your patience.

@seanpmorgan (Member) left a comment

Looks very nice, thanks for the contribution! Could you please add yourself as a CODEOWNER for this, and then we can merge?
https://github.com/tensorflow/addons/blob/master/.github/CODEOWNERS

@bhack linked an issue Dec 10, 2020 that may be closed by this pull request
@thaink (Member, Author) commented Dec 10, 2020


I added myself to .github/CODEOWNERS. Thanks.
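For reference, a CODEOWNERS entry follows GitHub's path-then-owner syntax; the lines added for this layer would presumably look something like the following (file paths are illustrative):

    # .github/CODEOWNERS
    /tensorflow_addons/layers/max_unpooling_2d.py @thaink
    /tensorflow_addons/layers/tests/max_unpooling_2d_test.py @thaink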

@seanpmorgan (Member) left a comment

LGTM thanks!

@seanpmorgan merged commit 9bac29c into tensorflow:master Dec 10, 2020
jrruijli pushed a commit to jrruijli/addons that referenced this pull request Dec 23, 2020
* Add MaxUnpooling2D op to tensorflow addons

Development

Successfully merging this pull request may close these issues.

Unpooling layer in tensorflow

4 participants