Implementation of SuperPoint and AutoModelForKeypointDetection #28966
amyeroberts merged 39 commits into huggingface:main from
Conversation
|
Hi @amyeroberts @rafaelpadilla @ArthurZucker, I couldn't see any other solution than to create a branch from scratch and re-implement everything that was discussed since the beginning of the original PR. Here, most of my problems (RegNet-related error when using Steven |
|
Hi @sbucaille, thanks for opening this PR and working from the previous reviews! At the moment, I don't see any tests implemented for the model or image processor. The next step would be adding those 🤗 |
|
Looks like I forgot to |
|
Hi @amyeroberts @ydshieh, Also, it appears all the tests pass now 🤗 |
|
Hi @sbucaille, regarding your questions about rebasing:
No, it shouldn't. Did you force push after rebasing? It's necessary to do this, as rebasing is effectively rewriting the history of the branch.
What do you mean by "the interface"? When performing a rebase, it might be necessary to resolve conflicts. What rebasing is trying to do is move the first commit of your branch to the head of main. However, as the branch was originally branched off an older commit, there might be conflicts if this branch makes changes to files which have also been modified on main. The easiest way to manage this is to rebase main into the branch often, have short-lived branches, and avoid merge commits in the branch. Here are some nice articles about rebasing which I found useful when first learning about it: |
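The rebase workflow described in this exchange can be rehearsed safely in a throwaway repository before touching the real branch. This is a standalone sketch, not part of the PR; on a real fork you would finish with `git push --force-with-lease` to update the remote branch after rewriting history.

```shell
# Demo of the rebase cycle: a feature branch is replayed on top of
# new commits that landed on main after the branch was created.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q -b main .
git config user.email "demo@example.com"
git config user.name "demo"
echo base > base.txt && git add base.txt && git commit -qm "initial commit"
git switch -qc add_superpoint                 # create the feature branch
echo model > model.txt && git add model.txt && git commit -qm "add model"
git switch -q main                            # meanwhile, main moves on
echo fix > fix.txt && git add fix.txt && git commit -qm "upstream bug fix"
git switch -q add_superpoint
git rebase -q main                            # replay "add model" onto the new main tip
git log --format="%s" | head -3
```

After the rebase, the feature branch contains the upstream fix and its own commit sits on top, with no merge commit in between.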
|
@amyeroberts Alright, I think I'm starting to grasp the rebase and merge concepts. This PR is my first in any open source project and I knew in advance I would run into such issues, so thank you for taking the time 🤗 |
|
@sbucaille I'd suggest first resolving the git history, so this branch only contains commits that are part of this PR. Once our house is tidy we can review and iterate on the PR. From first glance, things are looking good! Doing this now will be a lot easier than trying to do it in the middle of a review. |
Force-pushed from d7f5e36 to c1d25d0
|
@amyeroberts Holy, that was not easy, but I think I understood the mechanism. I once more rebased the branch to include the latest commits. It seems I am now 4 commits ahead of main, which should be fine (until tomorrow morning, when new commits will be pushed to main, I guess). |
|
@sbucaille Great! So main will continue to have lots of new commits. The history you're seeing isn't so much the number of commits "ahead" of main; rather, it's the number of commits on top of a certain main commit, let's say ABC. When you rebase, you're re-writing the history so that the commits are now on top of the latest main commit, say EDF. It's kind of like shifting things along in history. You don't need to rebase every time there are new commits on main. The purpose of rebasing is to make sure your branch and main don't diverge, that any changes made to the same file have no clashes, and that there's a clear order in which to apply commits, as well as to include any important upstream changes, e.g. bug fixes. It's basically the same as having a merge commit but a lot cleaner, as only the branch's commits are shown.
Yep! This just lets me know there's been a rewrite of the history. This is important as rebasing can also be used for editing the commits in the branch itself.
It depends. If you touch a lot of common files, you'll want to do it regularly, particularly in the final steps when getting ready to merge the branch in - at least once a day. This is a model PR, so it probably doesn't need it quite so often. Now you've done the difficult rebase, all the rest should be pretty easy to do. In fact, I'd imagine there would be little to no conflicts. We'd like all branches to live for as short a time as possible. This is less true for model PRs - but we'd still like them to be resolved on the order of days/weeks wherever possible.
Great! Next step is to get all the tests passing on the CI. |
|
@amyeroberts Seems right now. Turns out I un-committed certain files during my rebase; I'll need to practice a bit more in the future 😅 Tests are passing! |
amyeroberts
left a comment
Thanks for all the work adding this model.
Beautiful PR - really nice work. Just a few small comments to address but I think we're close to merge 🤗
README_zh-hant.md
Outdated
This PR shouldn't change the StableLM entry
This should also be fixed with the latest rebase
src/transformers/modeling_outputs.py
Outdated
Let's move this to the modeling file. At the moment no other models use it, and it's highly specific to this one model
This doesn't match the default size format with "height" and "width"
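For context, the size format this review comment refers to is the dict-with-explicit-keys convention used by transformers image processors, where `size` is `{"height": ..., "width": ...}` rather than a bare tuple. A standalone sketch of the normalization (the real helper, `transformers.image_processing_utils.get_size_dict`, handles more cases and is the one the library actually uses):

```python
# Normalize a size specification into the {"height": ..., "width": ...}
# dict format, so the dimension order is always unambiguous.
def get_size_dict(size):
    if isinstance(size, int):
        # A single int means a square output.
        return {"height": size, "width": size}
    if isinstance(size, (tuple, list)) and len(size) == 2:
        # Sequences are interpreted as (height, width).
        return {"height": size[0], "width": size[1]}
    if isinstance(size, dict) and {"height", "width"} <= set(size):
        return size
    raise ValueError(f"Unsupported size specification: {size!r}")

print(get_size_dict(480))         # {'height': 480, 'width': 480}
print(get_size_dict((480, 640)))  # {'height': 480, 'width': 640}
```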
I realise this was originally my suggestion, but reflecting and looking at this again I think we don't need to add this special class. We can remove MODEL_FOR_KEYPOINT_DETECTION_MAPPING_NAMES and just use AutoModel where needed
I believe this is fixed with this commit
All custom layers should accept the config as their first argument
I am not sure what to do here, because if I add config as an argument, it will not be used by the module, which relies only on in_channels, out_channels and add_pooling. But it can't rely exclusively on config, since the SuperPointConvBlock can't know which values to pick from encoder_hidden_sizes 🤔
That's OK, it should still have it as its first argument though
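A minimal sketch of the convention being settled here, using the signature discussed above (config first, then in_channels, out_channels, add_pooling). The encoder_hidden_sizes default below is illustrative, not the model's real configuration:

```python
# Hedged sketch: a custom layer that accepts the model config as its first
# __init__ argument even though it only uses the explicit channel arguments.
from dataclasses import dataclass, field

import torch
import torch.nn as nn


@dataclass
class SuperPointConfig:
    # Illustrative default; the real config defines the actual sizes.
    encoder_hidden_sizes: list = field(default_factory=lambda: [64, 64, 128, 128])


class SuperPointConvBlock(nn.Module):
    def __init__(self, config: SuperPointConfig, in_channels: int,
                 out_channels: int, add_pooling: bool = False):
        super().__init__()
        # config is accepted first per the library convention, even though
        # this block is parameterized by the explicit arguments.
        self.conv_a = nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1)
        self.conv_b = nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU(inplace=True)
        self.pool = nn.MaxPool2d(kernel_size=2, stride=2) if add_pooling else None

    def forward(self, hidden_states):
        hidden_states = self.relu(self.conv_a(hidden_states))
        hidden_states = self.relu(self.conv_b(hidden_states))
        if self.pool is not None:
            hidden_states = self.pool(hidden_states)
        return hidden_states
```

The parent module resolves the per-block channel sizes from config.encoder_hidden_sizes and passes them in explicitly, which is why the block itself never reads the config.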
No need to add a separate method here - just put all the logic from infer_on_model here
Force-pushed from d83081d to cb344ec
|
Hey @amyeroberts , |
amyeroberts
left a comment
Looks great - thanks for iterating on this!
Just a few small comments. The final thing to do before merging is to transfer the checkpoints to under the Magic Leap org. I can't remember the outcome of this - did you reach out to any of the authors to ask if they'd like to control the org on the hub? (I don't believe it exists yet.)
You don't need this conditional logic here - you already know the architecture of the model
That's funny - are you sure the pixel values throughout the model are in (b, c, w, h) order and not (b, c, h, w)?
You are right, I always get confused by dimension orders; the input of the model is indeed (b, c, h, w). The tests were confusing, I fixed it all in this commit
src/transformers/models/superpoint/image_processing_superpoint.py
Outdated
last_hidden_state should be the first returned value
last_hidden_states should be the first returned value here
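The ordering point in these two review comments matters because, in a ModelOutput-style dataclass, field declaration order determines tuple and positional-index order. A standalone sketch using the output class name from this PR's commit history (ImagePointDescriptionOutput); the field set here is illustrative, not the merged model's exact fields:

```python
# Sketch: why last_hidden_state is declared first in the output dataclass.
# Tuple conversion and integer indexing follow declaration order, so code
# that does `outputs[0]` relies on last_hidden_state being first.
from dataclasses import astuple, dataclass, fields
from typing import Optional


@dataclass
class ImagePointDescriptionOutput:
    last_hidden_state: list                 # first by library convention
    keypoints: Optional[list] = None
    scores: Optional[list] = None
    descriptors: Optional[list] = None


out = ImagePointDescriptionOutput(last_hidden_state=[[0.0]], keypoints=[[5, 7]])
print(fields(out)[0].name)  # last_hidden_state
print(astuple(out)[0])      # [[0.0]]
```

In transformers the real base class is ModelOutput, which adds dict-like and tuple-like access on top of this same declaration-order rule.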
That's OK, it should still have it as its first argument though
|
Hi @amyeroberts, |
|
@sbucaille OK, great. I'll give a final review. I've created https://huggingface.co/magic-leap-community where we can store the checkpoints. I'll reach out to magic leap and ask if they want to take control of it, but we can use that in the meantime. If you share a link to all the checkpoints for the model, I can copy them across to this org and then you can update the checkpoints in the PR |
|
Hi @amyeroberts , |
|
@sbucaille Great! You can get me on my HF email: amy@huggingface.co |
|
@sbucaille I've copied across the checkpoints - so they can be updated in the PR now. Could you open a PR on https://huggingface.co/magic-leap-community/superpoint to fill in the model card? |
|
Hi @amyeroberts, as you can probably see, I've added you to the mail thread and opened a pull request to the hub superpoint repo |
Force-pushed from 70ab66b to 40a8880
|
Also, I just rebased the branch and figured out main was not passing tests because of repo-consistency checks. I've added SuperPoint to the appropriate READMEs and will watch closely for when main is updated, to rebase once again with what is missing there |
amyeroberts
left a comment
Looks great - thanks for all your work on adding this model!
Last things to do are rebasing on main and trying to resolve the weird diff we're still seeing on README_fr.md
Force-pushed from aed5172 to 949c6d9
|
Hi @amyeroberts , Apart from that, my branch is rebased on the recent commits, let me know what we should do! |
amyeroberts
left a comment
Thanks for all the work adding this and digging into the funny README issue.
Only thing left to do is update the checkpoints!
Checkpoint should be updated here
Checkpoints should be updated here
Checkpoint to be updated
And here for the checkpoint
|
@sbucaille Awesome work adding this. If you can get the checkpoints updated today, we might be able to have this model make the next release 💪 |
|
Hi @amyeroberts , |
|
@sbucaille Great! Last thing (seriously this time) - can you run the slow tests for the model and share the output? Once you can confirm they all pass, I'll merge |
|
@amyeroberts Is a screenshot sufficient? When moving from the previous pull request to this one, I forgot to add the additional COCO image. |
The main thing I'd need to see is the end of the test run, which shows how many of the tests have passed / been skipped etc. (I can't see the command being run, but make sure to set RUN_SLOW=1.) I've commented on the part causing the error for the doctests |
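For readers unfamiliar with the RUN_SLOW flag mentioned above: transformers marks expensive tests with the `slow` decorator from `transformers.testing_utils`, which skips them unless RUN_SLOW is set. A standalone sketch that mimics the gating without transformers installed (the decorator body here is an approximation, not the library's exact code):

```python
import os
import unittest


def slow(test_case):
    """Skip the decorated test unless RUN_SLOW=1 is set in the environment."""
    return unittest.skipUnless(
        os.environ.get("RUN_SLOW", "0") == "1", "test is slow"
    )(test_case)


class SuperPointDemoTests(unittest.TestCase):
    def test_fast_shapes(self):
        # Always runs on CI.
        self.assertEqual(2 + 2, 4)

    @slow
    def test_slow_integration(self):
        # Only runs when RUN_SLOW=1, e.g. for checkpoint-level checks.
        self.assertTrue(True)


suite = unittest.TestLoader().loadTestsFromTestCase(SuperPointDemoTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
# With RUN_SLOW unset this reports run=2 skipped=1.
print(f"run={result.testsRun} skipped={len(result.skipped)}")
```

On the PR branch the actual invocation would be something like `RUN_SLOW=1 python -m pytest tests/models/superpoint/ -v` (test path assumed from the repo layout).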
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
I applied your changes to the doc |
|
(thank you for requesting running slow tests @amyeroberts and thank you for running the slow tests @sbucaille ❤️ !) |
|
@amyeroberts I'm afraid the error is back 🤔 |
|
@sbucaille I was wrong about the fix then! I'll check out your branch and see if I can figure out what's happening |
don't hesitate to ping me if you have limited bandwidth @amyeroberts |
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
|
@amyeroberts should I revert the previous change? |
|
@sbucaille For the code example? No, that also wasn't correct (I just missed it in my reviews 🙈 ) |
|
OK - new issue. This one I think is because there isn't a model entry added in |
|
@amyeroberts Added it to _toctree, I believe this belongs in the vision section? |
|
|
@sbucaille Yep! And all passing now! Thanks for all your work on this - great to see this added to the library 🤗 |
|
Yaaay, finally! 😂 Thanks for the opportunity and all the feedback provided during this long-lasting PR, I learned a lot of things! |
* Added SuperPoint docs
* Added tests
* Removed commented part
* Commit to create and fix add_superpoint branch with a new branch
* Fixed dummy_pt_objects
* Committed missing files
* Fixed README.md
* Apply suggestions from code review (fixed small changes; Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>)
* Moved ImagePointDescriptionOutput from modeling_outputs.py to modeling_superpoint.py
* Removed AutoModelForKeypointDetection and related stuff
* Fixed inconsistencies in image_processing_superpoint.py
* Moved infer_on_model logic simply in test_inference
* Fixed bugs, added labels to forward method with checks whether it is properly a None value, also added tests about this logic in test_modeling_superpoint.py
* Added tests to SuperPointImageProcessor to ensure that images are properly converted to grayscale
* Removed remaining mentions of MODEL_FOR_KEYPOINT_DETECTION_MAPPING
* Apply suggestions from code review (Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>)
* Fixed from (w, h) to (h, w) as input for tests
* Removed unnecessary condition
* Moved last_hidden_state to be the first returned
* Moved last_hidden_state to be the first returned (bis)
* Moved last_hidden_state to be the first returned (ter)
* Switched image_width and image_height in tests to match recent changes
* Added config as first SuperPointConvBlock init argument
* Reordered READMEs after merge
* Added missing first config argument to SuperPointConvBlock instantiations
* Removed formatting error
* Added SuperPoint to READMEs de, pt-br, ru, te and vi
* Checked out README_fr.md
* Fixed README_fr.md
* Test fix README_fr.md
* Test fix README_fr.md
* Last make fix-copies!
* Updated checkpoint path
* Removed unused SuperPoint doc
* Added missing image
* Update src/transformers/models/superpoint/modeling_superpoint.py (Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>)
* Removed unnecessary import
* Update src/transformers/models/superpoint/modeling_superpoint.py (Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>)
* Added SuperPoint to _toctree.yml

Co-authored-by: steven <steven.bucaillle@gmail.com>
Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
Co-authored-by: Steven Bucaille <steven.bucaille@buawei.com>



What does this PR do?
This PR implements SuperPoint, one of the few models that generate keypoints and descriptors given an image, as discussed in this previous pull request.
The goal is to implement this model and a new type of AutoModel: AutoModelForKeypointDetection (name to be discussed). This PR is also the replacement of a previous PR for which the branch ended up being unusable.
Who can review?
@amyeroberts @ArthurZucker @rafaelpadilla
TODO's
AutoModelForKeypointDetection mapping