
I found the reason why the composition acc is lower when training end-to-end #10

Open
dongdk opened this issue Oct 14, 2024 · 1 comment

@dongdk

dongdk commented Oct 14, 2024

Hi @bo-zhang-cs,
When testing your released model, I found that the accuracy is lower than reported in the paper (I get 73%), and I tried to figure out how to fix this. After running multiple experiments, I found the reason why the composition accuracy is lower when training end-to-end: the gradient flowing from the cropping branch back into the backbone is bad for training the composition branch. The solution is to use a detach operation to stop the gradient of the cropping branch loss:

            # stop gradients from the cropping loss from reaching the backbone
            f4_detach = f4.detach()
            offsets = self.cropping_module(f4_detach)
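The effect of the detach can be checked in isolation. Below is a minimal sketch of the idea, with toy linear layers standing in for the real backbone and branch heads (these names and sizes are illustrative assumptions, not the repo's actual modules): after detaching, the cropping loss leaves the backbone's gradients untouched, while the composition loss still trains it.

```python
import torch
import torch.nn as nn

# Toy stand-ins (assumed shapes/names) for the shared backbone and two heads.
backbone = nn.Linear(8, 8)
composition_head = nn.Linear(8, 1)
cropping_head = nn.Linear(8, 4)

x = torch.randn(2, 8)
f4 = backbone(x)                                   # shared backbone features

f4_detach = f4.detach()                            # cut the graph here
crop_loss = cropping_head(f4_detach).pow(2).mean()
crop_loss.backward()
print(backbone.weight.grad)                        # None: cropping loss never touches the backbone

comp_loss = composition_head(f4).pow(2).mean()
comp_loss.backward()
print(backbone.weight.grad is not None)            # True: composition loss still trains the backbone
```

The cropping head itself still receives gradients and trains normally; only its path back into the shared features is severed.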

With this change, the result I get at epoch 38 is: FCDB_iou: 0.7032, FCDB_disp: 0.0728, FLMS_iou: 0.8441, FLMS_disp: 0.0360, Acc: 88.54%.

@bo-zhang-cs
Owner

Hi @dongdk,
Wow, what a fantastic discovery! 🎉 Your solution of detaching the gradient from the cropping branch is brilliant and very insightful! Thanks for sharing your findings and solution; this will definitely help move things forward! 🚀
