Error in function _project_conflicting #1

Open
leonzgtee opened this issue Jan 12, 2021 · 2 comments

Comments

@leonzgtee

Hi! I tried pytorch-PCGrad in my project. My network is a MobileNetV2 followed by two task heads, and the two heads have different parameters. When I ran pytorch-PCGrad it failed: the separate per-task backward passes produce two gradient vectors of different lengths, which causes the error in `_project_conflicting`.
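
A minimal sketch of the setup that triggers this, assuming the wrapper usage shown in the repo README (the small `Sequential` trunk is just a stand-in for MobileNetV2; all names here are illustrative):

```python
import torch
import torch.nn as nn
import torch.optim as optim
from pcgrad import PCGrad  # wrapper from this repo, per the README

# Stand-in for the MobileNetV2 trunk; any shared feature extractor works.
trunk = nn.Sequential(nn.Linear(32, 64), nn.ReLU())
head_a = nn.Linear(64, 10)  # task A head
head_b = nn.Linear(64, 3)   # task B head, with different parameters

params = list(trunk.parameters()) + list(head_a.parameters()) + list(head_b.parameters())
optimizer = PCGrad(optim.Adam(params))

x = torch.randn(8, 32)
feat = trunk(x)
losses = [head_a(feat).pow(2).mean(), head_b(feat).pow(2).mean()]

# Each loss reaches the trunk plus only its own head, so the two per-task
# backward passes cover different parameter subsets.
optimizer.pc_backward(losses)
optimizer.step()
```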

@WeiChengTseng
Owner

WeiChengTseng commented Jan 12, 2021

@leonzgtee Hi, thanks for the feedback. I have already handled networks that contain both shared and unshared parameters, so you can check whether the current version works for your scenario. In this implementation, the gradients of the unshared parameters remain the same, while the gradients of the shared parameters are modified by PCGrad.
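
For readers hitting the same issue, a minimal sketch of that idea (not the repo's exact code): it assumes per-task gradients flattened over all parameters, zero-filled where a task produced no gradient, with matching 0/1 `has_grads` masks.

```python
import random
import torch

def project_conflicting_sketch(grads, has_grads):
    """Sketch of PCGrad projection with shared/unshared handling.

    grads:     detached per-task gradient vectors flattened over *all*
               parameters, zero-filled where a task has no gradient.
    has_grads: matching 0/1 masks, one per task.
    """
    # An entry counts as shared only if every task has a gradient there.
    shared = torch.stack(has_grads).prod(dim=0).bool()

    pc_grad = [g.clone() for g in grads]
    for g_i in pc_grad:
        random.shuffle(grads)
        for g_j in grads:
            dot = torch.dot(g_i, g_j)
            if dot < 0:  # conflict: project g_i onto the normal plane of g_j
                g_i -= dot * g_j / (g_j.norm() ** 2)

    merged = torch.zeros_like(grads[0])
    # Shared entries take the mean of the projected per-task gradients ...
    merged[shared] = torch.stack([g[shared] for g in pc_grad]).mean(dim=0)
    # ... while unshared entries keep their original task gradient: only one
    # task contributes a nonzero value there, so the sum leaves it unchanged.
    merged[~shared] = torch.stack([g[~shared] for g in grads]).sum(dim=0)
    return merged
```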

@leonzgtee
Author

@WeiChengTseng Thanks, but I notice that `shared` in `_project_conflicting` is stacked from `has_grads`, which may contain not only the shared parameters but also a branch of task-specific parameters.
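
To make that concrete, with a hypothetical flattened layout `[trunk | head_a | head_b]`, the stacked `has_grads` vectors span every parameter, even though the resulting mask selects only the trunk:

```python
import torch

# 0/1 masks over a flattened layout [trunk, trunk, head_a, head_a, head_b, head_b]
has_grads = [
    torch.tensor([1, 1, 1, 1, 0, 0]),  # task A touches trunk + head_a
    torch.tensor([1, 1, 0, 0, 1, 1]),  # task B touches trunk + head_b
]
shared = torch.stack(has_grads).prod(dim=0).bool()
print(shared)  # tensor([ True,  True, False, False, False, False])
```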
