Error loading model + fix #20

Open
MichaelK-rgb opened this issue Dec 15, 2024 · 3 comments

Comments

@MichaelK-rgb

Hello amaralibey,
First of all, thank you for the fantastic work and paper.

There was an error after the recent updates when loading the model using this line:
vpr_model = torch.hub.load("amaralibey/bag-of-queries", "get_trained_boq", backbone_name="dinov2", output_dim=12288)
The error:
RuntimeError: Error(s) in loading state_dict for VPRModel: Unexpected key(s) in state_dict: "backbone.dino.norm.weight", "backbone.dino.norm.bias".

Adding strict=False to the vpr_model.load_state_dict call seems to fix it:

vpr_model.load_state_dict(
    torch.hub.load_state_dict_from_url(
        MODEL_URLS[f"{backbone_name}_{output_dim}"],
        map_location=torch.device('cpu')
    ),
    strict=False
)
return vpr_model
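For readers unfamiliar with what strict=False changes, here is a minimal, self-contained sketch using a toy module (not the actual BoQ/VPRModel code): with the default strict=True, load_state_dict raises a RuntimeError on unexpected keys, exactly as in the traceback above, while strict=False loads the matching parameters and merely reports the leftovers.

```python
import torch
import torch.nn as nn

# Toy module standing in for the VPR model (illustration only).
model = nn.Linear(4, 2)

# Simulate a checkpoint carrying an extra, unused key, like the
# leftover "backbone.dino.norm.*" entries reported in this issue.
state = model.state_dict()
state["unused.norm.weight"] = torch.ones(4)

# strict=True (the default) raises RuntimeError on unexpected keys...
try:
    model.load_state_dict(state)
except RuntimeError as e:
    print("strict load failed:", type(e).__name__)

# ...while strict=False loads matching keys and reports the rest.
result = model.load_state_dict(state, strict=False)
print("unexpected keys:", result.unexpected_keys)
```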

@amaralibey (Owner)

Hello @MichaelK-rgb

Thank you for your interest and for reporting this issue :)

I am editing the entire code base to add training code and dataset download scripts.
The error is due to an update where I forgot to remove the norm layer of the DINOv2 backbone, which isn't used in BoQ. My apologies for the oversight.

I'm currently fixing this issue, so it should be resolved very soon. In the meantime, your workaround with strict=False is indeed a valid solution and will allow the model to function as expected.
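Since strict=False silently ignores every mismatch (missing keys as well as unexpected ones), a slightly safer interim alternative is to delete just the stray norm-layer entries from the checkpoint and keep strict checking. A minimal sketch on a toy module, assuming the real fix would apply the same filtering to the state_dict downloaded in hubconf.py:

```python
import torch
import torch.nn as nn

# Toy stand-in model (not the actual BoQ/VPRModel code).
model = nn.Linear(4, 2)

# Simulate the checkpoint with the leftover DINOv2 norm-layer keys.
state = model.state_dict()
state["backbone.dino.norm.weight"] = torch.ones(4)
state["backbone.dino.norm.bias"] = torch.zeros(4)

# Drop only the unused norm-layer entries, then load with strict
# checking intact, so any other mismatch would still be caught.
for key in list(state.keys()):
    if key.startswith("backbone.dino.norm."):
        del state[key]

model.load_state_dict(state)  # strict=True succeeds after filtering
```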

Thanks again for your patience, and feel free to reach out if you encounter any other issues!

Best,

@tianyilim

@MichaelK-rgb Thanks for providing a solution! I'd like to continue using BoQ. Can this be resolved locally by modifying the following call?

vpr_model = torch.hub.load("amaralibey/bag-of-queries", "get_trained_boq", backbone_name="dinov2", output_dim=12288)

@v-pnk commented Feb 13, 2025

Hello @amaralibey ,

Loading the model directly from Torch Hub still seems to be broken. I tried adjusting the call (as asked in @tianyilim's question) so that we don't have to clone the repository, but that does not seem to be possible. For now I am installing the repository as a Python package (to get access to the BoQ and VPRModel classes) and instantiating VPRModel as in hubconf.py with strict=False. Is there another workaround, or do you plan to fix this so the example from the README works again?

Thanks!
