Please provide trained models #158
Comments
Actually, this model does not need 8 V100s: you can simply comment out the code in the three Python files that checks whether 8 V100s are available. Training takes about 20 hours on a single RTX 3090.
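For concreteness, the kind of guard to look for might be sketched as below. This is an illustration only: the exact check in the MobileNeRF training scripts may be written differently, and `check_devices` is a name invented here, not part of the project.

```python
# Hypothetical sketch of a multi-GPU guard like the one the comment above
# suggests commenting out. The real code's assertion may differ in detail.
def check_devices(device_count, required=8):
    """Return True if training can proceed on this machine."""
    # Original-style guard -- raises on machines without 8 GPUs:
    # assert device_count == required, "This project expects 8 V100s"
    # Relaxed version: accept any machine with at least one device,
    # just warn that training will take longer.
    if device_count < required:
        print(f"Warning: found {device_count} device(s); training will be "
              f"slower than on {required} V100s.")
    return device_count >= 1

check_devices(1)  # a single RTX 3090 is fine, just slower
```

The point is simply that the 8-GPU assertion is a sanity check, not a hard requirement of the model itself.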
You can find all the pretrained models in the online demos on our project page: https://mobile-nerf.github.io You can also download them and run them locally. (You will need some HTML knowledge to download them.)
Hello, I am not very knowledgeable about HTML. Where in the webpage code can the network weights be downloaded? What is the specific download link? Can you give an example?
@njfugit I have worked out how to get all of the files needed for a pre-trained model. If you want to download the models for your own use, you can get the files via these URLs, where [object_name] is the name of the object you are downloading weights for:

- Synthetic scenes: storage.googleapis.com/jax3d-public/projects/mobilenerf/mobilenerf_viewer/[object_name]_phone/
- Non-synthetic scenes: storage.googleapis.com/jax3d-public/projects/mobilenerf/mobilenerf_viewer_mac/[object_name]_mac/

For each scene you will need an mlp.json file, N .obj files (where N is determined by the 'object_num' key inside mlp.json), and N*2 .png files (2 files for each object). The mlp.json file is at /mlp.json; it contains the MLP weights and tells you how many objects a given scene has under the key ['object_num']. Using the number of objects you can then get the .obj and .png files. The .png files are at /shape[obj_num].pngfeat0.png and /shape[obj_num].pngfeat1.png. All of these files should then be stored in a directory named '[obj_name]_phone' inside your mobilenerf directory.

As a concrete example, if you wish to get the pretrained models for the chair scene, you will need to: create the directory, download the MLP weights, download your .obj files, and download your .png files. You can then run your own HTTP server from your /mobilenerf directory as instructed in README.md.

I've only tested this fully for one example, but it looks like it will hold for the others. If you find something doesn't work for a different scene, ping me and hopefully we can work out what the correct URL would be.

It's also worth noting that the code uses WebGL, so if you're interested in seeing performance on your own machine you don't need to do any of this. If you just load the viewer on the project website, all the graphics will be rendered locally on your own GPU.
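The steps above can be sketched as a small script. The base URLs and the .png naming come from the comment above, but the helper names, the `shape{i}.obj` file naming, and the `download_scene` function are assumptions made for illustration, not the project's official tooling:

```python
import pathlib
import urllib.request

# Base URLs as described in the comment above; {name} is the scene name.
BASE_SYNTHETIC = ("https://storage.googleapis.com/jax3d-public/projects/"
                  "mobilenerf/mobilenerf_viewer/{name}_phone/")
BASE_REAL = ("https://storage.googleapis.com/jax3d-public/projects/"
             "mobilenerf/mobilenerf_viewer_mac/{name}_mac/")

def mobilenerf_files(object_num):
    """Relative file names a scene needs: mlp.json, plus one .obj file and
    two feature .png textures per object (the .obj naming is an assumption)."""
    files = ["mlp.json"]
    for i in range(object_num):
        files += [f"shape{i}.obj",
                  f"shape{i}.pngfeat0.png",
                  f"shape{i}.pngfeat1.png"]
    return files

def download_scene(name, object_num, synthetic=True, dest="."):
    """Fetch every file for one scene into <dest>/<name>_phone/."""
    base = (BASE_SYNTHETIC if synthetic else BASE_REAL).format(name=name)
    outdir = pathlib.Path(dest) / f"{name}_phone"
    outdir.mkdir(parents=True, exist_ok=True)
    for fname in mobilenerf_files(object_num):
        urllib.request.urlretrieve(base + fname, outdir / fname)

# Usage (network required): first fetch mlp.json by hand, read its
# 'object_num' key, then e.g.
# download_scene("chair", object_num=<value from mlp.json>)
```

After downloading, serving the mobilenerf directory with any static HTTP server should let the viewer find the files, as the README describes.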
@DWhettam Thanks a lot for your reply; it works.
Hi,
I would like to try your MobileNeRF project on my desktop and mobile phone, but I don't have 8 V100 GPUs to train the models. Probably very few people do...
Can you provide, e.g. as a download on a Google Drive, the trained data that could be placed into the folder that has the HTML, so that people can at least try the demo scenes?