
Passing config['test_start_index'] to calc_grad_z #9

Open
expectopatronum opened this issue Mar 26, 2020 · 2 comments

@expectopatronum (Contributor)

Hi,
I am not sure whether I am misunderstanding the parameter or whether config['test_start_index'] should not be passed to calc_grad_z:

def calc_grad_z(model, train_loader, save_pth=False, gpu=-1, start=0):

I assumed calc_grad_z should loop over the whole training set.

Thanks and best regards,
Verena

@andrewsilva9

I believe it does loop over the entire training set; the train_loader is a PyTorch DataLoader that carries the dataset with it. We can see here:

for i in range(start, len(train_loader.dataset)):

that it iterates over the dataset attached to the DataLoader, which is the entire training set.
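
For illustration, here is a minimal, self-contained sketch (the toy dataset and loop body are stand-ins, not the repo's code) showing that indexing train_loader.dataset directly visits every training sample exactly once, independent of the loader's batch size:

import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset of 10 samples; the DataLoader's batch_size does not matter
# below because we index the underlying dataset, not the loader.
dataset = TensorDataset(torch.randn(10, 3), torch.randint(0, 2, (10,)))
train_loader = DataLoader(dataset, batch_size=4)

start = 0
for i in range(start, len(train_loader.dataset)):
    z, t = train_loader.dataset[i]  # one (input, target) pair per iteration
    # ... compute grad_z for this single training sample ...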

@nimarb (Owner)

nimarb commented Apr 22, 2020

Hi, @andrewsilva9 is correct: in line 133 it loops over the entire training dataset, so that one grad_z is calculated per training sample.

With the start argument you can begin at a different point in the training dataset. This can be used to split the calculation across multiple machines: you calculate samples [0-100] on machine 1, and on machine 2 you pass start=101 to calculate from training sample 101 onwards. The end index x is still missing from the implementation...
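
A minimal sketch of how that missing end index could look (the stop parameter, the function name, and the loop body are hypothetical, not part of the repo):

def calc_grad_z_slice(model, train_loader, save_pth=False, gpu=-1, start=0, stop=None):
    # Hypothetical variant: process only training samples [start, stop).
    n = len(train_loader.dataset)
    stop = n if stop is None else min(stop, n)
    for i in range(start, stop):
        z, t = train_loader.dataset[i]
        # ... compute and save grad_z for training sample i ...

# Machine 1 handles samples [0, 101):
#   calc_grad_z_slice(model, train_loader, start=0, stop=101)
# Machine 2 handles the rest:
#   calc_grad_z_slice(model, train_loader, start=101)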
