`LearningWithRetrospection.py` contains the class `LWR`, which implements the algorithm from the paper.
Algorithm
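The core idea can be sketched as follows. This is a rough, illustrative loss function, not the repository's actual implementation: LWR mixes standard cross-entropy on the hard labels with a KL-divergence term against soft labels recorded from the model's own predictions in an earlier epoch, with the weight on the hard labels decayed over training (the `alpha`, `soft_labels`, and `lwr_loss` names here are assumptions for illustration).

```python
import torch
import torch.nn.functional as F

def lwr_loss(logits, target, soft_labels, alpha, tau=5.0):
    """Illustrative LWR-style loss (sketch, not the repo's code).

    logits      -- raw model outputs, shape (batch, num_classes)
    target      -- hard class labels, shape (batch,)
    soft_labels -- stored softmax(past_logits / tau) from an earlier epoch
    alpha       -- weight on the hard-label term (decayed by update_rate)
    tau         -- distillation temperature
    """
    # Standard cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(logits, target)
    # KL divergence between the current temperature-scaled prediction
    # and the stored soft labels; tau**2 rescales gradients as in
    # standard knowledge distillation.
    kd = F.kl_div(
        F.log_softmax(logits / tau, dim=1),
        soft_labels,
        reduction="batchmean",
    ) * (tau ** 2)
    return alpha * ce + (1.0 - alpha) * kd
```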
Usage
```python
lwr = LWR(
    k=1,                                             # Number of epochs (interval) between soft-label updates
    update_rate=0.9,                                 # Rate at which the true-label weight is decayed
    num_batches_per_epoch=len(dataset) // batch_size,
    dataset_length=len(dataset),
    output_shape=(10,),                              # Number of classes
    tau=5,                                           # Temperature -- just leave it at 5 if you don't know what you're doing
    max_epochs=20,                                   # Max number of epochs
    softmax_dim=1,                                   # Axis for softmax
)
```
```python
data, target = data.to(device), target.to(device)
optimizer.zero_grad()
output = model(data)
loss = lwr(batch_idx, output, target, eval=False)  # LWR expects LOGITS
loss.backward()
optimizer.step()
```