
LabelAwareRanked-Loss

In this repository, we implement several loss functions for metric learning: Triplet loss, Multiclass-N-pair loss, Constellation loss, and LabelAwareRanked (LAR) loss. All losses are tested with the same smart batch structure; batches are selected using BalancedBatchSampler.
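To give a feel for this family of losses, here is a minimal sketch of the triplet variant on plain coordinate tuples. This is an illustration only, not the repository's implementation, which operates on batched network embeddings.

```python
import math

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Hinge-style triplet loss (illustrative sketch).

    Pulls the anchor toward the positive sample and pushes it away from
    the negative sample until the distance gap exceeds `margin`.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return max(dist(anchor, positive) - dist(anchor, negative) + margin, 0.0)

# Anchor near the positive and far from the negative: the margin is
# satisfied, so the loss is zero.
print(triplet_loss((0.0, 0.0), (0.1, 0.0), (3.0, 0.0)))  # 0.0
```

The other losses generalize this idea to many positives and negatives per batch, which is why a balanced batch sampler is needed.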

Random Data Experiment

The following image shows randomly generated datapoints on a unit circle that become clustered and ranked in uniform angles after applying and optimizing the LAR loss.

In addition to the clustering, we can also see that the loss converges to its minimum, which is achieved at uniform angles with the correct ranking between the different labels.

The experiment on a randomly generated dataset shows that the LAR loss produces ranked embeddings at uniform angles as it approaches the optimal solution. The experiment can be executed with the following command:

python gradient_descent_rnd_data.py --num_classes 7 --num_data 1000 --num_iter 1000 --plot True

  • "num_classes" --- number of classes in the dataset (int)
  • "num_data" --- number of datapoints (int)
  • "num_iter" --- number of iterations (int)
  • "plot" --- "True" or "False" to enable the plots
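To see what "uniform angles" means for the converged solution, the sketch below computes the hypothetical target positions for K classes on the unit circle, spaced 2π/K apart. The actual positions are found by the gradient descent in gradient_descent_rnd_data.py; this only illustrates the geometry of the optimum.

```python
import math

def uniform_angle_targets(num_classes):
    """Unit-circle points at uniform angular spacing of 2*pi/num_classes.

    Illustrative only: the LAR optimum described above places the class
    clusters at such uniformly spaced angles.
    """
    step = 2 * math.pi / num_classes
    return [(math.cos(k * step), math.sin(k * step)) for k in range(num_classes)]

targets = uniform_angle_targets(7)
# Every target lies on the unit circle.
assert all(abs(x * x + y * y - 1.0) < 1e-9 for x, y in targets)
```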

MNIST Data Experiment

For MNIST, the experiment can be executed with the following script. There are two parameters you can choose:

  • "loss": 4 choices for loss function
    • "triplet" ---- Triplet loss
    • "npair" --- Multiclass-N-pair loss
    • "constellation" --- Constellation loss
    • "lar" --- LabelAwareRanked loss
  • "num_epochs" --- number of epochs (int)

Run the code from the command line as follows:

python main.py --loss lar --num_epochs 10
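The two flags above could be parsed with argparse roughly as sketched below. This is a hypothetical reconstruction; main.py's actual argument handling may differ.

```python
import argparse

# Hypothetical sketch of main.py's command-line interface: one choice of
# loss function and an integer epoch count, matching the flags shown above.
parser = argparse.ArgumentParser(description="Metric-learning losses on MNIST")
parser.add_argument("--loss",
                    choices=["triplet", "npair", "constellation", "lar"],
                    default="lar",
                    help="which loss function to train with")
parser.add_argument("--num_epochs", type=int, default=10,
                    help="number of training epochs")

# Parse the example invocation from the README.
args = parser.parse_args(["--loss", "lar", "--num_epochs", "10"])
print(args.loss, args.num_epochs)  # lar 10
```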
