M5-methods

Data, benchmarks, and submissions made to the M5 forecasting competition

"Accuracy Submissions": Includes the forecasts of the 24 benchmarks of the M5 Accuracy competition and the submissions made by the top-50 performing methods

"Uncertainty Submissions": Includes the forecasts of the 6 benchmarks of the M5 Uncertainty competition and the submissions made by the top-50 performing methods

"Dataset": Includes the dataset of the competition, i.e., unit sales (train and test set) and information about calendar, promotions, and prices. The dataset is also available for R users in a .Rdata format.

"validation": Includes the code used for producing the forecasts of the benchmarks (both Accuracy and Uncertainty competitions)

"Scores and Ranks.xlsx": Includes the scores and ranks of the top-50 submissions of the M5 Accuracy and M5 Uncertainty competition. The scores of the benchmarks is also provided.

"M5-Competitors-Guide.pdf": Provides information about the set-up of the competition, the dataset, the evaluation measures, the prizes, the submission files and the benchmarks.

"M5_accuracy_competition.pdf": Presents the results, findings and conclusions of the M5 Accuracy competition.
