Work in progress
Repository hosting the code for the IEEE paper "Causal Quantification of Cannibalization during Promotional Sales in Grocery Retail".
To arrange the raw CSV data into the organised dataset (01.02.2021)
Dunnhumby_arrange_store_sales.ipynb
To summarise the sales per store (prior to the analysis)
Dunnhumby_Summarise_store_sales.ipynb
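The per-store summarisation step above amounts to a grouped aggregation. A minimal sketch with pandas, assuming Dunnhumby-style column names (`STORE_ID`, `WEEK_NO`, `SALES_VALUE`); the notebook's actual schema and aggregations may differ:

```python
# Hypothetical per-store weekly sales summary (column names assumed).
import pandas as pd

transactions = pd.DataFrame({
    "STORE_ID":    [1, 1, 2, 2],
    "WEEK_NO":     [1, 2, 1, 2],
    "SALES_VALUE": [100.0, 120.0, 80.0, 90.0],
})

# One row per (store, week) with the total sales value.
store_sales = (transactions
               .groupby(["STORE_ID", "WEEK_NO"], as_index=False)
               ["SALES_VALUE"].sum())
```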
Base notebook to calculate the cannibalisation
Dunnhumby_CausalImpact_Analysis_base.ipynb
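The core idea of the analysis is counterfactual: predict what the cannibalised product would have sold without the promotion, then subtract. A minimal numpy sketch of that logic, using a plain least-squares fit on control series rather than the Bayesian structural time-series model the CausalImpact notebooks use; all names here are illustrative:

```python
# Counterfactual effect estimate, sketched with ordinary least squares.
import numpy as np

def promo_effect(target, controls, promo_start):
    """Estimate the cumulative sales effect on `target` during a promotion.

    target:      1-D array, sales of the (possibly cannibalised) product
    controls:    2-D array (time x n_controls), series unaffected by the promo
    promo_start: first time index of the promotional period
    """
    X = np.column_stack([np.ones(len(target)), controls])
    # Fit the relationship on the pre-promotion period only.
    beta, *_ = np.linalg.lstsq(X[:promo_start], target[:promo_start],
                               rcond=None)
    counterfactual = X @ beta          # predicted sales without the promo
    effect = target - counterfactual   # pointwise effect
    return effect[promo_start:].sum()  # cumulative effect over the promo
```

A negative value indicates cannibalisation (sales below the counterfactual); a positive one, a halo.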
Use Papermill to create all the department notebooks from a base one.
- base one: runner_papermill_causal_impact_covariates.ipynb. It calculates all the potential cannibals and haloes.
- autogenerated: papermill_unsupervised_{DEPARTMENT}_{STORE}.ipynb
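The fan-out above could be driven as sketched below; the `DEPARTMENT`/`STORE` parameter names are assumptions inferred from the autogenerated file names, and the department/store values are purely illustrative:

```python
# Hypothetical Papermill driver for the per-department notebooks.
DEPARTMENTS = ["GROCERY", "DAIRY"]  # illustrative values
STORES = [49, 44]                   # illustrative values

def papermill_jobs(departments, stores,
                   base="runner_papermill_causal_impact_covariates.ipynb"):
    """Return (input, output, parameters) triples, one per notebook to run."""
    return [
        (base,
         f"papermill_unsupervised_{d}_{s}.ipynb",
         {"DEPARTMENT": d, "STORE": s})
        for d in departments
        for s in stores
    ]

if __name__ == "__main__":
    import papermill as pm  # pip install papermill
    for inp, out, params in papermill_jobs(DEPARTMENTS, STORES):
        pm.execute_notebook(inp, out, parameters=params)
```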
The chart showing the STL decomposition of the total sales is generated with CFAV_store_sales_projection(paper).ipynb
To summarise the sales per store (prior to the analysis), CFAV_Summarise_store_sales.ipynb
To summarise all the results, summarise_all_causal_results.ipynb
For Dunnhumby data, use Dunnhumby_summarise_all_causal_results.ipynb
To run the surrogate model experiment, Surrogate_model_experiment_paper.ipynb
To show how the promotions are selected, CFAV_show_selection_for_CausalImpact.ipynb
To produce the cannibalisation episode plot, CFAV_CausalImpact_Analysis_Dairy_one_case(paper).ipynb
To produce the cannibalisation episode plot from the Dunnhumby data, Dunnhumby_CausalImpact_Analysis_Paper_plot.ipynb
To produce the graph used in the paper, CFAV-causal_impact_GROCERY_I_Pichincha_49_A_11(graph-paper).ipynb
The repo is structured as follows:
.
├── README.md <- This file ;)
│
├── src
│   │
│   ├── notebooks <- Collection of notebooks
│   ├── notebooks/preprocessing_envelope_for_seasonality.ipynb <- STL preprocessing
│   ├── notebooks/
│   └── notebooks/
To generate the package from the source:
python3 setup.py sdist bdist_wheel
The wheel can be directly imported in Databricks.