Predictions and Test & Score: MAPE produces a value where it should fail #7041
Labels: bug (A bug confirmed by the core team), needs discussion (Core developers need to discuss the issue), snack (This will take an hour or two)
What's wrong?
The formula for the Mean Absolute Percentage Error (see here or here) involves a division by zero whenever the test data contains actual values that are zero.
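For reference, the standard definition (transcribed here, since the pages linked above are not reproduced):

$$\mathrm{MAPE} = \frac{100\%}{n}\sum_{i=1}^{n}\left|\frac{A_i - F_i}{A_i}\right|,$$

where $A_i$ are the actual and $F_i$ the predicted values. It is undefined whenever any $A_i = 0$.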
Yet Predictions and Test & Score produce a (very large) value for MAPE if there are zeroes in the test data.
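One plausible explanation for the very large value, offered as a guess rather than a confirmed reading of Orange's code: if the widgets delegate to scikit-learn's mean_absolute_percentage_error, that function clamps the denominator at machine epsilon instead of failing, so every zero-valued actual contributes a term of roughly |error|/eps, on the order of 10^15:

```python
import numpy as np
from sklearn.metrics import mean_absolute_percentage_error

y_true = np.array([0.0, 2.0, 4.0])   # one actual value is zero
y_pred = np.array([1.0, 2.5, 3.5])

# scikit-learn uses max(|y_true|, eps) as the denominator, so the zero
# row contributes |0 - 1| / eps ~= 4.5e15 instead of raising an error.
print(mean_absolute_percentage_error(y_true, y_pred))  # ~1.5e15
```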
How can we reproduce the problem?
In the attached workflow, the Formula widget computes the Absolute Percentage Error for every record; MAPE is the mean of these values over all rows. As expected, Formula fails when rows where the target equals zero are included, yet Predictions and Test & Score still produce very large values for MAPE.
MAPE 0div.zip
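The per-row computation the Formula widget performs can be sketched outside Orange with plain numpy (illustrative numbers, not the workflow's data); a naive division blows up on the zero target:

```python
import numpy as np

actual = np.array([0.0, 2.0, 4.0])     # one target value is zero
predicted = np.array([1.0, 2.5, 3.5])

# Absolute Percentage Error per row; dividing by the zero actual
# yields inf (numpy warns instead of raising), so the mean is inf
# rather than the large finite number the widgets report.
ape = np.abs((actual - predicted) / actual)
print(ape.mean())  # inf
```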
Perhaps my idea to include MAPE wasn't a very good one, since it is a problematic measure. Consider adding sMAPE (Python code provided here), or even replacing MAPE with it.
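Since the linked Python code is not reproduced above, here is a minimal sketch of one common sMAPE variant; note that it is still undefined when an actual value and its forecast are both zero:

```python
import numpy as np

def smape(actual, predicted):
    """Symmetric MAPE in percent: mean of 2|F - A| / (|A| + |F|).

    Tolerates zero actuals as long as the forecast is nonzero, but is
    still undefined when actual and forecast are both zero.
    """
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 100 * np.mean(2 * np.abs(predicted - actual)
                         / (np.abs(actual) + np.abs(predicted)))

print(smape([0.0, 2.0, 4.0], [1.0, 2.5, 3.5]))  # finite despite a zero actual
```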