
Duplicates in the Square catalog when hitting a rate limiting error #108

Open
a-danae opened this issue Apr 23, 2024 · 3 comments · May be fixed by #246
Labels: priority: low (The issue/PR is low priority—not many people are affected or there's a workaround, etc.), type: bug (The issue is a confirmed bug.)

Comments

a-danae commented Apr 23, 2024

Describe the bug

During catalog syncing, duplicates are created in the Square catalog when hitting a rate limiting error on Square's end.

Square's support insights on the duplicates:

It does look like there is something off with these requests - the rate limiting error coming from us is not entirely correct, as some portion of the batch upsert request is successfully completing. This is why duplicates are being created: following the RATE_LIMITED error, WooCommerce believes the entire request was not successful and retries it, but some of the items were already successfully created.

To reproduce

Still to be replicated on our end, but the merchant is syncing a large set of products with Square—enough to hit the rate limit.

Expected behavior

If a rate limiting error is hit, resuming the sync must not create duplicates in the Square catalog.

Additional details

From Nic Wilson:

The rate limit belongs to the Square account, which can be doing things other than interacting with WooCommerce. In this case, the account is also syncing the catalogue to Square Online. This means the limit is not something we can manage, so it is not really our issue; Square acknowledges that.
However, the pickup and retry after the limit is applied is creating duplicate objects. This appears to be because the entire batch is rerun, even though half the batch may have been successful before the failure. My guess is that reusing the idempotency key helps prevent duplication of catalogue items.

Related idempotency key issue #106

8051230-zd-a8c
p1713482527612829-slack-C7U3Y3VMY

@nicdwilson

Additional input from Square regarding 8051230-zd-a8c

Essentially, the rate limiting error is not real in this case; one of the batches in the request is being rate limited by a prior batch not releasing the ‘catalog lock’, which is put into place for concurrency reasons.

This endpoint is also fairly problematic in general. Different batches in the request can succeed or fail independently, but for non-200 response scenarios the error handling is not sufficient for your client to determine which batches failed versus succeeded.

The team is currently considering longer-term fixes to this endpoint. In the interim, to get this merchant unblocked, we recommended:

- When receiving a 429 or 5XX API response from Square, the integration should retry with the same idempotency_key used in the original request. This will prevent retries from creating duplicate items if some of the items were actually successfully created in the original request (see the sketch after this list).

- Adjust the integration to only upsert a single batch in each request, since the error handling for failure cases with multiple batches is not currently being done well on our side.
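
A minimal sketch of the recommended retry pattern, assuming a standalone helper built on the WordPress HTTP API rather than the plugin's own request layer. The function name, backoff, and token handling are illustrative assumptions, not the plugin's actual code; the key point is that the idempotency_key is generated once per payload and reused on every retry.

// Illustrative sketch only — not the plugin's implementation.
// Retries a Square batch upsert with the SAME idempotency key on 429/5xx,
// so items that already succeeded in a partial batch are not created again.
function my_square_batch_upsert_with_retry( $access_token, array $batches, $max_attempts = 3 ) {
	// Generate the key once and reuse it for every retry of this payload.
	$idempotency_key = wp_generate_uuid4();

	$body = wp_json_encode(
		array(
			'idempotency_key' => $idempotency_key,
			'batches'         => $batches,
		)
	);

	$response = null;

	for ( $attempt = 1; $attempt <= $max_attempts; $attempt++ ) {
		$response = wp_remote_post(
			'https://connect.squareup.com/v2/catalog/batch-upsert',
			array(
				'headers' => array(
					'Authorization' => 'Bearer ' . $access_token,
					'Content-Type'  => 'application/json',
				),
				'body'    => $body,
				'timeout' => 30,
			)
		);

		if ( is_wp_error( $response ) ) {
			return $response;
		}

		$code = (int) wp_remote_retrieve_response_code( $response );

		// Retry only on rate limiting (429) or server errors (5xx),
		// reusing the same idempotency_key already baked into $body.
		if ( 429 === $code || $code >= 500 ) {
			sleep( min( pow( 2, $attempt ), 30 ) ); // simple exponential backoff
			continue;
		}

		return $response;
	}

	return $response;
}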

Note related issue #106

vikrampm1 added the priority: high and type: bug labels on Apr 24, 2024
@nicdwilson

Working with the Square team, we reduced the batch size, which still produced rate limit errors. We then, on advice from Square, reduced the upsert to a single batch, which worked correctly.

In the end, the workaround code used was:

// First attempt: cap the number of catalog objects sent per upsert request.
add_filter( 'wc_square_sync_max_objects_per_upsert', 'adjust_my_square_upserts_batch', 10, 1 );

function adjust_my_square_upserts_batch( $max ) {
	// Adjust as needed; 250 was the reduced batch size tried first.
	$max = 250;
	return $max;
}

// Second attempt, on Square's advice: limit each upsert request to a single object.
add_filter( 'wc_square_sync_max_objects_per_upsert', 'adjust_my_square_upserts', 10, 1 );

function adjust_my_square_upserts( $objects_per_upsert ) {
	// Adjust as needed; 1 avoids the multi-batch error-handling problem.
	$objects_per_upsert = 1;
	return $objects_per_upsert;
}
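
Note that as written both callbacks are attached to the same wc_square_sync_max_objects_per_upsert filter at the same priority, so the second callback runs last and the effective value is 1; to use only the larger cap of 250, remove the second filter.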

vikrampm1 added the priority: low label and removed the priority: high label on Jun 14, 2024
@nicdwilson

8901408-zd-a8c
