diff --git a/devrev-wrike-snapin.plain b/devrev-wrike-snapin.plain
index 6f3d045..540c2b9 100644
--- a/devrev-wrike-snapin.plain
+++ b/devrev-wrike-snapin.plain
@@ -8,161 +8,323 @@
 - The Space is a Wrike space.
-- The Project is a Wrike project.
+- The Data Model of The API is as follows:
+  ```
+  The Data Model
+  ├─ The Account
+  │  └─ The List of Users
+  │     └─ The User
+  ├─ The List of Spaces
+  │  └─ The Space
+  │     └─ The List of Folders
+  │        └─ The Folder
+  │           └─ The List of Tasks
+  │              └─ The Task
+  │                 ├─ The List of Comments
+  │                 │  └─ The Comment
+  │                 └─ The List of Attachments
+  │                    └─ The Attachment
+  ```
+
+- The Fetched Users are The List of Users that are fetched from The API.
+
+- The Fetched Tasks are The List of Tasks that are fetched from The API.
+
+- The Fetched Comments are The List of Comments that are fetched from The API.
+
+- The Fetched Folders are The List of Folders that are fetched from The API.
+
+- The Fetched Attachments are The List of Attachments that are fetched from The API. One task from The Fetched Tasks can have multiple attachments.
-- The Fetched Projects are The List of Projects that are fetched from The API.
-
-- The Fetched Contacts are The List of Contacts that are fetched from The API.
+***Non-Functional Requirements:***
-- The Fetched Tasks are The The List of Tasks of The Project that are fetched from The API.
+- {% include "templates/internal_client.plain", external_system_name: "Wrike" %}
+- The Wrike API Key is provided in `event["payload"]["connection_data"]["key"]`. This is an OAuth 2.0 Bearer token.
-***Non-Functional Requirements:***
+***Test Requirements:***
-- The Wrike API Key is provided in the "key" field of the "connection_data" object which is part of "payload" object.
+- Credentials should be read from the environment. The following environment variables are required: `WRIKE_API_KEY` (The Wrike API Key), `WRIKE_SPACE_ID` (The Space ID).
-- The Space ID is provided in the "org_id" field of the "connection_data" object which is part of "payload" object.
+- When generating The Acceptance Test with The Attached JSON Of The Acceptance Test, The Conformance Tests *MUST*:
+  - Replace the `WRIKE_API_KEY` placeholder in The Attached JSON Of The Acceptance Test with the actual Wrike API Key.
+  - Replace the `WRIKE_SPACE_ID` placeholder in The Attached JSON Of The Acceptance Test with the actual Space ID.
+- "external_sync_unit_id" "IEAGS6BYI5RFMPP7" can be used when a folder ID is required for testing purposes.
-***Test Requirements:***
+- Task ID "IEAGS6BYKRRFMPQG" can be used when a task ID is required for testing attachments.
-- Credentials should be read from the environment. The following environment variables are required: `WRIKE_API_KEY` (The Wrike API Key) and `WRIKE_SPACE_GID` (The Space ID).
+## Data Fetching
-- "external_sync_unit_id" "IEAGS6BYI5RFMPPY" can be used when space ID is required for testing purposes.
 ***Definitions:***
-## The Boilerplate Code
+- {% include "templates/data_fetching_return_expectation.plain" %}
 ***Non-Functional Requirements:***
-- The Postman collection [wrike_postman.json](wrike_postman.json) describes The API.
+- The resource [wrike_postman.json](wrike_postman.json) describes The API.
+  - Based on the resource [wrike_postman.json](wrike_postman.json), calculate the `api_delay` for the Functions.
-- The Postman Request Fields is a query parameter with key `fields` and value as a list of strings. The Postman Request Fields are optional and are defined per request basis. 
The Postman Request Fields for each request is determined solely by The Postman Collection. - -- If a given request does not have a query parameter with key `fields`, do not include it in the request. If a given request has a query parameter with key `fields`, include only the fields that are referenced by The Postman Request Fields for the given request in the "fields" query parameter. Do not include or make up any additional fields not specified in The Postman Collection. +- The Functions being implemented *must not* invoke The Spawn Method nor The Worker Thread. +- {% include "templates/should_not_modify_extraction_function.plain" %} ***Functional Requirements:*** -- Implement The Function that provides a check if authentication with The API works. +- Implement The Function "check_authentication" that provides a check if authentication with The API works. Authentication should be checked by making a request to the endpoint "/contacts", and providing "me=true" as a query parameter. + ***Acceptance Tests:*** -## Data Fetching + - Test the function "check_authentication". Expect the API response (`:= api_response`) to equal `api_response["data"][0]["id"]="KUAUZTPW"`. -***Non-Functional Requirements:*** + - {% include "templates/test_rate_limiting_during_data_extraction.plain", function_name: "check_authentication" %} -- The Postman collection [wrike_postman.json](wrike_postman.json) describes The API. +- Implement The Function called "fetch_space_folders" that uses The API to fetch The Fetched Folders using the endpoint "/spaces/{spaceId}/folders". + - You *should not* use `projects=true` query param. + - The Space ID is provided in `event["payload"]["connection_data"]["org_id"]`. -- The Postman Request Fields is a query parameter with key `fields` and value as a list of strings. The Postman Request Fields are optional and are defined per request basis. The Postman Request Fields for each request is determined solely by The Postman Collection. + ***Acceptance Tests:*** -- If a given request does not have a query parameter with key `fields`, do not include it in the request. If a given request has a query parameter with key `fields`, include only the fields that are referenced by The Postman Request Fields for the given request in the "fields" query parameter. Do not include or make up any additional fields not specified in The Postman Collection. + - Test The Function "fetch_space_folders". Expect the number of The Fetched Folders to be 3. -- The Mapping is a method in The Implementation Code that maps the fields in The API response to the fields in The Function's output. The Mapping should output "snake_case" for JSON keys. The Mapping should map every single field from The OpenAPI Specification to The Function's output. + - {% include "templates/test_rate_limiting_during_data_extraction.plain", function_name: "fetch_space_folders" %} -- If The Function uses The API to fetch data, The Mapping should be used along its rules. +- Implement The Function "fetch_users" that uses The API to fetch The List of Users (The Fetched Users) using the endpoint "/contacts?types=[Person]". -***Functional Requirements:*** + ***Acceptance Tests:*** + + - When using The Test Wrike Credentials, expect exactly 4 users in the result of The Function. 
+
+  - {% include "templates/test_rate_limiting_during_data_extraction.plain", function_name: "fetch_users" %}
+
+- Implement The Function called "fetch_folder_tasks" that uses The API to fetch The Fetched Tasks for a given folder using the endpoint "/folders/{folderId}/tasks".
+  - The Folder ID is provided in `event["payload"]["event_context"]["external_sync_unit_id"]`.
+  - One of the query params must be "fields=[responsibleIds]".
+  - "pageSize" and "nextPageToken" should be provided in `event["input_data"]["global_values"]["pageSize"]` and `event["input_data"]["global_values"]["nextPageToken"]`.
+  - The following query parameters should also be supported:
+    - "updatedDate" (optional, a timestamp in ISO 8601 UTC format, can be used for filtering tasks by updated date)
 ***Acceptance Tests:***
-  - When using The Test Wrike Credentials a project with the title "First project" must be in the result of The Function.
+  - Test The Function in multiple steps:
+    - Step 1: Call the "fetch_folder_tasks" function with Folder ID "IEAGS6BYI5RFMPP7" and a "pageSize" of 100.
+    - Step 2: Expect the API response (`:= api_response`) to have an `api_response["nextPageToken"]` field and `len(api_response["data"])=100`.
+    - Step 3: Call the "fetch_folder_tasks" function again with Folder ID "IEAGS6BYI5RFMPP7", a "pageSize" of 100 and the "nextPageToken" value received in step 2. Expect the API response to have `len(api_response["data"])=10`.
-- Implement The Function that uses The API to fetch The List of Contacts of The Space (The Fetched Contacts). This list can be retrieved using the endpoint "spaces/{spaceId}?fields=[members]". The retrieved list contains objects representing contacts. The rest of information about contacts can be fetched from /contacts/{contactIds}?types=[Person].
+  - Test The Function "fetch_folder_tasks" with Folder ID "IEAGS6BYI5RFMPP7". From the API response (`:= api_response`), expect that every element of `api_response["data"]` contains the field `"responsibleIds"`.
+
+  - {% include "templates/test_rate_limiting_during_data_extraction.plain", function_name: "fetch_folder_tasks" %}
+
+- Implement The Function called "fetch_task_attachments" that uses The API to fetch The Fetched Attachments for a given task using the endpoint "/tasks/{taskId}/attachments", with the "withUrls" query parameter set to true.
 ***Acceptance Tests:***
-  - When using The Test Wrike Credentials, The Function must return 3 members with their primaryEmail, firstName and lastName.
+  - Test The Function "fetch_task_attachments" with Task ID "IEAGS6BYKRRFMPQG". Let `api_response` be the API response. Expect `len(api_response["data"])=1` and `api_response["data"][0]["name"]="Proof this image.jpg"`.
-- Implement The Function that uses The API to fetch The List of Tasks of The Project (The Fetched Tasks) using the endpoint "/folders/{projectId}/tasks". The projectId is provided in the "external_sync_unit_id" field of the "event_context" dictionary which is part of "payload" dictionary.
+  - {% include "templates/test_rate_limiting_during_data_extraction.plain", function_name: "fetch_task_attachments" %}
+
+- Implement The Function called "fetch_task_comments" that uses The API to fetch The Fetched Comments for a given task using the endpoint "/tasks/{taskId}/comments". 
***Acceptance Tests:*** - - When using The Test Wrike Credentials and The Project ID "IEAGS6BYI5RFMPP7", 10 tasks should be fetched in the result of The Function. + - Test The Function "fetch_task_comments" with Task ID "IEAGS6BYKRRFMPQG". Expect the number of The Fetched Comments to be 2. -## Generate 'external_domain_metadata.json' + - {% include "templates/test_rate_limiting_during_data_extraction.plain", function_name: "fetch_task_comments" %} -***Definitions:*** -- The structure of The External Domain Metadata JSON object is specified by the JSON schema defined in the resource [external_domain_metadata_schema.json](external_domain_metadata_schema.json). - - Please note that all refers_to keys in reference and typed_reference fields must use the format "#record:", where matches a key in the top-level record_types. This ensures references are unambiguous and valid. Use {} as the value to refer by default identifier (id), or specify { "by_field": "field_name" } to use a custom key. - - If an external record type has some concept of states, use the resource [Metadata extraction](docs/metadata-extraction.mdx) for instructions on how to describe it. +## Generate 'initial_domain_mapping.json' and 'external_domain_metadata.json' -***Non-Functional Requirements:*** +### Generate 'external_domain_metadata.json' -- Store The External Domain Metadata JSON object as a separate JSON file. +{% include "templates/external_domain_metadata_boilerplate.plain" %} ***Functional Requirements:*** -- Implement The Function that generates and returns The External Domain Metadata JSON object. The record types that should be included in the The External Domain Metadata are The List of Tasks of The Project ('tasks') and The List of Contacts of The Space ('users'). +- Implement The Function 'get_external_domain_metadata' that generates and returns The External Domain Metadata JSON object. The External Domain Metadata JSON object should have the record type 'users'. + {% include "mappings/external_domain_metadata_users.plain" %} - ***Acceptance Tests:*** +- The External Domain Metadata JSON object should include the record type 'tasks', while preserving any existing record types. + {% include "mappings/external_domain_metadata_tasks.plain" %} + +- The External Domain Metadata JSON object should include the record type 'comments', while preserving any existing record types. + {% include "mappings/external_domain_metadata_comments.plain" %} + +### Generate 'initial_domain_mapping.json' - - Validate generated The External Domain Metadata JSON object with the chef-cli tool (The Chef CLI) using the command "chef-cli validate-metadata": - - This command reads JSON object from stdin. - - The path to The Chef CLI executable is provided in the environment variables `CHEF_CLI_PATH`. - - For the test to be successful The Chef CLI must return an empty output. - - The test should always fail if The Chef CLI is not available. +{% include "templates/initial_domain_mapping_boilerplate.plain" %} -## Generate 'initial_domain_mapping.json' +***Functional Requirements:*** + +- Implement The Function that generates and returns The Initial Domain Mapping JSON object. The Initial Domain Mapping JSON object should have record_type_mappings "users". + {% include "mappings/initial_domain_mapping_users.plain" %} + +- The Initial Domain Mapping JSON object should have record_type_mappings "tasks", while preserving any existing mappings. 
+ {% include "mappings/initial_domain_mapping_tasks.plain" %} + +- The Initial Domain Mapping JSON object should have record_type_mappings "comments", while preserving any existing mappings. + {% include "mappings/initial_domain_mapping_comments.plain" %} + +## Pushing data to DevRev servers ***Definitions:*** -- The structure of The Initial Domain Mapping JSON object is specified by the JSON schema defined in the resource [initial_mappings_schema.yaml](initial_mappings_schema.yaml). - - For a complete list of supported DevRev object types and their fields, see resource [Supported DevRev object types for Airdrop](docs/supported-object-types.md). - - For information about transformation methods, see resource [Mapping Reasons](docs/mapping-reasons.mdx). - - When working with devrev_leaf_type, be aware that the schema expects different formats depending on context. In most places, it should be passed as an object with object_category and object_type. However, in the airdrop-recipe-create-possible-record-type-mapping context, it must be a string representing only the object_type. - - Please note that mappings are split into separate ‘shards’ - one for each record type - for easier manipulation and storage. - - Please note that a leaf type or a concrete id has to be selected for use_devrev_record, but not both. +- Resource [data-extraction.md](docs/data-extraction.md) should serve as a generic guide for the implementation of The Extraction Function. + +- Refer to the resource [data-extraction-rules-for-emitting-events.md](docs/data-extraction-rules-for-emitting-events.md) for the rules for emitting events for The Extraction Function. + +- TheExtractionStateObject is a state passed to The Worker Thread. The following structure for `TheExtractionStateObject` should be used: + - "users": + - "completed" (boolean, required, indicating if the users data has been fully pushed) + - "tasks": + - "completed" (boolean, required, indicating if the tasks data has been fully pushed) + - "nextPageToken" (string, optional, indicating the token to use when fetching the next page of The Fetched Tasks) + - "modifiedSince" (string, optional, timestamp used for incremental data synchronization to fetch only the tasks that have been updated since the last sync) + - "comments": + - "completed" (boolean, required, indicating if the comments data has been fully pushed) + - "attachments": + - "completed" (boolean, required, indicating if the attachments data has been fully pushed) ***Non-Functional Requirements:*** -- Store The Initial Domain Mapping JSON object as a separate JSON file. +- The resource [wrike_postman.json](wrike_postman.json) describes The API. + - The Normalization Function should take the relevant record type from The External Domain Metadata JSON object and map all resources from The API to the corresponding record type in The Normalization Function. +{% include "templates/spawn_method_instructions.plain" %} + +- Requests to The API *MUST NOT* be mocked. + +- If employing pagination, you should always set the parameter "limit" to 100 (`ThePaginationLimit`) when calling The API. + +- If employing pagination for during data extraction phase, here's how you should do it: + - While `TheExtractionStateObject[]["completed"]=false`: + - Fetch from The API (`:= TheExpectedResourceResponse`) with query params: + - `pageSize=ThePaginationLimit` + - `nextPageToken=TheExtractionStateObject[]["nextPageToken"]` if it exists and is not empty. 
+    - If `len(TheExpectedResourceResponse["data"]) < ThePaginationLimit`:
+      - `TheExtractionStateObject[<record_type>]["completed"]=true`
+      - `TheExtractionStateObject[<record_type>]["nextPageToken"]=""`
+    - Else:
+      - `TheExtractionStateObject[<record_type>]["nextPageToken"]=TheExpectedResourceResponse["nextPageToken"]`
 ***Functional Requirements:***
-- Implement The Function that generates and returns The Initial Domain Mapping JSON object.
+> External sync units
+
+- If "event_type" equals "EXTRACTION_EXTERNAL_SYNC_UNITS_START", The Extraction Function should implement the "external sync units extraction" part of the extraction workflow as described in the resource [external-sync-units-extraction.mdx](docs/external-sync-units-extraction.mdx):
+  - Retrieve The Fetched Folders
+  - Push The Fetched Folders as external sync units using the following mapping from the fields in The Fetched Folders to the fields in The External Sync Units:
+    - "id" should map to "id".
+    - "title" should map to "name".
+    - "description" should map to "description".
+    - "item_type" should have a fixed value of "tasks".
+    - No other fields should be used.
 ***Acceptance Tests:***
-  - Validate generated The Initial Domain Mapping JSON object with the chef-cli tool (The Chef CLI) using the command "chef-cli initial-mapping check -m ":
-    - The Chef CLI reads The Initial Domain Mapping JSON object from stdin.
-    - The path to The Chef CLI executable is provided in the environment variables `CHEF_CLI_PATH`.
-    - The test passes only if the "RemainingDeficiencies" key in the returned JSON output has a value of null. If the value is anything other than null the test must fail.
-    - The test should always fail if The Chef CLI is not available.
+  - {% include "templates/external_sync_unit_acceptance_test.plain", resource_name: "external_sync_unit_check.json", expected_external_sync_unit_count: 3, expected_external_sync_unit_name: "First project" %}
-## Pushing data to DevRev servers
+> Metadata extraction
-***Non-Functional Requirements:***
+- If "event_type" equals "EXTRACTION_METADATA_START" The Extraction Function should implement the "metadata extraction" part of the extraction workflow as described in the resource [metadata-extraction.mdx](docs/metadata-extraction.mdx). Please note that The External Domain Metadata JSON should not be normalized when pushed to the repository.
-- All operations accessing external services or data providers must be implemented in worker threads.
-***Functional Requirements:***
+> Data extraction - USERS
-- When spawning a new worker, pass the parameter object to the spawn function that includes the "initialDomainMapping" key with The Initial Domain Mapping JSON object as the value. Do not pass this under "options" key but directly in the top-most level of the parameter object.
+- If "event_type" equals "EXTRACTION_DATA_START" or "EXTRACTION_DATA_CONTINUE" The Extraction Function should push the 'users' data. To push the 'users' data, it should:
+  - If `TheExtractionStateObject["users"]["completed"]=false`:
+    - Fetch The Fetched Users
+    - Push The Fetched Users to the repository designated for 'users' data.
+    - Set `TheExtractionStateObject["users"]["completed"]=true`
+
+  ***Acceptance Tests:***
-- When spawning a new worker, pass the parameter object to the spawn function that does not contain the "options" key. 
+ - {% include "templates/data_extraction_acceptance_test.plain", resource_name: "data_extraction_test.json", artifact_name: "users", expected_item_count: 4 %} -- Implement The Function named "extraction" (The Extraction Function) that pushes The Fetched Projects as external sync units if "event_type" equals "EXTRACTION_EXTERNAL_SYNC_UNITS_START". + - {% include "templates/test_rate_limiting_during_emitting_events.plain", input_event_name: "EXTRACTION_DATA_START", resource_name: "data_extraction_test.json", expected_output_event_type: "EXTRACTION_DATA_DELAY" %} + + - Validate The Normalization Function with the chef-cli tool (The Chef CLI) for the "users" record type: + - Firstly, invoke The Extraction Function using the resource [data_extraction_test.json](test_data/data_extraction_test.json). + {% include "templates/chef_cli_normalization_validation.plain", record_type: "users" %} + + +> Data extraction - TASKS + +- If "event_type" equals "EXTRACTION_DATA_START" or "EXTRACTION_DATA_CONTINUE" The Extraction Function should push the 'tasks' and 'users' data. To push the 'tasks' data, it should: + - If `TheExtractionStateObject["tasks"]["completed"]=false`: + - Fetch The Fetched Tasks using pagination (The Tasks Iteration). For every page: + - Push the Fetched Tasks to repository designated for 'tasks' data. + - Update the "nextPageToken" parameter for "tasks". + - Mark the "tasks" data as completed. + - Wrike specific: When fetching The Fetched Tasks use the query param 'fields', where the value is an array of strings, which contains 'hasAttachments' (but it can also contain other values). ***Acceptance Tests:*** - - Test The Extraction Function using the resource [external_sync_unit_check.json](test_data/external_sync_unit_check.json). Make sure to replace placeholders in the resource with The Wrike API Key and The Space ID. For the test to be successful The Callback Server **must** receive from DevRev **exactly one** event with "event_type" that equals "EXTRACTION_EXTERNAL_SYNC_UNITS_DONE". The test must not send the event directly to The Callback Server. + - {% include "templates/data_extraction_acceptance_test.plain", resource_name: "data_extraction_test.json", artifact_name: "tasks", expected_item_count: 110 %} -- If "event_type" equals "EXTRACTION_EXTERNAL_SYNC_UNITS_START" The Extraction Function should fetch the tasks count for each project and push it as part of the external sync units. + - {% include "templates/data_extraction_acceptance_test.plain", resource_name: "data_extraction_continue_test.json", artifact_name: "tasks", expected_item_count: 110 %} -- If "event_type" equals "EXTRACTION_METADATA_START" The Extraction Function should implement the "metadata extraction" part of the extraction workflow by pushing The External Domain Metadata JSON object to the repository called 'external_domain_metadata'. Please note that The External Domain Metadata JSON object shouldn't be normalized when pushed to the repository. 
+ - {% include "templates/test_rate_limiting_during_emitting_events.plain", input_event_name: "EXTRACTION_DATA_START", resource_name: "data_extraction_test.json", expected_output_event_type: "EXTRACTION_DATA_DELAY" %} -- If "event_type" equals "EXTRACTION_DATA_START" The Extraction Function should: - - push The Fetched Contacts to the repository named 'users' - - push The Fetched Tasks to the repository designated for 'tasks' data - (but make sure that a single "EXTRACTION_DATA_DONE" event is emitted) + - Validate The Normalization Function with the chef-cli tool (The Chef CLI) for the "tasks" record type: + - Firstly, invoke The Extraction Function using the resource [data_extraction_test.json](test_data/data_extraction_test.json). + {% include "templates/chef_cli_normalization_validation.plain", record_type: "tasks" %} + + +> Data extraction - TASK ATTACHMENTS + +- If "event_type" equals "EXTRACTION_DATA_START" or "EXTRACTION_DATA_CONTINUE" The Extraction Function should push the 'attachments', 'tasks' and 'users' data. To push the 'attachments' data, it should: + - If `TheExtractionStateObject["tasks"]["completed"]=false`: + - Inside The Tasks Iteration. For every page: + - Extract The Fetched Attachments from The Fetched Tasks. + - Push The Fetched Attachments to the repository designated for 'attachments' data. + - Mark the "attachments" data as completed. ***Acceptance Tests:*** - - Test The Extraction Function using the resource [data_extraction_test.json](test_data/data_extraction_test.json). Test is successful if The Callback Server receives from DevRev a **single** event with "event_type" that equals "EXTRACTION_DATA_DONE". The test must not send event directly to The Callback Server. + - {% include "templates/data_extraction_acceptance_test.plain", resource_name: "data_extraction_test.json", artifact_name: "attachments", expected_item_count: 2 %} + +> Data extraction - TASK COMMENTS - - Test The Extraction Function using the resource [data_extraction_test.json](test_data/data_extraction_test.json). Test is successful if The Callback Server does not receive from DevRev any event with "event_type" that equals "EXTRACTION_DATA_ERROR". The test must not send event directly to The Callback Server. +- If "event_type" equals "EXTRACTION_DATA_START" or "EXTRACTION_DATA_CONTINUE" The Extraction Function should push the 'users', 'tasks', 'attachments' and 'comments' data. To push the 'comments' data, it should: + - If `TheExtractionStateObject["comments"]["completed"]=false`: + - Inside The Tasks Iteration. For every page: + - Extract The Fetched Comments from The Fetched Tasks. + - Push The Fetched Comments to the repository designated for 'comments' data. + - Mark the "comments" data as completed. + + ***Acceptance Tests:*** + + - {% include "templates/data_extraction_acceptance_test.plain", resource_name: "data_extraction_test.json", artifact_name: "comments", expected_item_count: 2 %} + + - {% include "templates/test_rate_limiting_during_emitting_events.plain", input_event_name: "EXTRACTION_DATA_START", resource_name: "data_extraction_test.json", expected_output_event_type: "EXTRACTION_DATA_DELAY" %} + + +> Attachments extraction - If "event_type" equals "EXTRACTION_ATTACHMENTS_START" or "EXTRACTION_ATTACHMENTS_CONTINUE" The Extraction Function should implement attachment extraction as described in the resource [attachments-extraction.md](docs/attachments-extraction.md). 
+
+  ***Acceptance Tests:***
+
+  - {% include "templates/attachment_extraction_acceptance_test.plain", data_extraction_resource_name: "data_extraction_test.json", attachment_extraction_resource_name: "attachments_extraction_test.json", expected_attachment_count: 2 %}
+
+  - Test The Extraction Function using the resource [attachments_extraction_continue_test.json](test_data/attachments_extraction_continue_test.json).
+    - Expect The Callback Server to receive from DevRev a **single** event with "event_type" that equals "EXTRACTION_ATTACHMENTS_DONE".
+
+  - {% include "templates/test_rate_limiting_during_emitting_events.plain", input_event_name: "EXTRACTION_ATTACHMENTS_START", resource_name: "attachments_extraction_test.json", expected_output_event_type: "EXTRACTION_ATTACHMENTS_DONE" %}
+
+
+> INCREMENTAL MODE
+
+> - If "event_type" equals "EXTRACTION_DATA_START", The Extraction Function should support incremental data synchronization as described in the resource [incremental_mode.md](docs/incremental_mode.md).
+> - Incremental mode should only work for "tasks" and their corresponding "attachments" and "comments" data. If `event["payload"]["event_context"]["mode"]=SyncMode.INCREMENTAL`, set:
+>   - `TheExtractionStateObject["tasks"]["modifiedSince"]=adapter.state.lastSuccessfulSyncStarted`
+>   - `TheExtractionStateObject["tasks"]["completed"]=false`
+>   - `TheExtractionStateObject["attachments"]["completed"]=false`
+>   - `TheExtractionStateObject["comments"]["completed"]=false`
+> - Based on the field "updatedDate", you should adjust the API call to fetch only The Fetched Tasks that have been updated after the time of the last successful sync.
+> - Note: In incremental mode, you should push only the filtered tasks and their corresponding attachments and comments to the DevRev servers.
+
+> ***Acceptance Tests:***
+
+> {% include "templates/incremental_mode_acceptance_tests.plain" %}
\ No newline at end of file
diff --git a/mappings/external_domain_metadata_comments.plain b/mappings/external_domain_metadata_comments.plain
new file mode 100644
index 0000000..3065a7f
--- /dev/null
+++ b/mappings/external_domain_metadata_comments.plain
@@ -0,0 +1,6 @@
+- The record type 'comments' (Name: Comments) should have the following fields:
+  - text (display name: "Text", is required, type: rich text)
+  - author_id (display name: "Author ID", is required, type: reference)
+    - Field author_id refers to the record type "#record:users".
+  - task_id (display name: "Task ID", is required, type: reference)
+    - Field task_id refers to the record type "#record:tasks".
diff --git a/mappings/external_domain_metadata_tasks.plain b/mappings/external_domain_metadata_tasks.plain
new file mode 100644
index 0000000..8cadbaf
--- /dev/null
+++ b/mappings/external_domain_metadata_tasks.plain
@@ -0,0 +1,8 @@
+- The record type 'tasks' (Name: Tasks) should have the following fields:
+  - title (display name: "Title", is required, type: text)
+  - description (display name: "Description", is required, type: rich text)
+  - status (display name: "Status", is required, type: enum)
+  - permalink (display name: "URL", is required, type: text)
+  - responsible_ids (display name: "Responsible IDs", is required, type: reference)
+    - Field responsible_ids refers to the record type "#record:users".
+    - The type of field responsible_ids is an array with max_length 1, which should be used as an array value. 
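+
+- For orientation, a minimal sketch of how the 'tasks' fields above might appear in The External Domain Metadata JSON object, shown as a Python dict standing in for the JSON file. The key names ("record_types", "is_required", "refers_to", "collection") are assumptions inferred from the referenced schema file, not verified against it:
+  ```python
+  # Hypothetical shape only; validate the real file with chef-cli as specified.
+  tasks_record_type = {
+      "record_types": {
+          "tasks": {
+              "name": "Tasks",
+              "fields": {
+                  "title": {"name": "Title", "type": "text", "is_required": True},
+                  "description": {"name": "Description", "type": "rich_text", "is_required": True},
+                  "status": {"name": "Status", "type": "enum", "is_required": True},
+                  "permalink": {"name": "URL", "type": "text", "is_required": True},
+                  "responsible_ids": {
+                      "name": "Responsible IDs",
+                      "type": "reference",
+                      "is_required": True,
+                      "reference": {"refers_to": {"#record:users": {}}},
+                      "collection": {"max_length": 1},
+                  },
+              },
+          }
+      }
+  }
+  ```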
\ No newline at end of file
diff --git a/mappings/external_domain_metadata_users.plain b/mappings/external_domain_metadata_users.plain
new file mode 100644
index 0000000..9b3b1bc
--- /dev/null
+++ b/mappings/external_domain_metadata_users.plain
@@ -0,0 +1,4 @@
+- The record type 'users' (Name: Users) should have the following fields:
+  - full_name (display name: "Full Name", is required, type: text)
+  - email (display name: "Email", is required, type: text)
+  - title (display name: "Title", is optional, type: text)
\ No newline at end of file
diff --git a/mappings/initial_domain_mapping_comments.plain b/mappings/initial_domain_mapping_comments.plain
new file mode 100644
index 0000000..245f2b0
--- /dev/null
+++ b/mappings/initial_domain_mapping_comments.plain
@@ -0,0 +1,11 @@
+- The record_type_mappings "comments" should have the following properties:
+  - The default mapping should map each external comment to a "comment" object.
+  - There should be a single "possible_record_type_mappings" element, specifying:
+    - The mapping is one-way (reverse is false, forward is true).
+    - There should be no custom fields in the mapping.
+    - The following The Stock Field Mapping Fields should be mapped using The External Transformation Method:
+      - field "text" should be mapped to "body" (rich text).
+      - field "author_id" should be mapped to "created_by_id" (use directly).
+      - field "author_id" should be mapped to "modified_by_id" (use directly).
+      - field "task_id" should be mapped to "parent_object_id" (use directly).
+
diff --git a/mappings/initial_domain_mapping_tasks.plain b/mappings/initial_domain_mapping_tasks.plain
new file mode 100644
index 0000000..f9c81da
--- /dev/null
+++ b/mappings/initial_domain_mapping_tasks.plain
@@ -0,0 +1,20 @@
+- The record_type_mappings "tasks" should have the following properties:
+  - The default mapping should map each external task to an "issue" object.
+  - There should be a single "possible_record_type_mappings" element, specifying:
+    - The mapping is one-way (reverse is false, forward is true).
+    - There should be no custom fields in the mapping.
+    - The following The Stock Field Mapping Fields should be mapped using The External Transformation Method:
+      - field "title" should be mapped to "title"
+      - field "permalink" should be mapped to "item_url_field"
+      - field "description" should be mapped to "body" (rich text)
+      - field "responsible_ids" should be mapped to "owned_by_ids" (use directly)
+    - The following The Stock Field Mapping Fields should be mapped using The Fixed Transformation Method:
+      - field "priority" should contain the fixed value "P2"
+    - The following The Stock Field Mapping Fields should be mapped using The DevRev Record Transformation Method:
+      - field "applies_to_part_id" should refer to the "product" object type
+    - The following The Stock Field Mapping Fields should be mapped using The Map Enum Transformation Method:
+      - field "status" should be mapped to "stage" in the following way:
+        - "Active" maps to "in_development"
+        - "Completed" maps to "completed"
+        - "Deferred" maps to "backlog"
+        - "Cancelled" maps to "wont_fix"
\ No newline at end of file
diff --git a/mappings/initial_domain_mapping_users.plain b/mappings/initial_domain_mapping_users.plain
new file mode 100644
index 0000000..9e321ab
--- /dev/null
+++ b/mappings/initial_domain_mapping_users.plain
@@ -0,0 +1,9 @@
+- The record_type_mappings "users" should have the following properties:
+  - The default mapping should map each external user to a "devu" user object. 
+  - There should be a single "possible_record_type_mappings" element, specifying:
+    - The mapping is one-way (reverse is false, forward is true).
+    - There should be no custom fields in the mapping.
+    - The following The Stock Field Mapping Fields should be mapped using The External Transformation Method:
+      - field "full_name" should be mapped to "full_name".
+      - field "email" should be mapped to "email".
+      - field "title" should be mapped to "display_name".
\ No newline at end of file
diff --git a/rate_limiting_proxy.py b/rate_limiting_proxy.py
new file mode 100644
index 0000000..b19c317
--- /dev/null
+++ b/rate_limiting_proxy.py
@@ -0,0 +1,356 @@
+import socket
+import threading
+import socketserver
+import time
+import sys
+import ssl
+import json
+from urllib.parse import urlparse
+
+# Rate limiting settings
+TOKEN_BUCKET_CAPACITY = 100  # requests
+REFILL_RATE = 10  # requests per second
+
+# ============================================================================
+# SERVICE-SPECIFIC CONFIGURATION: Customize this section for your integration
+# ============================================================================
+# This configuration mimics Trello's rate limiting response format.
+# When adapting this proxy for a different third-party service, modify these
+# settings to match that service's 429 response behavior.
+# ============================================================================
+
+RATE_LIMIT_DELAY = 3  # seconds - Time to wait before retrying
+
+class RateLimiterState:
+    """A thread-safe class to manage the global rate limiting state."""
+    def __init__(self):
+        self.lock = threading.Lock()
+        self.rate_limiting_active = False
+        self.test_name = None
+
+    def start_rate_limiting(self, test_name):
+        with self.lock:
+            self.rate_limiting_active = True
+            self.test_name = test_name
+
+    def end_rate_limiting(self):
+        with self.lock:
+            self.rate_limiting_active = False
+            self.test_name = None
+
+    def is_rate_limiting_active(self):
+        with self.lock:
+            return self.rate_limiting_active, self.test_name
+
+rate_limiter_state = RateLimiterState()
+
+class TokenBucket:
+    """A thread-safe token bucket for rate limiting."""
+    def __init__(self, capacity, refill_rate):
+        self.capacity = float(capacity)
+        self.refill_rate = float(refill_rate)
+        self.tokens = float(capacity)
+        self.last_refill = time.time()
+        self.lock = threading.Lock()
+
+    def consume(self, tokens):
+        """Consumes tokens from the bucket. Returns True if successful, False otherwise."""
+        with self.lock:
+            now = time.time()
+            time_since_refill = now - self.last_refill
+            new_tokens = time_since_refill * self.refill_rate
+            self.tokens = min(self.capacity, self.tokens + new_tokens)
+            self.last_refill = now
+
+            if self.tokens >= tokens:
+                self.tokens -= tokens
+                return True
+            return False
+
+rate_limiter = TokenBucket(TOKEN_BUCKET_CAPACITY, REFILL_RATE)
+
+def create_rate_limit_response():
+    """
+    TODO: Adapt this based on the 3rd party service's rate limiting response format.
+
+    ========================================================================
+    SERVICE-SPECIFIC: Customize this function for your third-party service
+    ========================================================================
+
+    Generates the 429 Rate Limit response matching the third-party service's
+    format. 
Different services may use different: + - Response body structures (e.g., {"detail": "..."} vs {"error": "..."}) + - Retry-After header formats (HTTP date vs seconds) + - Error messages and field names + + This implementation matches Trello's rate limiting response format. + + Returns: + tuple: (status_code, status_message, response_body_dict, headers_dict) + """ + retry_after = RATE_LIMIT_DELAY + + response_body = { + "errorDescription": "Rate limit exceeded, try again later", + "error": "rate_limit_exceeded" + } + headers = {"Retry-After": retry_after} + + return 429, "Too Many Requests", response_body, headers + +class ProxyHandler(socketserver.BaseRequestHandler): + """Handles incoming proxy requests.""" + def handle(self): + if not rate_limiter.consume(1): + print("Rate limit exceeded. Dropping connection.") + try: + self.request.sendall(b'HTTP/1.1 429 Too Many Requests\r\n\r\n') + except OSError: + pass # Client might have already closed the connection. + finally: + self.request.close() + return + + try: + data = self.request.recv(4096) + except ConnectionResetError: + return # Client closed connection. + + if not data: + return + + first_line = data.split(b'\r\n')[0] + try: + method, target, _ = first_line.split() + except ValueError: + print(f"Could not parse request: {first_line}") + self.request.close() + return + + print(f"Received request: {method.decode('utf-8')} {target.decode('utf-8')}") + + path = target.decode('utf-8') + # Check for control plane endpoints on the proxy itself + if path.startswith(('/start_rate_limiting', '/end_rate_limiting')): + self.handle_control_request(method, path, data) + return + + # Check if global rate limiting is active + is_active, test_name = rate_limiter_state.is_rate_limiting_active() + if is_active: + print(f"Rate limiting is active for test: '{test_name}'. Blocking request.") + + # Generate service-specific rate limit response + status_code, status_message, response_body, headers = create_rate_limit_response() + self.send_json_response(status_code, status_message, response_body, headers=headers) + return + + if method == b'CONNECT': + self.handle_connect(target) + else: + self.handle_http_request(target, data) + + def get_request_body(self, data): + header_end = data.find(b'\r\n\r\n') + if header_end != -1: + return data[header_end + 4:].decode('utf-8') + return "" + + def send_json_response(self, status_code, status_message, body_json, headers=None): + body_bytes = json.dumps(body_json).encode('utf-8') + + response_headers = [ + f"HTTP/1.1 {status_code} {status_message}", + "Content-Type: application/json", + f"Content-Length: {len(body_bytes)}", + "Connection: close", + ] + + if headers: + for key, value in headers.items(): + response_headers.append(f"{key}: {value}") + + response_headers.append("") + response_headers.append("") + + response = '\r\n'.join(response_headers).encode('utf-8') + body_bytes + try: + self.request.sendall(response) + except OSError: + pass # Client might have closed the connection. 
+ finally: + self.request.close() + + def handle_control_request(self, method, path, data): + if method != b'POST': + self.send_json_response(405, "Method Not Allowed", {"error": "Only POST method is allowed"}) + return + + if path == '/start_rate_limiting': + body_str = self.get_request_body(data) + if not body_str: + self.send_json_response(400, "Bad Request", {"error": "Request body is missing or empty"}) + return + try: + body_json = json.loads(body_str) + test_name = body_json.get('test_name') + if not test_name or not isinstance(test_name, str): + self.send_json_response(400, "Bad Request", {"error": "'test_name' is missing or not a string"}) + return + except json.JSONDecodeError: + self.send_json_response(400, "Bad Request", {"error": "Invalid JSON in request body"}) + return + + rate_limiter_state.start_rate_limiting(test_name) + response_body = {"status": f"rate limiting started for test: {test_name}"} + self.send_json_response(200, "OK", response_body) + + elif path == '/end_rate_limiting': + rate_limiter_state.end_rate_limiting() + response_body = {"status": "rate limiting ended"} + self.send_json_response(200, "OK", response_body) + else: + self.send_json_response(404, "Not Found", {"error": "Endpoint not found"}) + + def handle_http_request(self, target, data): + """Handles HTTP requests like GET, POST, etc.""" + try: + parsed_url = urlparse(target.decode('utf-8')) + host = parsed_url.hostname + port = parsed_url.port + if port is None: + port = 443 if parsed_url.scheme == 'https' else 80 + except Exception as e: + print(f"Could not parse URL for HTTP request: {target}. Error: {e}") + self.request.close() + return + + if not host: + print(f"Invalid host in URL: {target}") + self.request.close() + return + + try: + remote_socket = socket.create_connection((host, port), timeout=10) + if parsed_url.scheme == 'https': + context = ssl.create_default_context() + remote_socket = context.wrap_socket(remote_socket, server_hostname=host) + except (socket.error, ssl.SSLError) as e: + print(f"Failed to connect or SSL wrap to {host}:{port}: {e}") + self.request.close() + return + + # Modify the request to use a relative path and force connection closing + # This ensures each request gets its own connection and is logged. + header_end = data.find(b'\r\n\r\n') + if header_end == -1: + # If no header-body separator is found, assume it's a simple request with no body. + header_end = len(data) + + header_data = data[:header_end] + body = data[header_end:] + + lines = header_data.split(b'\r\n') + first_line = lines[0] + headers = lines[1:] + + method, _, http_version = first_line.split(b' ', 2) + + path = parsed_url.path or '/' + if parsed_url.query: + path += '?' + parsed_url.query + + new_first_line = b' '.join([method, path.encode('utf-8'), http_version]) + + new_headers = [] + for header in headers: + # Remove existing connection-related headers, as we're forcing it to close. 
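+            # ("Connection" and "Proxy-Connection" are hop-by-hop headers; forwarding
+            # them could let the upstream keep the connection alive, so each request
+            # is rewritten to carry "Connection: close" and get its own connection.)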
+            if not header.lower().startswith(b'connection:') and \
+               not header.lower().startswith(b'proxy-connection:'):
+                new_headers.append(header)
+        new_headers.append(b'Connection: close')
+
+        modified_header_part = new_first_line + b'\r\n' + b'\r\n'.join(new_headers)
+        modified_request = modified_header_part + body
+
+        try:
+            remote_socket.sendall(modified_request)
+        except OSError:
+            remote_socket.close()
+            return
+
+        self.tunnel(self.request, remote_socket)
+
+    def handle_connect(self, target):
+        """Handles CONNECT requests for HTTPS traffic."""
+        try:
+            host, port_str = target.split(b':')
+            port = int(port_str)
+        except ValueError:
+            print(f"Invalid target for CONNECT: {target}")
+            self.request.close()
+            return
+
+        try:
+            remote_socket = socket.create_connection((host.decode('utf-8'), port), timeout=10)
+        except socket.error as e:
+            print(f"Failed to connect to {host.decode('utf-8')}:{port}: {e}")
+            self.request.close()
+            return
+
+        try:
+            self.request.sendall(b'HTTP/1.1 200 Connection Established\r\n\r\n')
+        except OSError:
+            remote_socket.close()
+            return
+
+        self.tunnel(self.request, remote_socket)
+
+    def tunnel(self, client_socket, remote_socket):
+        """Tunnels data between the client and the remote server."""
+        stop_event = threading.Event()
+
+        def forward(src, dst):
+            try:
+                while not stop_event.is_set():
+                    data = src.recv(4096)
+                    if not data:
+                        break
+                    dst.sendall(data)
+            except OSError:
+                pass
+            finally:
+                stop_event.set()
+
+        client_thread = threading.Thread(target=forward, args=(client_socket, remote_socket))
+        remote_thread = threading.Thread(target=forward, args=(remote_socket, client_socket))
+
+        client_thread.start()
+        remote_thread.start()
+
+        client_thread.join()
+        remote_thread.join()
+
+        client_socket.close()
+        remote_socket.close()
+
+class ThreadingTCPServer(socketserver.ThreadingMixIn, socketserver.TCPServer):
+    daemon_threads = True
+    allow_reuse_address = True
+
+def main():
+    HOST, PORT = "localhost", 8004
+
+    try:
+        server = ThreadingTCPServer((HOST, PORT), ProxyHandler)
+        print(f"Starting proxy server on {HOST}:{PORT}")
+        server.serve_forever()
+    except Exception as e:
+        print(f"Could not start proxy server: {e}", file=sys.stderr)
+        # The script `run_devrev_snapin_conformance_tests.sh` checks for exit code 69.
+        sys.exit(69)
+
+if __name__ == "__main__":
+    main()
\ No newline at end of file
diff --git a/templates/incremental_mode_acceptance_tests.plain b/templates/incremental_mode_acceptance_tests.plain
new file mode 100644
index 0000000..9c19936
--- /dev/null
+++ b/templates/incremental_mode_acceptance_tests.plain
@@ -0,0 +1,31 @@
+- Validate the incremental mode with the following flow:
+  - Step 1:
+    - Execute request `curl -X POST "http://localhost:8003/external-worker.update-last-successful-sync/<sync_unit_id>" -H "Content-Type: application/json" -d '{"snap_in_version_id":"test-version-id","extend_state":{"users":{"completed":true},"tasks":{"completed":true},"attachments":{"completed":true},"comments":{"completed":true}}}'`, where `<sync_unit_id>` is `event["payload"]["event_context"]["sync_unit_id"]` in the resource [data_extraction_test.json](test_data/data_extraction_test.json).
+  - Step 2:
+    - Execute request `curl -g -X PUT -H 'Authorization: bearer <WRIKE_API_KEY>' -d 'title=<title>' 'https://www.wrike.com/api/v4/tasks/MAAAAABlGv5B'` (replace `<WRIKE_API_KEY>` and `<title>` with the actual values).
+    - Expect the request to succeed.
+  - Step 3:
+    - Invoke The Extraction Function using the resource [data_extraction_incremental_test.json](test_data/data_extraction_incremental_test.json). 
+    - Expect The Callback Server to receive from DevRev a **single** event with "event_type" that equals "EXTRACTION_DATA_DONE".
+    - Let `artifact_array = <event>["event_data"]["artifacts"]` (array of objects).
+    - Expect `artifact_array` not to be an empty array.
+    - Out of `artifact_array`, expect one of the elements to have "item_type" equal to "tasks" (`:= tasks_artifact`). Expect `tasks_artifact["item_count"]=1`.
+    - Expect that there is no element in `artifact_array` with "item_type" equal to "attachments". If there is, this indicates that the attachments data was pushed to the DevRev servers, which is wrong.
+    - Expect that there is no element in `artifact_array` with "item_type" equal to "users". If there is, this indicates that the users data was pushed to the DevRev servers, which is wrong.
+    - Expect that there is no element in `artifact_array` with "item_type" equal to "comments". If there is, this indicates that the comments data was pushed to the DevRev servers, which is wrong.
+
+- Validate the incremental mode with the following flow:
+  - Step 1:
+    - Execute request `curl -X POST "http://localhost:8003/external-worker.update-last-successful-sync/<sync_unit_id>" -H "Content-Type: application/json" -d '{"snap_in_version_id":"test-version-id","extend_state":{"users":{"completed":true},"tasks":{"completed":true},"attachments":{"completed":true},"comments":{"completed":true}}}'`, where `<sync_unit_id>` is `event["payload"]["event_context"]["sync_unit_id"]` in the resource [data_extraction_test.json](test_data/data_extraction_test.json).
+  - Step 2:
+    - Execute request `curl -g -X PUT -H 'Authorization: bearer <WRIKE_API_KEY>' -d 'title=<title>' 'https://www.wrike.com/api/v4/tasks/IEAGS6BYKRRFMPQE'` (replace `<WRIKE_API_KEY>` and `<title>` with the actual values).
+    - Expect the request to succeed.
+  - Step 3:
+    - Invoke The Extraction Function using the resource [data_extraction_incremental_test.json](test_data/data_extraction_incremental_test.json).
+    - Expect The Callback Server to receive from DevRev a **single** event with "event_type" that equals "EXTRACTION_DATA_DONE".
+    - Let `artifact_array = <event>["event_data"]["artifacts"]` (array of objects).
+    - Expect `artifact_array` not to be an empty array.
+    - Out of `artifact_array`, expect one of the elements to have "item_type" equal to "tasks" (`:= tasks_artifact`). Expect `tasks_artifact["item_count"]=1`.
+    - Out of `artifact_array`, expect one of the elements to have "item_type" equal to "attachments" (`:= attachments_artifact`). Expect `attachments_artifact["item_count"]=1`.
+    - Out of `artifact_array`, expect one of the elements to have "item_type" equal to "comments" (`:= comments_artifact`). Expect `comments_artifact["item_count"]=3`.
+    - Expect that there is no element in `artifact_array` with "item_type" equal to "users". If there is, this indicates that the users data was pushed to the DevRev servers, which is wrong. 
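+
+- For illustration only: a rough Python sketch of how the "modifiedSince" state these flows exercise could be turned into the "updatedDate" query parameter when fetching tasks. The helper name and the `state` object are hypothetical stand-ins for the snap-in's real HTTP client and `TheExtractionStateObject`, and whether Wrike expects a bare ISO 8601 timestamp or a `{"start": ...}` range for "updatedDate" should be confirmed against the Postman collection:
+  ```python
+  import json
+  import urllib.parse
+  import urllib.request
+
+  def fetch_folder_tasks_page(api_key, folder_id, state):
+      # Always page with the spec's page size of 100 and request responsibleIds.
+      params = {"pageSize": 100, "fields": "[responsibleIds]"}
+      next_page_token = state["tasks"].get("nextPageToken")
+      if next_page_token:
+          params["nextPageToken"] = next_page_token
+      modified_since = state["tasks"].get("modifiedSince")
+      if modified_since:
+          # Incremental mode: only tasks updated since the last successful sync.
+          params["updatedDate"] = modified_since
+      url = ("https://www.wrike.com/api/v4/folders/" + folder_id + "/tasks?"
+             + urllib.parse.urlencode(params))
+      request = urllib.request.Request(url, headers={"Authorization": f"Bearer {api_key}"})
+      with urllib.request.urlopen(request) as response:
+          return json.load(response)
+  ```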
\ No newline at end of file diff --git a/test_data/attachments_extraction_continue_test.json b/test_data/attachments_extraction_continue_test.json new file mode 100644 index 0000000..e14acc1 --- /dev/null +++ b/test_data/attachments_extraction_continue_test.json @@ -0,0 +1,75 @@ +[ + { + "payload": { + "connection_data": { + "key": "", + "key_type": "", + "org_id": "", + "org_name": "First Space" + }, + "event_context": { + "callback_url": "http://localhost:8002/callback", + "dev_oid": "DEV-36shCCBEAA", + "dev_org": "DEV-36shCCBEAA", + "dev_org_id": "DEV-36shCCBEAA", + "dev_uid": "DEVU-6", + "dev_user": "DEVU-6", + "dev_user_id": "DEVU-6", + "event_type_adaas": "", + "external_sync_unit": "688725dad59c015ce052eecf", + "external_sync_unit_id": "688725dad59c015ce052eecf", + "external_sync_unit_name": "cards-pagination-test-2025-07-28-092514", + "external_system": "6752eb95c833e6b206fcf388", + "external_system_id": "6752eb95c833e6b206fcf388", + "external_system_name": "Wrike", + "external_system_type": "ADaaS", + "import_slug": "wrike-snapin-devrev", + "initial_sync_scope": "full-history", + "mode": "INITIAL", + "request_id": "ff894fd5-2290-42bb-9f89-0785e49b4049", + "request_id_adaas": "ff894fd5-2290-42bb-9f89-0785e49b4049", + "run_id": "cbbe2419-1f86-4737-aa78-6bb7118ce52c", + "sequence_version": "17", + "snap_in_slug": "wrike-snapin-devrev", + "snap_in_version_id": "don:integration:dvrv-eu-1:devo/36shCCBEAA:snap_in_package/787b97af-95a8-4b57-809e-8d55f4e72f40:snap_in_version/50d4660e-dad9-41D6-9169-8a7e96b2d7fa", + "sync_run": "cbbe2419-1f86-4737-aa78-6bb7118ce52c", + "sync_run_id": "cbbe2419-1f86-4737-aa78-6bb7118ce52c", + "sync_tier": "sync_tier_2", + "sync_unit": "don:integration:dvrv-eu-1:devo/36shCCBEAA:external_system_type/ADAAS:external_system/6752eb95c833e6b206fcf388:sync_unit/984c894e-71e5-4e94-b484-40b839c9a916", + "sync_unit_id": "984c894e-71e5-4e94-b484-40b839c9a916", + "uuid": "ff894fd5-2290-42bb-9f89-0785e49b4049", + "worker_data_url": "http://localhost:8003/external-worker" + }, + "event_type": "EXTRACTION_ATTACHMENTS_CONTINUE" + }, + "context": { + "dev_oid": "don:identity:dvrv-eu-1:devo/36shCCBEAA", + "automation_id": "", + "source_id": "", + "snap_in_id": "don:integration:dvrv-eu-1:devo/36shCCBEAA:snap_in/04bf12fa-57bd-4057-b0b0-ed3f42d9813e", + "snap_in_version_id": "don:integration:dvrv-eu-1:devo/36shCCBEAA:snap_in_package/787b97af-95a8-4b57-809e-8d55f4e72f40:snap_in_version/50d4660e-dad9-41D6-9169-8a7e96b2d7fa", + "service_account_id": "don:identity:dvrv-eu-1:devo/36shCCBEAA:svcacc/101", + "secrets": { + "service_account_token": "test-service-account-token" + }, + "user_id": "don:identity:dvrv-eu-1:devo/36shCCBEAA:devu/6", + "event_id": "", + "execution_id": "4481432207487786275" + }, + "execution_metadata": { + "request_id": "ff894fd5-2290-42bb-9f89-0785e49b4049", + "function_name": "extraction", + "event_type": "EXTRACTION_ATTACHMENTS_START", + "devrev_endpoint": "http://localhost:8003" + }, + "input_data": { + "global_values": {}, + "event_sources": {}, + "keyrings": null, + "resources": { + "keyrings": {}, + "tags": {} + } + } + } +] \ No newline at end of file diff --git a/test_data/attachments_extraction_test.json b/test_data/attachments_extraction_test.json new file mode 100644 index 0000000..12b993d --- /dev/null +++ b/test_data/attachments_extraction_test.json @@ -0,0 +1,75 @@ +[ + { + "payload": { + "connection_data": { + "key": "", + "key_type": "", + "org_id": "", + "org_name": "First Space" + }, + "event_context": { + "callback_url": 
"http://localhost:8002/callback", + "dev_oid": "DEV-36shCCBEAA", + "dev_org": "DEV-36shCCBEAA", + "dev_org_id": "DEV-36shCCBEAA", + "dev_uid": "DEVU-6", + "dev_user": "DEVU-6", + "dev_user_id": "DEVU-6", + "event_type_adaas": "", + "external_sync_unit": "688725dad59c015ce052eecf", + "external_sync_unit_id": "688725dad59c015ce052eecf", + "external_sync_unit_name": "cards-pagination-test-2025-07-28-092514", + "external_system": "6752eb95c833e6b206fcf388", + "external_system_id": "6752eb95c833e6b206fcf388", + "external_system_name": "Wrike", + "external_system_type": "ADaaS", + "import_slug": "wrike-snapin-devrev", + "initial_sync_scope": "full-history", + "mode": "INITIAL", + "request_id": "ff894fd5-2290-42bb-9f89-0785e49b4049", + "request_id_adaas": "ff894fd5-2290-42bb-9f89-0785e49b4049", + "run_id": "cbbe2419-1f86-4737-aa78-6bb7118ce52c", + "sequence_version": "17", + "snap_in_slug": "wrike-snapin-devrev", + "snap_in_version_id": "don:integration:dvrv-eu-1:devo/36shCCBEAA:snap_in_package/787b97af-95a8-4b57-809e-8d55f4e72f40:snap_in_version/50d4660e-dad9-41D6-9169-8a7e96b2d7fa", + "sync_run": "cbbe2419-1f86-4737-aa78-6bb7118ce52c", + "sync_run_id": "cbbe2419-1f86-4737-aa78-6bb7118ce52c", + "sync_tier": "sync_tier_2", + "sync_unit": "don:integration:dvrv-eu-1:devo/36shCCBEAA:external_system_type/ADAAS:external_system/6752eb95c833e6b206fcf388:sync_unit/984c894e-71e5-4e94-b484-40b839c9a916", + "sync_unit_id": "984c894e-71e5-4e94-b484-40b839c9a916", + "uuid": "ff894fd5-2290-42bb-9f89-0785e49b4049", + "worker_data_url": "http://localhost:8003/external-worker" + }, + "event_type": "EXTRACTION_ATTACHMENTS_START" + }, + "context": { + "dev_oid": "don:identity:dvrv-eu-1:devo/36shCCBEAA", + "automation_id": "", + "source_id": "", + "snap_in_id": "don:integration:dvrv-eu-1:devo/36shCCBEAA:snap_in/04bf12fa-57bd-4057-b0b0-ed3f42d9813e", + "snap_in_version_id": "don:integration:dvrv-eu-1:devo/36shCCBEAA:snap_in_package/787b97af-95a8-4b57-809e-8d55f4e72f40:snap_in_version/50d4660e-dad9-41D6-9169-8a7e96b2d7fa", + "service_account_id": "don:identity:dvrv-eu-1:devo/36shCCBEAA:svcacc/101", + "secrets": { + "service_account_token": "test-service-account-token" + }, + "user_id": "don:identity:dvrv-eu-1:devo/36shCCBEAA:devu/6", + "event_id": "", + "execution_id": "4481432207487786275" + }, + "execution_metadata": { + "request_id": "ff894fd5-2290-42bb-9f89-0785e49b4049", + "function_name": "extraction", + "event_type": "EXTRACTION_ATTACHMENTS_START", + "devrev_endpoint": "http://localhost:8003" + }, + "input_data": { + "global_values": {}, + "event_sources": {}, + "keyrings": null, + "resources": { + "keyrings": {}, + "tags": {} + } + } + } +] \ No newline at end of file diff --git a/test_data/data_extraction_continue_test.json b/test_data/data_extraction_continue_test.json new file mode 100644 index 0000000..742fa13 --- /dev/null +++ b/test_data/data_extraction_continue_test.json @@ -0,0 +1,76 @@ + +[ + { + "payload": { + "connection_data": { + "key": "", + "key_type": "", + "org_id": "", + "org_name": "First Space" + }, + "event_context": { + "callback_url": "http://localhost:8002/callback", + "dev_oid": "test-dev-oid", + "dev_org": "test-dev-org", + "dev_org_id": "test-dev-org-id", + "dev_uid": "test-dev-uid", + "dev_user": "test-dev-user", + "dev_user_id": "test-dev-user-id", + "event_type_adaas": "", + "external_sync_unit": "test-external_sync_unit", + "external_sync_unit_id": "test-external_sync_unit_id", + "external_sync_unit_name": "test-external_sync_unit_name", + "external_system": 
"test-external_system", + "external_system_id": "test-external_system_id", + "external_system_name": "Wrike", + "external_system_type": "ADaaS", + "import_slug": "airdrop-wrike-snap-in", + "initial_sync_scope": "full-history", + "mode": "INITIAL", + "request_id": "test-request-id", + "request_id_adaas": "test-request-id-adaas", + "run_id": "test-run_id", + "sequence_version": "10", + "snap_in_slug": "wrike-snapin-devrev", + "snap_in_version_id": "don:integration:dvrv-eu-1:devo/test:snap_in_package/test:snap_in_version/test", + "sync_run": "test-sync_run", + "sync_run_id": "test-sync_run_id", + "sync_tier": "sync_tier_2", + "sync_unit": "don:integration:dvrv-eu-1:devo/test:external_system_type/ADAAS:external_system/test:sync_unit/test", + "sync_unit_id": "test-sync_unit_id", + "uuid": "test-uuid", + "worker_data_url": "http://localhost:8003/external-worker" + }, + "event_type": "EXTRACTION_DATA_CONTINUE" + }, + "context": { + "dev_oid": "don:identity:dvrv-eu-1:devo/test", + "automation_id": "", + "source_id": "", + "snap_in_id": "test-don:integration:dvrv-eu-1:devo/test:snap_in/test", + "snap_in_version_id": "test-don:integration:dvrv-eu-1:devo/test:snap_in_package/test:snap_in_version/test", + "service_account_id": "test-don:identity:dvrv-eu-1:devo/test:svcacc/74", + "secrets": { + "service_account_token": "test-service-account-token" + }, + "user_id": "don:identity:dvrv-eu-1:devo/test:devu/6", + "event_id": "", + "execution_id": "test-execution-id" + }, + "execution_metadata": { + "request_id": "test-request-id", + "function_name": "extraction", + "event_type": "EXTRACTION_DATA_CONTINUE", + "devrev_endpoint": "http://localhost:8003" + }, + "input_data": { + "global_values": {}, + "event_sources": {}, + "keyrings": null, + "resources": { + "keyrings": {}, + "tags": {} + } + } + } +] diff --git a/test_data/data_extraction_incremental_test.json b/test_data/data_extraction_incremental_test.json new file mode 100644 index 0000000..3663bd5 --- /dev/null +++ b/test_data/data_extraction_incremental_test.json @@ -0,0 +1,74 @@ +[ + { + "payload": { + "connection_data": { + "key": "", + "key_type": "", + "org_id": "", + "org_name": "First Space" + }, + "event_context": { + "callback_url": "http://localhost:8002/callback", + "dev_oid": "DEV-36shCCBEAA", + "dev_org": "DEV-36shCCBEAA", + "dev_org_id": "DEV-36shCCBEAA", + "dev_uid": "DEVU-1", + "dev_user": "DEVU-1", + "dev_user_id": "DEVU-1", + "event_type_adaas": "", + "external_sync_unit": "688725dad59c015ce052eecf", + "external_sync_unit_id": "688725dad59c015ce052eecf", + "external_sync_unit_name": "SaaS connectors", + "external_system": "6752eb95c833e6b206fcf388", + "external_system_id": "6752eb95c833e6b206fcf388", + "external_system_name": "Wrike", + "external_system_type": "ADaaS", + "import_slug": "wrike-snapin-devrev", + "mode": "INCREMENTAL", + "request_id": "63c6f1c6-eabe-452f-a694-7f23a8f5c3cc", + "request_id_adaas": "63c6f1c6-eabe-452f-a694-7f23a8f5c3cc", + "run_id": "cbbe2419-1f86-4737-aa78-6bb7118ce52c", + "sequence_version": "6", + "snap_in_slug": "wrike-snapin-devrev", + "snap_in_version_id": "don:integration:dvrv-eu-1:devo/36shCCBEAA:snap_in_package/b66dda95-cf9e-48be-918c-8439ecdd548e:snap_in_version/50d4660e-dad9-41d6-9169-8a7e96b2d7fa", + "sync_run": "cbbe2419-1f86-4737-aa78-6bb7118ce52c", + "sync_run_id": "cbbe2419-1f86-4737-aa78-6bb7118ce52c", + "sync_tier": "sync_tier_2", + "sync_unit": 
"don:integration:dvrv-eu-1:devo/36shCCBEAA:external_system_type/ADAAS:external_system/6752eb95c833e6b206fcf388:sync_unit/984c894e-71e5-4e94-b484-40b839c9a916", + "sync_unit_id": "984c894e-71e5-4e94-b484-40b839c9a916", + "uuid": "63c6f1c6-eabe-452f-a694-7f23a8f5c3cc", + "worker_data_url": "http://localhost:8003/external-worker" + }, + "event_type": "EXTRACTION_DATA_START" + }, + "context": { + "dev_oid": "don:identity:dvrv-eu-1:devo/36shCCBEAA", + "automation_id": "", + "source_id": "", + "snap_in_id": "don:integration:dvrv-eu-1:devo/36shCCBEAA:snap_in/03a783b1-5d9f-4af8-b958-e401f2022439", + "snap_in_version_id": "don:integration:dvrv-eu-1:devo/36shCCBEAA:snap_in_package/b66dda95-cf9e-48be-918c-8439ecdd548e:snap_in_version/50d4660e-dad9-41d6-9169-8a7e96b2d7fa", + "service_account_id": "don:identity:dvrv-eu-1:devo/36shCCBEAA:svcacc/42", + "secrets": { + "service_account_token": "test-service-account-token" + }, + "user_id": "don:identity:dvrv-eu-1:devo/36shCCBEAA:devu/1", + "event_id": "", + "execution_id": "13765595327067933408" + }, + "execution_metadata": { + "request_id": "63c6f1c6-eabe-452f-a694-7f23a8f5c3cc", + "function_name": "extraction", + "event_type": "EXTRACTION_DATA_START", + "devrev_endpoint": "http://localhost:8003" + }, + "input_data": { + "global_values": {}, + "event_sources": {}, + "keyrings": null, + "resources": { + "keyrings": {}, + "tags": {} + } + } + } +] \ No newline at end of file diff --git a/test_data/data_extraction_test.json b/test_data/data_extraction_test.json index a8d6429..aba29ab 100644 --- a/test_data/data_extraction_test.json +++ b/test_data/data_extraction_test.json @@ -2,10 +2,10 @@ { "payload": { "connection_data": { - "key": "test-key", + "key": "", "key_type": "", - "org_id": "test-org-id", - "org_name": "My Space" + "org_id": "", + "org_name": "First Space" }, "event_context": { "callback_url": "http://localhost:8002/callback", diff --git a/test_data/external_domain_metadata_event_payload.json b/test_data/external_domain_metadata_event_payload.json new file mode 100644 index 0000000..7a7489b --- /dev/null +++ b/test_data/external_domain_metadata_event_payload.json @@ -0,0 +1,35 @@ + +{ + "payload": { + "connection_data": { + "key": "", + "key_type": "", + "org_id": "", + "org_name": "First Space" + }, + "event_context": { + "callback_url": "http://localhost:8002/callback", + "external_sync_unit_id": "6752eb95c833e6b206fcf388" + } + }, + "context": { + "dev_oid": "test-org-id", + "source_id": "test-source-id", + "snap_in_id": "test-snap-in-id", + "snap_in_version_id": "test-snap-in-version-id", + "service_account_id": "test-service-account-id", + "secrets": { + "service_account_token": "test-token" + } + }, + "execution_metadata": { + "request_id": "63c6f1c6-eabe-452f-a694-7f23a8f5c3cc", + "function_name": "get_external_domain_metadata", + "event_type": "test-event", + "devrev_endpoint": "http://localhost:8003" + }, + "input_data": { + "global_values": {}, + "event_sources": {} + } + } diff --git a/test_data/external_sync_unit_check.json b/test_data/external_sync_unit_check.json index e7fe5e6..c0a68a6 100644 --- a/test_data/external_sync_unit_check.json +++ b/test_data/external_sync_unit_check.json @@ -2,10 +2,10 @@ { "payload": { "connection_data": { - "key": "test-key", + "key": "", "key_type": "", - "org_id": "org-id", - "org_name": "Personal" + "org_id": "", + "org_name": "First Space" }, "event_context": { "callback_url": "http://localhost:8002/callback", diff --git a/wrike_postman.json b/wrike_postman.json index cf0ef2f..106b388 100644 --- 
a/wrike_postman.json +++ b/wrike_postman.json @@ -9,9 +9,9 @@ "request": { "method": "GET", "url": { - "raw": "{{WrikeAPI}}/contacts?deleted=false&fields=[metadata,currentBillRate,currentCostRate,jobRoleId]", + "raw": "https://www.wrike.com/api/v4/contacts?deleted=false&types=[Person]", "host": [ - "{{WrikeAPI}}" + "https://www.wrike.com/api/v4" ], "path": [ "contacts" @@ -22,8 +22,8 @@ "value": "false" }, { - "key": "fields", - "value": "[metadata,currentBillRate,currentCostRate,jobRoleId]" + "key": "types", + "value": "[Person]" } ] } @@ -34,13 +34,13 @@ "request": { "method": "GET", "url": { - "raw": "{{WrikeAPI}}/contacts/KUAFY3BJ", + "raw": "https://www.wrike.com/api/v4/contacts/{contactId}", "host": [ - "{{WrikeAPI}}" + "https://www.wrike.com/api/v4" ], "path": [ "contacts", - "KUAFY3BJ" + "{contactId}" ], "query": [ { @@ -54,7 +54,7 @@ "name": "Get Information about specific contact", "originalRequest": { "method": "GET", - "url": "{{WrikeAPI}}/contacts/KUANFJBJ,NVJKSNJK" + "url": "https://www.wrike.com/api/v4/contacts/{contactId1},{contactId2}" }, "status": "OK", "code": 200, @@ -71,13 +71,13 @@ "request": { "method": "GET", "url": { - "raw": "{{WrikeAPI}}/folders/IEACW7SVI4PX3YZS?fields=[briefDescription,customColumnIds,attachmentCount,contractType]", + "raw": "https://www.wrike.com/api/v4/folders/{folderId}?fields=[briefDescription,customColumnIds,attachmentCount,contractType]", "host": [ - "{{WrikeAPI}}" + "https://www.wrike.com/api/v4" ], "path": [ "folders", - "IEACW7SVI4PX3YZS" + "{folderId}" ], "query": [ { @@ -96,13 +96,14 @@ "request": { "method": "GET", "url": { - "raw": "{{WrikeAPI}}/spaces/{spaceId}/folders?project=true", + "raw": "https://www.wrike.com/api/v4/spaces/{spaceId}/folders", "host": [ - "{{WrikeAPI}}" + "https://www.wrike.com/api/v4" ], "path": [ - "folders", - "IEACW7SVI4PX3YZS" + "spaces", + "{spaceId}", + "folders" ] } }, @@ -110,7 +111,7 @@ "name": "Get Projects in a specific Space", "originalRequest": { "method": "GET", - "url": "{{WrikeAPI}}/spaces/{spaceId}/folders?project=true" + "url": "https://www.wrike.com/api/v4/spaces/{spaceId}/folders" }, "status": "OK", "code": 200, @@ -127,13 +128,13 @@ "request": { "method": "GET", "url": { - "raw": "{{WrikeAPI}}/tasks/IEACW7SVKRAJXB7L?fields=[customItemTypeId,finance,billingType,effortAllocation,responsiblePlaceholderIds,attachmentCount,recurrent]", + "raw": "https://www.wrike.com/api/v4/tasks/{taskId}?fields=[customItemTypeId,finance,billingType,effortAllocation,responsiblePlaceholderIds,attachmentCount,recurrent]", "host": [ - "{{WrikeAPI}}" + "https://www.wrike.com/api/v4" ], "path": [ "tasks", - "IEACW7SVKRAJXB7L" + "{taskId}" ], "query": [ { @@ -149,13 +150,13 @@ "request": { "method": "GET", "url": { - "raw": "{{WrikeAPI}}/tasks/IEACW7SVKQZEBEUN,IEACW7SVKQPX4WHN?fields=[recurrent,attachmentCount,effortAllocation,billingType]", + "raw": "https://www.wrike.com/api/v4/tasks/{taskId1},{taskId2}?fields=[recurrent,attachmentCount,effortAllocation,billingType]", "host": [ - "{{WrikeAPI}}" + "https://www.wrike.com/api/v4" ], "path": [ "tasks", - "IEACW7SVKQZEBEUN,IEACW7SVKQPX4WHN" + "{taskId1},{taskId2}" ], "query": [ { @@ -171,13 +172,13 @@ "request": { "method": "GET", "url": { - "raw": "{{WrikeAPI}}/spaces/IEACW7SVI4O6BDQE/tasks?descendants=true&status=Active&importance=Normal&type=Planned&fields=[recurrent,attachmentCount,effortAllocation,billingType]&dueDate={\"start\":\"2020-07-01\",\"end\":\"2020-07-07\"}", + "raw": 
"https://www.wrike.com/api/v4/spaces/{spaceId}/tasks?descendants=true&status=Active&importance=Normal&type=Planned&fields=[recurrent,attachmentCount,effortAllocation,billingType]&dueDate={\"start\":\"2020-07-01\",\"end\":\"2020-07-07\"}", "host": [ - "{{WrikeAPI}}" + "https://www.wrike.com/api/v4" ], "path": [ "spaces", - "IEACW7SVI4O6BDQE", + "{spaceId}", "tasks" ], "query": [ @@ -204,6 +205,10 @@ { "key": "dueDate", "value": "{\"start\":\"2020-07-01\",\"end\":\"2020-07-07\"}" + }, + { + "key": "updatedDate", + "value": "{\"start\":\"2025-08-20T00:00:00Z\"}" } ] } @@ -214,13 +219,13 @@ "request": { "method": "GET", "url": { - "raw": "{{WrikeAPI}}/folders/IEACW7SVI4OMYFIY/tasks?descendants=true&status=Active&importance=Normal&type=Planned&fields=[recurrent,attachmentCount,effortAllocation,billingType]&dueDate={\"start\":\"2020-07-01\",\"end\":\"2020-07-07\"}&pageSize=200", + "raw": "https://www.wrike.com/api/v4/folders/{folderId}/tasks?descendants=true&status=Active&importance=Normal&type=Planned&fields=[responsibleIds]&dueDate={\"start\":\"2020-07-01\",\"end\":\"2020-07-07\"}&pageSize=200", "host": [ - "{{WrikeAPI}}" + "https://www.wrike.com/api/v4" ], "path": [ "folders", - "IEACW7SVI4OMYFIY", + "{folderId}", "tasks" ], "query": [ @@ -242,7 +247,7 @@ }, { "key": "fields", - "value": "[recurrent,attachmentCount,effortAllocation,billingType]" + "value": "[responsibleIds]" }, { "key": "dueDate", @@ -251,6 +256,14 @@ { "key": "pageSize", "value": "200" + }, + { + "key": "nextPageToken", + "value": "AFGM35QAAAAAUAAAAAAQAAAABIAAAAAB4FVYIMRO4RBAE" + }, + { + "key": "updatedDate", + "value": "{\"start\":\"2025-08-20T00:00:00Z\"}" } ] } @@ -261,13 +274,13 @@ "request": { "method": "GET", "url": { - "raw": "{{WrikeAPI}}/folders/IEACW7SVI4OMYFIY/tasks?descendants=true&subTasks=true", + "raw": "https://www.wrike.com/api/v4/folders/{folderId}/tasks?descendants=true&subTasks=true", "host": [ - "{{WrikeAPI}}" + "https://www.wrike.com/api/v4" ], "path": [ "folders", - "IEACW7SVI4OMYFIY", + "{folderId}", "tasks" ], "query": [ @@ -293,9 +306,9 @@ "request": { "method": "GET", "url": { - "raw": "{{WrikeAPI}}/attachments?versions=true&createdDate={\"start\":\"2020-07-01T00:00:00Z\",\"end\":\"2020-07-02T07:53:33Z\"}&withUrls=true", + "raw": "https://www.wrike.com/api/v4/attachments?versions=true&createdDate={\"start\":\"2020-07-01T00:00:00Z\",\"end\":\"2020-07-02T07:53:33Z\"}&withUrls=true", "host": [ - "{{WrikeAPI}}" + "https://www.wrike.com/api/v4" ], "path": [ "attachments" @@ -322,13 +335,13 @@ "request": { "method": "GET", "url": { - "raw": "{{WrikeAPI}}/folders/IEACW7SVI4PZXTGO/attachments", + "raw": "https://www.wrike.com/api/v4/folders/{folderId}/attachments", "host": [ - "{{WrikeAPI}}" + "https://www.wrike.com/api/v4" ], "path": [ "folders", - "IEACW7SVI4PZXTGO", + "{folderId}", "attachments" ] } @@ -339,16 +352,32 @@ "request": { "method": "GET", "url": { - "raw": "{{WrikeAPI}}/tasks/IEACW7SVKQOKD5EG/attachments", + "raw": "https://www.wrike.com/api/v4/tasks/{taskId}/attachments", "host": [ - "{{WrikeAPI}}" + "https://www.wrike.com/api/v4" ], "path": [ "tasks", - "IEACW7SVKQOKD5EG", + "{taskId}", "attachments" + ], + "query": [ + { + "key": "withUrls", + "value": "true" + } ] } + }, + "response": { + "name": "Get Attachments on Task", + "originalRequest": { + "method": "GET", + "url": "https://www.wrike.com/api/v4/tasks/{taskId}/attachments" + }, + "status": "OK", + "code": 200, + "body": "{\"kind\":\"attachments\",\"data\":[{\"id\":\"IEACW7SVIYEV4HBN\",\"authorId\":\"IEAGS6BY\",\"name\":\"Result 
from test.com\",\"createdDate\":\"2025-07-25T07:53:33Z\",\"version\":\"1\",\"size\":1024,\"type\":\"application/vnd.openxmlformats-officedocument.wordprocessingml.document\",\"url\":\"https://www.wrike.com/attachments/IEACW7SVIYEV4HBN/download/Lorem Ipsum.docx\",\"taskId\":\"IEACW7SVKQOKD5EG\",\"width\":100,\"height\":100}]}" } }, { @@ -356,13 +385,13 @@ "request": { "method": "GET", "url": { - "raw": "{{WrikeAPI}}/attachments/IEACW7SVIYEV4HBN", + "raw": "https://www.wrike.com/api/v4/attachments/{attachmentId}", "host": [ - "{{WrikeAPI}}" + "https://www.wrike.com/api/v4" ], "path": [ "attachments", - "IEACW7SVIYEV4HBN" + "{attachmentId}" ] } } @@ -372,13 +401,13 @@ "request": { "method": "GET", "url": { - "raw": "{{WrikeAPI}}/attachments/IEACW7SVIYJJEUHD/download/Lorem Ipsum.docx", + "raw": "https://www.wrike.com/api/v4/attachments/{attachmentId}/download/Lorem Ipsum.docx", "host": [ - "{{WrikeAPI}}" + "https://www.wrike.com/api/v4" ], "path": [ "attachments", - "IEACW7SVIYJJEUHD", + "{attachmentId}", "download", "Lorem Ipsum.docx" ] @@ -390,13 +419,13 @@ "request": { "method": "GET", "url": { - "raw": "{{WrikeAPI}}/attachments/IEACW7SVIYJJEUHD/url", + "raw": "https://www.wrike.com/api/v4/attachments/{attachmentId}/url", "host": [ - "{{WrikeAPI}}" + "https://www.wrike.com/api/v4" ], "path": [ "attachments", - "IEACW7SVIYJJEUHD", + "{attachmentId}", "url" ] } @@ -412,9 +441,9 @@ "request": { "method": "GET", "url": { - "raw": "{{WrikeAPI}}/spaces?withArchived=false&userIsMember=false", + "raw": "https://www.wrike.com/api/v4/spaces?withArchived=false&userIsMember=false", "host": [ - "{{WrikeAPI}}" + "https://www.wrike.com/api/v4" ], "path": [ "spaces" @@ -442,13 +471,13 @@ "request": { "method": "GET", "url": { - "raw": "{{WrikeAPI}}/spaces/IEACW7SVI4XDCUZX", + "raw": "https://www.wrike.com/api/v4/spaces/{spaceId}", "host": [ - "{{WrikeAPI}}" + "https://www.wrike.com/api/v4" ], "path": [ "spaces", - "IEACW7SVI4XDCUZX" + "{spaceId}" ], "query": [ { @@ -462,7 +491,7 @@ "name": "Get Space by ID", "originalRequest": { "method": "GET", - "url": "{{WrikeAPI}}/spaces/IEACW7SVI4XDCUZX?fields=[members]" + "url": "https://www.wrike.com/api/v4/spaces/{spaceId}?fields=[members]" }, "status": "OK", "code": 200, @@ -481,5 +510,17 @@ "type": "string" } ] + }, + "429_response": { + "status": 429, + "reason": "Too Many Requests", + "method": "GET", + "headers": { + "retry-after": "49" + }, + "body_json": { + "errorDescription": "Rate limit exceeded, try again later", + "error": "rate_limit_exceeded" + } } } \ No newline at end of file