RC-1.2.0 (#25)

* V2 apis (#240)

* #OBS-I115: Dataset list API refactoring

* #0000: adding command api

* #OBS-I116: Dataset CRUD APIs test and fixes

* #OBS-I115: cmd api remove addn modules

* #OBS-I116: Dataset status transition to retire check for denorm fields

* #OBS-I116: Dataset CRUD APIs test cases and fixes

* #OBS-I115: Dataset Transition API refactoring

* #OBS-I116: Dataset update API Dedupe and denorm test cases fixes

* #OBS-I115: Dataset Transition API refactoring and error handling refactoring

* #OBS-I115: Remove unnecessary field fields_set

* #OBS-I115: Dataset publish API - update the index of hudi spec properly for publish to handle schema evolution

* #OBS-I116: Dataset create and status transition api code fix

* #OBS-I116: Dataset Create api test case fixes

* #OBS-I116: Dataset update extraction config api test case fixes

* #OBS-I116: Dataset druid ingestion spec generation fix

* #OBS-I116: express version upgraded

* #OBS-I116: Dataset create api fixes

* #OBS-I116: Dataset ingestion spec generation fix

* #OBS-I126: updated swagger documentation

* #OBS-I126: updated postman collection

* #OBS-I116: fix: entry topic column in datasets model

* #OBS-I58 feat: Minio cloud store support

- Added endpoint as an optional config to support MinIO

* #OBS-I126: added dataset read, list, update api's documentation and updated collection

* #OBS-I116: feat: Dataschema api implementation v2

* #OBS-I116: feat: Schema validation fix

* #OBS-I126: swagger doc updated

* #OBS-I116: Dataset update api test cases

* #OBS-I116: fix: linting fixes

* #OBS-I116: lint fixes

* #OBS-I116: Dataset status transition test cases

* #OBS-I126 : added multiple requests example

* #OBS-I126 : updated order

* #OBS-I126 : updated server url

* #OBS-I21: feat: dataset publish changes for connectors

* #OBS-I1 updated Dataset Health API code

* #OBS-I116: Dataset CRUD api fixes

* #OBS-I116: fix: error codes fix

* #OBS-I1 Refactored as per new changes

* #OBS-I101: Update the publish API for the v2 APIs

* #OBS-I101: fix db models

* #OBS-I115: Remove the v1 unused API code and restructure the folders

* #OBS-I1 Added DatasethealthService

* #OBS-I1 Updated the imports and folders

* #OBS-I2 updated dataset reset

* #OBS-I116: fix: Command api and schema fixes

* #OBS-I116: fix: Command api fix in db query

* #OBS-I1 Added Notifications and alerts APIs

* #OBS-I138: added decrypted response for the read api for connectors_config field and added defaults updated date and created date to list api

* #OBS-I116: fix: feat: Dataset copy and export api implementation

* #OBS-I2 Refactoring as per v2 APIs

* #OBS-I2 typo fix

* #OBS-I116: fix: fix: Dataset copy check for dataset fix

* #OBS-I21: dataset publish changes fixes

* #OBS-I116: fix: feat: Feedback fixes of removing set redis db

* #OBS-I138: added cors to app

* #OBS-I116: fix: feat: Dataset import api implementation

* #OBS-I116: fix: fix: Dataset service fix

* #OBS-I116: fix: repeated Validation method removal

* #OBS-I116: fix: unused code

* connector list

* #OBS-I142: connector list api

* #OBS-I116: fix: schema validation check for v1 exported dataset.

* #OBS-I116: fix: Dataset overwrite after creation failure

* #OBS-I116: fix: error handling

* #OBS-I116: fix: error messages fix

* #OBS-I116: fix: code fixes

* #OBS-I142: formatted connector list api files

* #OBS-I142: formatted connector list file

* #OBS-I138: required changes for dataset for master dataset migration from v1 to v2

* #OBS-I138: removed cors package

* #OBS-I138: updated package json file

* #OBS-I138: indentation fix

* #OBS-I138: fixed indentations

* #OBS-I142: updated postman collection

* #OBS-I142: added connector list swagger documentation

* #OBS-I116: feat: Dataset Import and export api integration fixes

* #OBS-I138: removed comment

* #OBS-I138: throwing error if dataset is undefined

* #OBS-I142: added test cases for connector list

* #OBS-I138: merging dataset defaults to dataset draft record before saving

* #OBS-I138: adding merged event to dataset while migrating live or draft dataset

* #OBS-I142: added live_date field to defaultFields

* #OBS-I145: Connector Read API

* #OBS-I145: updated postman collection with connector read api

* #OBS-I145: updated swagger documentation

* #OBS-I145: updated the connector read api

* merge changes

* Resolved merge changes

* #OBS-I138: removed datakey before merging defaults to dataset

* #OBS-I145: updated postman collection and swagger documentation

* #OBS-I145: added the test cases for connector read api

* #OBS-I145: added a test case for connector read api

* #OBS-I116: feat: Dataset import api fixes

* #OBS-I116: feat: Test cases and linting fixes

* #OBS-I116: feat: Dataset status transition test cases fix

* #OBS-I1 updated the routes

* #OBS-I116: feat: Dataset migration method fix

* #OBS-I108: feat: helm modifications for flink connectors

* #OBS-I108: helm chart fixes

* #OBS-I108: feat: Modify volume mounts

* #OBS-I108: feat: Add PVC for JobManager

* #OBS-I108: feat: change args for jobmanager command

* #OBS-I141: added a new metric to sum the response time

* #OBS-I141: modified the url variable and access dataset_id from params

* #OBS-I141: added helper function to get dataset_id for error cases

* #OBS-I108: feat: Use sidecar container to submit connector flink job

* #OBS-I141: added telemetry for v2 APIs

* #OBS-I143: feat: dataset publish changes to deploy flink connectors

* #OBS-I141: removed metric for sum of response time

* #OBS-I141: removed usage of builtin kafka methods from telemetry file

* #OBS-I146: feat: Retire fix

* Issue #SBCOSS-12 fix: convert all SQL raw queries to prepared statements

* Issue #SBCOSS-12 fix: tags is an array, so requires empty json for null case; dataset draft deletion requires deletion of transformation and source config drafts

* #SBCOSS-23: feat:  dataset publish changes for redeployment

* #OBS-I167 : read api changes while reading connectors according to v2 structure

* #OBS-I173: fix: Ready to publish schema fix to expect connector configs as object and string

* #OBS-I174: fix: Dataset read api fix to expect both v1 and v2 connectors

* #OBS-I146: fix: Test case fix for read api

* #OBS-I146: fix: Test case fix for read api changes

* #OBS-I146: fix: status transition test cases

* #OBS-I146: fix: Test case script fix

* #OBS-I146: fix: Type error fix

* #OBS-I146: fix: Dataset read api test cases fixes

* #OBS-I146: fix: Hudi spec generation test cases

* #OBS-I146: fix: Test case and linting fix

* merge commit

* #OBS-I173: fix: Dataset web console required fixes

* #OBS-I173: fix: Dataset update changes to accept type changes

* #OBS-I167 : dataset read api changes to read live dataset source configs

* #OBS-I146: fix: linting fix

* #OBS-I167 : Added string or dict as type to connector_config

* #OBS-I143: dataset publish changes fixes

* #OBS-I167 : if dataset is empty return with error

* #OBS-I143: insert query fix

* Issue #OBS-I144 fix: icon data as string; check default version

* #OBS-I143: fix: dataset publish fixes

* #OBS-I167 : fix: removed duplicate code.

* #OBS-I181 - Updated the event structure

* #OBS-I164: added JWT token verification and access control to APIs

* #OBS-I164: modified the access roles and permissions

* #OBS-I164: reading public key from env file

* #OBS-I186 : fix: dataset metrics api

* #OBS-I185 : fix: removed duplicate code.

* #OBS-I164: modified public key variable in config

* flink connector helm chart updates

* fix: dataset publish fixes

* #OBS-I186 : added dataset metric api controller and route and minor change in dataset transition api

* install pip requirements if applicable

* #OBS-I186 : removed export statement

* #OBS-I164: added config for option rbac verification

* #OBS-I164: changed the middleware to rbac_middleware

* #OBS-I186 : Logic moved to separate function

* #OBS-I164: changed import name

* #OBS-I185 : fix: test case fixes

* #OBS-I185 : fix: linting fix

* #OBS-I164: modified config and rbac middleware

* #OBS-I164: added jsonwebtoken package

* master url fix

---------

Co-authored-by: Santhosh Vasabhaktula <santhosh@sanketika.in>
Co-authored-by: JeraldJF <jeraldj@sanketika.in>
Co-authored-by: yashashk <yashashk@sanketika.in>
Co-authored-by: harishkumar gangula <harish@sanketika.in>
Co-authored-by: SurabhiAngadi <surabhia@sanketika.in>
Co-authored-by: Rakshitha-D <rakshithagowda.d@gmail.com>
Co-authored-by: Anand Parthasarathy <anandp504@gmail.com>
Co-authored-by: Aniket Sakinala <aniket@sanketika.in>

* Resolved test case issue

* Merge conflicts changes

* #OBS-I165: added userInfo from token to request object

* #OBS-I165: updated telemetry to use user role

* #OBS-I165: updated datasetCreate api to add userRole as created_by

* #OBS-I165: added userRole when migrating and create table from live

* #OBS-I165: added userRole for data copy api

* #OBS-I165: added userRole for dataset Update

* #OBS-I165: added userRole for dataset Import

* #OBS-I165: added userRole for dataset status transition

* #OBS-I165: modified Dataset Service to update the userRoles

* #OBS-I165: added permission for queryTemplateUpdate api

* #OBS-I165: added userRole for query template create and update

* #OBS-I165: added userRole for alerts api

* #OBS-I165: added userRole for notifications api

* #OBS-I165: added userRole for silences api

* #OBS-I86 : commented routes have regex

* #OBS-I77 : query api changes to check datasource availability v2

* #OBS-I77 : fix: test cases

* #OBS-I165: corrections userRole access in notification

* #OBS-I165: added operations_admin role and updated permissions

* #OBS-I165: added rbac middleware for alert routers

* #OBS-I165: modified the user permissions into a json object

* #OBS-I165: removed unused code

* #OBS-I203: Remove the archived and purged status transition from the API

* #OBS-I165: added user permissions json

* #OBS-I165: removed user roles for type

* #OBS-I165: added userID to req object and importing permissions from JSON file

* #OBS-I165: handled rbac disabled scenario and updated userID instead of userRole

* #OBS-I165: changed the userRole to userID

* #OBS-I165: added createdby for dataset publish api

* #OBS-I165: added error condition and modified status code

* #OBS-I165: removed redundant code

* #OBS-I222 : Resolved regexp route issue in express version 5

* #OBS-I77 : query api changes to check availability of datasource

* #OBS-I77 : fix: spell fix

* #OBS-I77 : granularity spell changes

* #OBS-I77 : fix: test scenarios added

* #OBS-I131 : fix: stop deleting draft records if submission tasks fail

* #OBS-I131 : fix: upsert if republished

* #OBS-I131 : fix: test case fixes

* #OBS-I165: added default userID, modified error codes, added errorhandler function

* #OBS-I165: added updatedby by default when an entry created

* #OBS-I165: removed redundant code

* #OBS-I165: modified errorhandler

* #OBS-I165: modified error message

* #OBS-I227 : fix: threshold of dataset level alerts inside an array.

* #OBS-I179 : feat: retire dataset related alerts and metrics on dataset retire

* #OBS-I79: feat: test case fixes

* #OBS-I79: feat: Alerts get fix

* feat: command tag name change (#252)

* feat: command tag name change (#252) (#254)

* #0000 fix: ingestion spec cache fix

* Ingestion spec fix (#255)

---------

Co-authored-by: Ravi Mula <ravismula@users.noreply.github.com>

* #OBS-I230: fix: kafka command

* #OBS-I230: fix: cache issues changes

* #OBS-I230: fix: redis db number change on update

* #OBS-I187: added model for user

* #OBS-I187: added userService

* #OBS-I187: modified to verify keycloak token

* #OBS-I187: updated rbac middleware

* #OBS-I187: add separate function to check access

* #OBS-I230: fix: integration fixes

* #OBS-I230: fix: kafka command fixes (#258)

* #OBS-I230: fix: kafka command fixes

* #OBS-I247: fix: feedback changes.

---------

Co-authored-by: Ravi Mula <ravismula@users.noreply.github.com>

* #OBS-I218 update kafka connector image tags

* #OBS-I288: fix: removed data_format field from connector_instance (#260)

* #OBS-I285: query api fix to validate and set limit to sql queries

* #I285: removed empty objects and arrays from sample data (#261)

* #I285: removed empty objects and arrays from sample data

* Fix code scanning alert no. 98: Loop bound injection

Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>

* #I285: lint issues fixed

---------

Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>

* #OBS-I285: query api to parse sql query first then do regex check with postman

* #OBS-I285: fix: linting issue fix

* #OBS-I285: fix: lint fix

* #OBS-I285: fix: Logging errors in the middleware (#263)

* #OBS-I285: fix: Logging errors in the middleware

* #OBS-I285: fix: Logging errors by excluding sensitive info

* #OBS-I285: fix: linting issue fix

* #OBS-I289 : added route (#265)

* fix: Stop the connectors specific to the dataset. (#266)

* fix: uninstall dataset specific spark jobs

* fix: remove nested loops

* fix: uninstall dataset specific spark jobs

* fix: remove nested loops

* #OBS-I289 fix: Fix flink connector deployments

* #OBS-I289 fix: Fix Spark Connector deployments

* #OBS-I307 - revert the code as this is fixed from the front end (#268)

* #OBS-I330 : changed type from hudi to datalake

* fix for object-store-connector cron (#269)

* #OBS-I330 : fixed merging issue

* #OBS-I330 : removed comment

* #OBS-I330 : Using dataset v2 export api while publishing

* #OBS-I330 : removed print statement

* #OBS-I330 : Using dataset v2 export api while publishing (#271)

* #OBS-I330 : Replacing - with _ for datasource_ref and adding partition key and primary key and timestamp key in column spec of ingestion spec

* Hudi fixes (#272)

* fix: update connector instance id when inserting to db

* #OBS-I330 : removed merging defaults to keys_config

* #OBS-I330 : omit merging defaults to draft dataset for keys_config

* #OBS-I335: hudi spec fix

* #OBS-I335: linting fix

* #OBS-I334 - clear transformations on re-upload

* #OBS-I334 - clear keys_config on re-upload of schema file

* #OBS-I335: dataset update fix

* #OBS-I335: loop bound issue fix

* #OBS-I335: dataset updated as per feedbacks

* append base64 prefix upon connector register

* #OBS-I334 - Fix the schema update functionality

* Data mapping fix (#278)

* #OBS-I335: hudi spec fix (#279)

---------

Co-authored-by: JeraldJF <jeraldj@sanketika.in>
Co-authored-by: Ravi Mula <ravismula@users.noreply.github.com>
Co-authored-by: Rakshitha-D <rakshithagowda.d@gmail.com>
Co-authored-by: SurabhiAngadi <surabhia@sanketika.in>
Co-authored-by: Aniket Sakinala <aniket@sanketika.in>
Co-authored-by: yashashk <yashashk@sanketika.in>
Co-authored-by: Santhosh Vasabhaktula <santhosh@sanketika.in>
Co-authored-by: Anand Parthasarathy <anandp504@gmail.com>
Co-authored-by: Jerald <127138957+JeraldJF@users.noreply.github.com>
Co-authored-by: Rakshitha-D <115482806+Rakshitha-D@users.noreply.github.com>
Co-authored-by: SurabhiAngadi <138881390+SurabhiAngadi@users.noreply.github.com>
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
Co-authored-by: yashash <126703764+yashashkumar@users.noreply.github.com>
14 people authored Nov 13, 2024
1 parent b9174ea commit 6591f06
Showing 79 changed files with 5,042 additions and 3,092 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/build_and_deploy.yaml
@@ -43,7 +43,7 @@ jobs:
context: ./command-service
platforms: linux/amd64
push: true
- tags: ${{ vars.DOCKERHUB_USERNAME }}/flink-command-service:${{ github.ref_name }}
+ tags: ${{ vars.DOCKERHUB_USERNAME }}/obsrv-command-service:${{ github.ref_name }}

aws-deploy:
needs: [build-api-service-image, build-command-service-image]
4 changes: 4 additions & 0 deletions api-service/package.json
@@ -28,11 +28,13 @@
"aws-sdk": "^2.1348.0",
"axios": "^1.6.0",
"body-parser": "^1.20.2",
"busboy": "^1.6.0",
"compression": "^1.7.4",
"dateformat": "2.0.0",
"express": "^5.0.0-beta.3",
"http-errors": "^2.0.0",
"http-status": "^1.5.3",
"jsonwebtoken": "^9.0.1",
"kafka-node": "^5.0.0",
"kafkajs": "^2.2.4",
"kafkajs-snappy": "^1.1.0",
@@ -59,12 +61,14 @@
"@babel/traverse": "7.23.2"
},
"devDependencies": {
"@types/busboy": "^1.5.4",
"@types/chai": "^4.3.3",
"@types/chai-as-promised": "^7.1.5",
"@types/chai-spies": "^1.0.3",
"@types/compression": "^1.7.2",
"@types/express": "^4.17.14",
"@types/http-errors": "^2.0.1",
"@types/jsonwebtoken": "^9.0.6",
"@types/kafkajs": "^1.9.0",
"@types/knex": "^0.16.1",
"@types/lodash": "^4.14.190",
1,882 changes: 0 additions & 1,882 deletions api-service/postman-collection/Obsrv API Service.postman_collection_v2.json

This file was deleted.


2 changes: 1 addition & 1 deletion api-service/src/app.ts
@@ -22,7 +22,7 @@ app.use("/v2/", v2Router);
app.use("/", druidProxyRouter);
app.use("/alerts/v1", alertsRouter);
app.use("/", metricRouter);
app.use("*", ResponseHandler.routeNotFound);
app.use(/(.*)/, ResponseHandler.routeNotFound);
app.use(obsrvErrorHandler);

app.listen(config.api_port, () => {
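
The change above from app.use("*", ...) to app.use(/(.*)/, ...) pairs with the "#OBS-I222 : Resolved regexp route issue in express version 5" commit: Express 5 upgrades path-to-regexp, which (as that commit suggests) no longer accepts a bare "*" path string. A minimal sketch of the safe form, assuming the express@^5.0.0-beta.3 pin in package.json resolves to an Express 5 router with that behaviour:

import express, { Request, Response } from "express";

const app = express();

// Express 4 style; under an Express 5 router this fails to parse:
// app.use("*", routeNotFound);

// Express 5 form: match any path with a regular expression instead.
app.use(/(.*)/, (req: Request, res: Response) => {
    res.status(404).json({ error: "ROUTE_NOT_FOUND", path: req.originalUrl });
});

app.listen(3000);
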
4 changes: 3 additions & 1 deletion api-service/src/configs/Config.ts
@@ -112,5 +112,7 @@ export const config = {
"dialect": process.env.dialet || "postgres",
"url": process.env.grafana_url || "http://localhost:8000",
"access_token": process.env.grafana_token || ""
- }
+ },
+ "user_token_public_key": process.env.user_token_public_key || "",
+ "is_RBAC_enabled": process.env.is_rbac_enabled || "false",
}
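
The two keys above back the RBAC work threaded through the commit list (#OBS-I164/#OBS-I165/#OBS-I187: the jsonwebtoken dependency, the rbac_middleware, the userID set on the request). A minimal sketch of such a middleware, assuming the user id rides in the token's "sub" claim. Only user_token_public_key and is_RBAC_enabled are confirmed by this diff; the claim name, import paths, and "SYSTEM" fallback are assumptions:

import { NextFunction, Request, Response } from "express";
import jwt from "jsonwebtoken";
import { config } from "../configs/Config";

export const rbacMiddleware = (req: Request, res: Response, next: NextFunction) => {
    // RBAC disabled: fall back to a default user id ("handled rbac disabled scenario").
    if (config.is_RBAC_enabled !== "true") {
        (req as any).userID = "SYSTEM";
        return next();
    }
    const token = req.headers.authorization?.replace("Bearer ", "");
    if (!token) {
        return res.status(401).json({ error: "No token provided" });
    }
    try {
        // Verify against the env-backed public key added in this commit.
        const decoded: any = jwt.verify(token, config.user_token_public_key);
        (req as any).userID = decoded?.sub; // claim name is an assumption
        return next();
    } catch (err) {
        return res.status(403).json({ error: "Token verification failed" });
    }
};
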
29 changes: 29 additions & 0 deletions api-service/src/configs/IngestionConfig.ts
@@ -54,6 +54,13 @@ export const rawIngestionSpecDefaults = {
"type": "text",
"expr": "$.obsrv_meta.syncts"
},
"hudiSynctsField": {
"name": "obsrv_meta_syncts",
"arrival_format": "text",
"data_type": "date",
"type": "text",
"expr": "$.obsrv_meta.syncts"
},
"dimensions": [
{
"type": "string",
@@ -64,6 +71,16 @@
"name": "obsrv.meta.source.id"
}
],
"hudi_dimensions": [
{
"type": "string",
"name": "obsrv_meta_source_connector"
},
{
"type": "string",
"name": "obsrv_meta_source_id"
}
],
"flattenSpec": [
{
"type": "path",
@@ -75,5 +92,17 @@
"expr": "$.obsrv_meta.source.connectorInstance",
"name": "obsrv.meta.source.id"
}
],
"hudi_flattenSpec": [
{
"type": "path",
"expr": "$.obsrv_meta.source.connector",
"name": "obsrv_meta_source_connector"
},
{
"type": "path",
"expr": "$.obsrv_meta.source.connectorInstance",
"name": "obsrv_meta_source_id"
}
]
}
4 changes: 4 additions & 0 deletions api-service/src/connections/commandServiceConnection.ts
@@ -18,4 +18,8 @@ export const executeCommand = async (id: string, command: string) => {
}
}
return commandHttpService.post(commandPath, payload)
}

+ export const registerConnector = async (requestBody: any) => {
+     return commandHttpService.post("/connector/v1/register", requestBody)
+ }
16 changes: 13 additions & 3 deletions api-service/src/controllers/Alerts/Alerts.ts
@@ -13,6 +13,9 @@ const telemetryObject = { type: "alert", ver: "1.0.0" };
const createAlertHandler = async (req: Request, res: Response, next: NextFunction) => {
try {
const alertPayload = getAlertPayload(req.body);
+ const userID = (req as any)?.userID;
+ _.set(alertPayload, "created_by", userID);
+ _.set(alertPayload, "updated_by", userID);
const response = await Alert.create(alertPayload);
updateTelemetryAuditEvent({ request: req, object: { id: response?.dataValues?.id, ...telemetryObject } });
ResponseHandler.successResponse(req, res, { status: httpStatus.OK, data: { id: response.dataValues.id } });
@@ -28,8 +31,11 @@ const createAlertHandler = async (req: Request, res: Response, next: NextFunctio
const publishAlertHandler = async (req: Request, res: Response, next: NextFunction) => {
try {
const { alertId } = req.params;
- const rulePayload: Record<string, any> | null = await getAlertRule(alertId);
- if (!rulePayload) return next({ message: httpStatus[httpStatus.NOT_FOUND], statusCode: httpStatus.NOT_FOUND });
+ const ruleModel: Record<string, any> | null = await getAlertRule(alertId);
+ if (!ruleModel) return next({ message: httpStatus[httpStatus.NOT_FOUND], statusCode: httpStatus.NOT_FOUND });
+ const rulePayload = ruleModel.toJSON();
+ const userID = (req as any)?.userID;
+ _.set(rulePayload, "updated_by", userID);
if (rulePayload.status == "live") {
await deleteAlertRule(rulePayload, false);
}
@@ -87,6 +93,8 @@ const deleteAlertHandler = async (req: Request, res: Response, next: NextFunctio
return next({ message: httpStatus[httpStatus.NOT_FOUND], statusCode: httpStatus.NOT_FOUND });
}
const rulePayload = ruleModel.toJSON();
const userID = (req as any)?.userID || "SYSTEM";
_.set(rulePayload, "updated_by", userID);
await deleteAlertRule(rulePayload, hardDelete === "true");
updateTelemetryAuditEvent({ request: req, currentRecord: rulePayload, object: { id: alertId, ...telemetryObject } });
ResponseHandler.successResponse(req, res, { status: httpStatus.OK, data: { id: alertId } });
@@ -103,12 +111,14 @@ const updateAlertHandler = async (req: Request, res: Response, next: NextFunctio
const ruleModel = await getAlertRule(alertId);
if (!ruleModel) { return next({ message: httpStatus[httpStatus.NOT_FOUND], statusCode: httpStatus.NOT_FOUND }) }
const rulePayload = ruleModel.toJSON();
+ const userID = (req as any)?.userID;
if (rulePayload.status == "live") {
_.set(rulePayload, "updated_by", userID);
await deleteAlertRule(rulePayload, false);
await retireAlertSilence(alertId);
}
const updatedPayload = getAlertPayload({ ...req.body, manager: rulePayload?.manager });
await Alert.update({ ...updatedPayload, status: "draft" }, { where: { id: alertId } });
await Alert.update({ ...updatedPayload, status: "draft", updated_by: userID }, { where: { id: alertId } });
updateTelemetryAuditEvent({ request: req, currentRecord: rulePayload, object: { id: alertId, ...telemetryObject } });
ResponseHandler.successResponse(req, res, { status: httpStatus.OK, data: { id: alertId } });
} catch (error: any) {
7 changes: 6 additions & 1 deletion api-service/src/controllers/Alerts/Silence.ts
@@ -20,12 +20,15 @@ const createHandler = async (request: Request, response: Response, next: NextFun

const start_date = new Date(startDate);
const end_date = new Date(endDate);
+ const userID = (request as any)?.userID;
const silenceBody = {
id: grafanaResponse.silenceId,
manager: grafanaResponse.manager,
alert_id: alertId,
start_time: start_date,
end_time: end_date,
+ created_by : userID,
+ updated_by : userID,
}
const sileneResponse = await Silence.create(silenceBody);
updateTelemetryAuditEvent({ request, object: { id: sileneResponse?.dataValues?.id, ...telemetryObject } });
@@ -78,10 +81,12 @@ const updateHandler = async (request: Request, response: Response, next: NextFun
await updateSilence(silenceObject, payload);
const updatedStartTime = new Date(payload.startTime);
const updatedEndTime = new Date(payload.endTime);
+ const userID = (request as any)?.userID;
const updatedSilence = {
...silenceObject,
start_time: updatedStartTime,
- end_time: updatedEndTime
+ end_time: updatedEndTime,
+ updated_by: userID,
}
const silenceResponse = await Silence.update(updatedSilence, { where: { id } })
ResponseHandler.successResponse(request, response, { status: httpStatus.OK, data: { silenceResponse } })
@@ -0,0 +1,121 @@
import { Request, Response } from "express";
import { ResponseHandler } from "../../helpers/ResponseHandler";
import _ from "lodash";
import logger from "../../logger";
import axios from "axios";
import httpStatus from "http-status";
import busboy from "busboy";
import { PassThrough } from "stream";
import { registerConnector } from "../../connections/commandServiceConnection";
import { generatePreSignedUrl } from "../GenerateSignedURL/helper";

export const apiId = "api.connector.register";
export const code = "FAILED_TO_REGISTER_CONNECTOR";

let resmsgid: string | any;

const connectorRegisterController = async (req: Request, res: Response) => {
resmsgid = _.get(res, "resmsgid");
try {
const uploadStreamResponse: any = await uploadStream(req);
const payload = {
relative_path: uploadStreamResponse[0]
}
logger.info({ apiId, resmsgid, message: `File uploaded to cloud provider successfully` })
const registryResponse = await registerConnector(payload);
logger.info({ apiId, resmsgid, message: `Connector registered successfully` })
ResponseHandler.successResponse(req, res, { status: httpStatus.OK, data: { message: registryResponse?.data?.message } })
} catch (error: any) {
const errMessage = _.get(error, "response.data.error.message")
logger.error(error, apiId, resmsgid, code);
let errorMessage = error;
const statusCode = _.get(error, "statusCode")
if (!statusCode || statusCode == 500) {
errorMessage = { code, message: errMessage || "Failed to register connector" }
}
ResponseHandler.errorResponse(errorMessage, req, res);
}
};

const uploadStream = async (req: Request) => {
return new Promise((resolve, reject) => {
const filePromises: Promise<void>[] = [];
const busboyClient = busboy({ headers: req.headers });
const relative_path: any[] = [];
let fileCount = 0;

busboyClient.on("file", async (name: any, file: any, info: any) => {
if (fileCount > 0) {
// If more than one file is detected, reject the request
busboyClient.emit("error", reject({
code: "FAILED_TO_UPLOAD",
message: "Uploading multiple files are not allowed",
statusCode: 400,
errCode: "BAD_REQUEST"
}));
return
}
fileCount++;
const processFile = async () => {
const fileName = info?.filename;
try {
const preSignedUrl: any = await generatePreSignedUrl("write", [fileName], "connector")
const filePath = preSignedUrl[0]?.filePath
const fileNameExtracted = extractFileNameFromPath(filePath);
relative_path.push(...fileNameExtracted);
const pass = new PassThrough();
file.pipe(pass);
const fileBuffer = await streamToBuffer(pass);
await axios.put(preSignedUrl[0]?.preSignedUrl, fileBuffer, {
headers: {
"Content-Type": info.mimeType,
"Content-Length": fileBuffer.length,
}
});
}
catch (err) {
logger.error({ apiId, err, resmsgid, message: "Failed to generate sample urls", code: "FILES_GENERATE_URL_FAILURE" })
reject({
code: "FILES_GENERATE_URL_FAILURE",
message: "Failed to generate sample urls",
statusCode: 500,
errCode: "INTERNAL_SERVER_ERROR"
})
}
};
filePromises.push(processFile());
});
busboyClient.on("close", async () => {
try {
await Promise.all(filePromises);
resolve(relative_path);
} catch (error) {
logger.error({ apiId, error, resmsgid, message: "Fail to upload a file", code: "FAILED_TO_UPLOAD" })
reject({
code: "FAILED_TO_UPLOAD",
message: "Fail to upload a file",
statusCode: 400,
errCode: "BAD_REQUEST"
});
}
});
busboyClient.on("error", reject);
req.pipe(busboyClient);
})
}

const streamToBuffer = (stream: PassThrough): Promise<Buffer> => {
return new Promise<Buffer>((resolve, reject) => {
const chunks: Buffer[] = [];
stream.on("data", (chunk) => chunks.push(chunk));
stream.on("end", () => resolve(Buffer.concat(chunks)));
stream.on("error", reject);
});
};

const extractFileNameFromPath = (filePath: string): string[] => {
const regex = /(?<=\/)[^/]+\.[^/]+(?=\/|$)/g;
return filePath.match(regex) || [];
};

export default connectorRegisterController;
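
For context, a client call against this controller might look as follows. The host, port, and public route are assumptions (the route only lands later, under "#OBS-I289 : added route"), and form-data is the usual Node companion for multipart uploads. Note the single-file limit the busboy handler enforces above:

import axios from "axios";
import FormData from "form-data";
import fs from "fs";

const uploadConnectorBundle = async (bundlePath: string) => {
    const form = new FormData();
    // Exactly one file: the controller rejects requests carrying more.
    form.append("file", fs.createReadStream(bundlePath));
    const response = await axios.post("http://localhost:3000/connector/v1/register", form, {
        headers: form.getHeaders(),
    });
    console.log(response.data);
};

uploadConnectorBundle("./sample-connector-0.1.0.tar.gz").catch(console.error);
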
@@ -42,6 +42,9 @@ export const createQueryTemplate = async (req: Request, res: Response) => {
}

const data = transformRequest(requestBody, templateName);
+ const userID = (req as any)?.userID;
+ _.set(data, "created_by", userID);
+ _.set(data, "updated_by", userID);
await QueryTemplate.create(data)
logger.info({ apiId, msgid, resmsgid, requestBody: req?.body, message: `Query template created successfully` })
return ResponseHandler.successResponse(req, res, { status: 200, data: { template_id: templateId, template_name: templateName, message: `The query template has been saved successfully` } });
@@ -23,41 +23,31 @@ const errorObject = {
}
}
const apiId = "api.data.in";
const errorCode = "DATASET_UPDATE_FAILURE"

const dataIn = async (req: Request, res: Response) => {
try {
const requestBody = req.body;

const requestBody = req.body;
const datasetId = req.params.datasetId.trim();

const isValidSchema = schemaValidation(requestBody, validationSchema)
if (!isValidSchema?.isValid) {
logger.error({ apiId, message: isValidSchema?.message, code: "DATA_INGESTION_INVALID_INPUT" })
return ResponseHandler.errorResponse({ message: isValidSchema?.message, statusCode: 400, errCode: "BAD_REQUEST", code: "DATA_INGESTION_INVALID_INPUT" }, req, res);
}
const dataset = await datasetService.getDataset(datasetId, ["id"], true)
const dataset = await datasetService.getDataset(datasetId, ["id", "entry_topic", "api_version", "dataset_config"], true)
if (!dataset) {
logger.error({ apiId, message: `Dataset with id ${datasetId} not found in live table`, code: "DATASET_NOT_FOUND" })
return ResponseHandler.errorResponse(errorObject.datasetNotFound, req, res);
}
const entryTopic = _.get(dataset, "dataValues.dataset_config.entry_topic")
const { entry_topic, dataset_config, api_version } = dataset
const entryTopic = api_version !== "v2" ? _.get(dataset_config, "entry_topic") : entry_topic
if (!entryTopic) {
logger.error({ apiId, message: "Entry topic not found", code: "TOPIC_NOT_FOUND" })
return ResponseHandler.errorResponse(errorObject.topicNotFound, req, res);
}
await send(addMetadataToEvents(datasetId, requestBody), _.get(dataset, "dataValues.dataset_config.entry_topic"))
await send(addMetadataToEvents(datasetId, requestBody), entryTopic)
ResponseHandler.successResponse(req, res, { status: 200, data: { message: "Data ingested successfully" } });
}
catch (err: any) {
const code = _.get(err, "code") || errorCode
logger.error({ ...err, apiId, code })
let errorMessage = err;
const statusCode = _.get(err, "statusCode")
if (!statusCode || statusCode == 500) {
errorMessage = { code: "DATA_INGESTION_FAILED", message: "Failed to ingest data" }
}
ResponseHandler.errorResponse(errorMessage, req, res);
}

}

const addMetadataToEvents = (datasetId: string, payload: any) => {
Expand Down