forked from Sunbird-Obsrv/obsrv-api-service
Commit
This commit does not belong to any branch on this repository, and may belong to a fork outside of the repository.
Commit message:

* default config for dataset_config (#99)
* Sanketika-Obsrv/issue-tracker#14: feat: validate sql query (#100)
* Issue #304 datasource configuration changes to support Hudi schema. (#111)
* Sanketika-obsrv/issue-tracker#304: modified mandatory fields for datasource create request (#112)
* Configure the vuln scan for obsrv-api-service (#117)
* Revert "Sanketika-obsrv/issue-tracker#304: modified mandatory fields for data…" (#121) (reverts commit 0789f3f)
* Revert "Issue #304 datasource configuration changes to support Hudi schema. (…" (#122) (reverts commit ef918f6)
* Sanketika-Obsrv/issue-tracker#182: added event validation service against schema for particular dataset
* Sanketika-Obsrv/issue-tracker#182: added table name from config
* Sanketika-Obsrv/issue-tracker#182: added validation status in response handler
* Sanketika-Obsrv/issue-tracker#180 feat: API changes to support hudi changes in obsrv (#160)
* Sanketika-obsrv/issue-tracker#304: modified name for lakehouse spec
* #0000 feat: added type column
* #273 feat: API changes to support lakehouse queries
* Sanketika-obsrv/issue-tracker#273 feat: modified datasource apis to default type column and fixed sql query api
* Sanketika-obsrv/issue-tracker#273 feat: updated postman collection and swagger doc
* Sanketika-obsrv/issue-tracker#273 feat: updated testcases for latest changes
* Sanketika-obsrv/issue-tracker#273 feat: removed unnecessary log statements
* Sanketika-obsrv/issue-tracker#273 feat: removed unnecessary flatmapping and await
* Sanketika-obsrv/issue-tracker#273 feat: replaced special characters with underscore in table name
* Sanketika-Obsrv/issue-tracker#180 fix: removed unused constants and table checks in hudi schema
* Release 1.0.6-GA merge fix (#164)
* Release 1.0.4-GA (#105)
* Release 1.0.5-GA (#147)

Co-authored-by:
* harishkumar gangula <harish@sanketika.in>
* Shreyas Bhaktharam <121869503+shreyasb22@users.noreply.github.com>
* Manjunath Davanam <manjunath@sanketika.in>
* Ravi Mula <ravismula@users.noreply.github.com>
* GayathriSrividya <gayathrirajavarapu7@gmail.com>
* divyagovindaiah <110388603+divyagovindaiah@users.noreply.github.com>
* yashashk <yashashk@sanketika.in>
1 parent 9d5fbd8 · commit bfafddf · 26 changed files with 2,006 additions and 761 deletions
api-service/postman-collection/Obsrv API Service.postman_collection.json: 48 changes (43 additions, 5 deletions)
Large diffs are not rendered by default.
@@ -0,0 +1,51 @@ (new file)

import { Trino, BasicAuth } from 'trino-client';
import _ from 'lodash';
import { config } from '../configs/Config';

// Trino client built from the lakehouse connection details in Config
const trino: Trino = Trino.create({
    server: `${config.query_api.lakehouse.host}:${config.query_api.lakehouse.port}`,
    catalog: config.query_api.lakehouse.catalog,
    schema: config.query_api.lakehouse.schema,
    auth: new BasicAuth(config.query_api.lakehouse.default_user),
});

// Zip each result row with its column names, dropping Hudi bookkeeping
// columns (those prefixed with "_hoodie_") from the output objects.
const getFormattedData = (data: any[], columnData: any[]) => {
    const formattedData: any[] = [];
    for (let i = 0; i < data.length; i++) {
        const row = data[i];
        const jsonRow: any = {};
        for (let j = 0; j < row.length; j++) {
            const colName = columnData[j];
            // assign column only if it doesn't start with _hoodie_
            if (_.startsWith(colName, "_hoodie_")) {
                continue;
            }
            jsonRow[colName] = row[j];
        }
        formattedData.push(jsonRow);
    }
    return formattedData;
}

export const executeLakehouseQuery = async (query: string) => {
    const iter = await trino.query(query);
    let queryResult: any = [];
    for await (const data of iter) {
        if (!_.isEmpty(data.error)) {
            // Strip the leading "line x:y: " location prefix from Trino errors
            throw {
                status: 400,
                message: data.error.message.replace(/line.*: /, ''),
                code: "BAD_REQUEST"
            };
        }
        queryResult = [...queryResult, ...(data?.data?.length ? data.data : [])];
    }
    const columns = await iter.map((r: any) => r.columns ?? []).next();
    const finalColumns = columns.value.map((column: any) => column.name);
    const formattedData = getFormattedData(queryResult, finalColumns);
    return formattedData;
}
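The `_hoodie_` filtering logic can be exercised in isolation. The sketch below re-states the row-formatting step without the lodash dependency; the name `formatRows` and the sample rows are hypothetical, not part of the committed code:

```typescript
// Standalone sketch of the row-formatting step: zip each row with its
// column names, skipping Hudi bookkeeping columns (prefix "_hoodie_").
const formatRows = (rows: any[][], columns: string[]): Record<string, any>[] =>
    rows.map((row) => {
        const jsonRow: Record<string, any> = {};
        row.forEach((value, j) => {
            if (!columns[j].startsWith("_hoodie_")) {
                jsonRow[columns[j]] = value;
            }
        });
        return jsonRow;
    });

// Hypothetical result set: the _hoodie_commit_time column is dropped.
const out = formatRows(
    [["2024-01-01", "d1", 42]],
    ["_hoodie_commit_time", "dataset_id", "count"]
);
console.log(out); // [{ dataset_id: "d1", count: 42 }]
```

Filtering by column index rather than rebuilding the SELECT list keeps the behavior uniform for `SELECT *` queries against Hudi tables.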
@@ -0,0 +1,12 @@ (new file)

import Ajv from "ajv";
const validator = new Ajv({ strict: false });

// Validate a request payload against a JSON schema; returns the validity
// flag plus the first validation error message (or "success").
export const schemaValidation = (payload: Record<string, any>, schema: Record<string, any>): Record<string, any> => {
    const isValid = validator.validate(schema, payload);
    if (!isValid) {
        const error: any = validator.errors;
        // Fall back to a generic message when Ajv supplies no schemaPath/message
        const errorMessage = (error?.[0]?.schemaPath && error?.[0]?.message)
            ? `${error[0].schemaPath.replace("/", "")} ${error[0].message}`
            : "Invalid Request Body";
        return { isValid, message: errorMessage };
    }
    return { isValid, message: "success" };
}
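The message construction can be sketched standalone. `toErrorMessage` below is a hypothetical helper mirroring how the first Ajv error object (its `schemaPath` and `message` fields) becomes the response message, with the generic fallback applied when either field is missing; the sample error objects are illustrative:

```typescript
// Hypothetical helper mirroring the error-message construction above.
// Ajv error objects carry schemaPath (e.g. "#/properties/query") and message.
const toErrorMessage = (errors: Array<{ schemaPath?: string; message?: string }>): string => {
    const first = errors?.[0];
    // replace("/", "") removes only the first slash of the schema path,
    // matching the service's formatting; fall back when details are absent.
    return first?.schemaPath && first?.message
        ? `${first.schemaPath.replace("/", "")} ${first.message}`
        : "Invalid Request Body";
};

console.log(toErrorMessage([{ schemaPath: "#/properties/query", message: "must be string" }]));
// "#properties/query must be string"
console.log(toErrorMessage([{}]));
// "Invalid Request Body"
```

Note that parenthesizing the fallback matters: a bare `a + " " + b || c` concatenates first, so the `||` branch would never fire once the string is non-empty.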
@@ -17,5 +17,6 @@
   "metadata": {
     "aggregated": false,
     "granularity": "day"
-  }
+  },
+  "type": "druid"
 }