#2007: PG data model and functions for Schema #2008

Open: wants to merge 53 commits into base: develop

Changes shown from 42 of 53 commits.

Commits:
d109b73 Init of Enceladus 3.0.0 develop branch (benedeki, Mar 13, 2021)
0d265e5 #1612 Separating Menas UI and API (#1620) (AdrianOlosutean, Apr 12, 2021)
1c7d8a9 MANIFEST.MFs LF commit (dk1844, Apr 14, 2021)
836bbec Merge branch 'develop' into develop-ver-3.0 (dk1844, Apr 14, 2021)
07d4888 [develop-ver3.0] mergefix (dk1844, Apr 14, 2021)
8a1c171 changes in Dockerfile - added new ARGs and in menas new ENV with defa… (dvagapov, Apr 19, 2021)
952e52a chnaged docker.properties.template and one more ENV for dockerfile (dvagapov, Apr 19, 2021)
2b0f371 spark master and password as before (AdrianOlosutean, Apr 19, 2021)
43a2740 Merge remote-tracking branch 'origin/develop' into feature/merging-de… (AdrianOlosutean, Apr 19, 2021)
90188dc integration (AdrianOlosutean, Apr 19, 2021)
5d3f4e0 Merge branch 'develop' into feature/merging-develop-ver.30 (dk1844, Apr 20, 2021)
5e90cbd SPLINE_URLTEMPLATE moved from docker.prop to Dockerfile (dvagapov, Apr 20, 2021)
42df938 Merge branch 'develop' into feature/merging-develop-ver.30 (dk1844, Apr 20, 2021)
da03de0 Merge pull request #1752 from AbsaOSS/feature/merging-develop-ver.30 (dk1844, Apr 20, 2021)
027f695 #1774 test fix, the main is unchanged. (#1775) (dk1844, Apr 26, 2021)
d3b57a0 Feature/601 swagger api docs (#1762) (AdrianOlosutean, May 5, 2021)
fa2bbd5 #417 SparkXML-related unit test added first (regression guard), Spark… (dk1844, May 14, 2021)
88fabeb #1769 Rename Menas to rest-api (#1781) (AdrianOlosutean, May 14, 2021)
70999f5 Feautre/1733 lineage dumper - 2nd edition (#1766) (dk1844, May 25, 2021)
c4dcd92 1770 Rename menas web to menas (#1786) (AdrianOlosutean, May 26, 2021)
03f24ba Merge branch 'develop' into develop-ver-3.0 (dk1844, Jun 1, 2021)
7f2d6e5 1732 Spline 0.6 integration (#1739) (AdrianOlosutean, Jun 2, 2021)
f6f0e52 Merge branch 'develop' into develop-ver-3.0 (dk1844, Jun 7, 2021)
1404bb3 [merging develop->develop-ver-3.0] buildfix (dk1844, Jun 7, 2021)
e52c9ff Merge branch 'develop' into develop-ver-3.0 (dk1844, Jun 7, 2021)
6eea153 [merging develop->develop-ver-3.0] mergefix (dk1844, Jun 7, 2021)
5fa0909 [merging develop->develop-ver-3.0] mergefix2 (authorizeRequests now r… (dk1844, Jun 8, 2021)
c461d22 Merge branch 'master' into merge/release-2.23.-0-into-develop-ver-3 (benedeki, Aug 22, 2021)
d87e7c3 Merge branch 'master' into merge/release-2.23.-0-into-develop-ver-3 (benedeki, Sep 18, 2021)
bd39018 Merge pull request #1890 from AbsaOSS/merge/release-2.23.-0-into-deve… (benedeki, Sep 20, 2021)
facb46a Merge/release 2.24 into develop ver 3 (#1928) (AdrianOlosutean, Oct 21, 2021)
10213a2 Merge/merging release 2.25 into develop ver 3 (#1963) (dk1844, Nov 4, 2021)
d9a72fc Merge/merging release 2.26.X into develop ver3 (#1997) (Adrian-Olosutean, Jan 10, 2022)
d3f920a #2007: PG data model and functions for Schema (benedeki, Jan 11, 2022)
c42098b * Added the missing databases script (benedeki, Jan 11, 2022)
72d737e * end of line at the at file end (benedeki, Jan 11, 2022)
e8fa5cf * Fixes (benedeki, Jan 12, 2022)
930e963 * Reducing sequence cache size (benedeki, Jan 12, 2022)
fefd5f3 * More renames from _deleted_ to _disabled_ (benedeki, Jan 12, 2022)
29c5e32 * fixed missing version in get_schema.sql (benedeki, Jan 12, 2022)
d1cb905 * Addressing PR comments (benedeki, Jan 13, 2022)
eae99fa * dataset_schema.heads -> dataset_schema.schemas (benedeki, Jan 14, 2022)
55696a0 #2022 spark-commons (#2023) (Adrian-Olosutean, Mar 9, 2022)
a10e6b5 #2027: Ensures TimezoneNormalizer is used in tests (#2030) (benedeki, Mar 15, 2022)
0634144 * status code per new conventions (benedeki, Apr 10, 2022)
7e3d599 #2050: Unifying project space with GitHub defaults (#2051) (benedeki, Apr 14, 2022)
03cf238 * hstore (benedeki, Apr 14, 2022)
c698b21 * refactored for inheritance (benedeki, Apr 19, 2022)
cb5293c Merge branch 'develop-ver-3.0' into feature/2007-pg-data-model-and-fu… (benedeki, Apr 19, 2022)
c368ddf * JSONB -> JSON (benedeki, Apr 19, 2022)
6f7bff6 Merge branch 'feature/2007-pg-data-model-and-functions-for-schema' of… (benedeki, Apr 19, 2022)
ad463f8 * entities and versions are connected via id not entity name (benedeki, Apr 20, 2022)
720915d #2034: PG data model and functions for Mapping tables (#2053) (benedeki, Apr 26, 2022)
2 changes: 1 addition & 1 deletion .editorconfig
@@ -20,7 +20,7 @@ charset = utf-8
 end_of_line = lf
 trim_trailing_whitespace = true

-[*.xml]
+[*.{xml,sql,ddl}]
 indent_size = 4
 indent_style = space
 insert_final_newline = true
2 changes: 1 addition & 1 deletion .github/CODEOWNERS
@@ -1 +1 @@
-* @lokm01 @benedeki @DzMakatun @HuvarVer @dk1844 @AdrianOlosutean
+* @benedeki @dk1844 @AdrianOlosutean
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/bug_report.md
@@ -15,7 +15,7 @@ Steps to reproduce the behavior OR commands run:
 3. Enter value '...'
 4. See error

-## Expected behaviour
+## Expected behavior
 A clear and concise description of what you expected to happen.

 ## Screenshots
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/feature_request.md
@@ -15,7 +15,7 @@ A description of the requested feature.
 A simple example if applicable.

 ## Proposed Solution [Optional]
-Solution Ideas
+Solution Ideas:
 1.
 2.
 3.
18 changes: 18 additions & 0 deletions .github/ISSUE_TEMPLATE/poc.md
@@ -0,0 +1,18 @@
+---
+name: POC
+about: Proof of Concept, usually a middle-sized effort to test some idea
+labels: 'poc, under discussion, priority: undecided'
+
+---
+
+## Background
+A clear and concise intro into the situation.
+
+## Goal
+The goal that the _Proof of Concept_ wants to test
+
+## Proposed Approach [Optional]
+Approach Ideas:
+1.
+2.
+3.
2 changes: 1 addition & 1 deletion .github/workflows/license_check.yml
@@ -28,4 +28,4 @@ jobs:
       - uses: actions/setup-java@v1
         with:
           java-version: 1.8
-      - run: mvn -Plicense-check apache-rat:check
+      - run: mvn --no-transfer-progress -Plicense-check apache-rat:check
4 changes: 2 additions & 2 deletions .github/workflows/pr_labels_check.yml
@@ -22,7 +22,7 @@ jobs:
     name: Test approved or docs
     runs-on: ubuntu-latest
     steps:
-      - uses: danielchabr/pr-labels-checker@master
+      - uses: danielchabr/pr-labels-checker@v3.0
        id: checkLabel
        with:
          hasSome: PR:tested,PR:no testing needed,docs
@@ -31,7 +31,7 @@ jobs:
     name: Merge not blocked
     runs-on: ubuntu-latest
     steps:
-      - uses: danielchabr/pr-labels-checker@master
+      - uses: danielchabr/pr-labels-checker@v3.0
        id: checkLabel
        with:
          hasNone: PR:reviewing,work in progress
2 changes: 0 additions & 2 deletions .gitignore
@@ -55,8 +55,6 @@ build.log
 # syntax: regexp
 # ^\.pc/

-build.log
-
 .cache*
 dependency-reduced-pom.xml
58 changes: 37 additions & 21 deletions README.md
@@ -13,17 +13,25 @@

# Enceladus

### <a name="latest_release"/>Latest Release
[![Maven Central](https://maven-badges.herokuapp.com/maven-central/za.co.absa.enceladus/parent/badge.png)](https://maven-badges.herokuapp.com/maven-central/za.co.absa.enceladus/parent/)

### <a name="build_status"/>Build Status
| master | develop |
| ------------- | ------------- |
| [![Build Status](https://opensource.bigusdatus.com/jenkins/buildStatus/icon?job=Absa-OSS-Projects%2Fenceladus%2Fmaster)](https://opensource.bigusdatus.com/jenkins/job/Absa-OSS-Projects/job/enceladus/job/master/) | [![Build Status](https://opensource.bigusdatus.com/jenkins/buildStatus/icon?job=Absa-OSS-Projects%2Fenceladus%2Fdevelop)](https://opensource.bigusdatus.com/jenkins/job/Absa-OSS-Projects/job/enceladus/job/develop/) |
### <a name="code_quality_status"/>Code Quality Status
[![Quality Gate Status](https://sonarcloud.io/api/project_badges/measure?project=AbsaOSS_enceladus&metric=alert_status)](https://sonarcloud.io/dashboard?id=AbsaOSS_enceladus)

### <a name="documentation"/>Documentation
[![Read the Docs](https://img.shields.io/badge/docs-latest-brightgreen.svg)](https://absaoss.github.io/enceladus/)
[![Read the Docs](https://img.shields.io/badge/docs-release%20notes-yellow.svg)](https://absaoss.github.io/enceladus/blog/)
[![Read the Docs](https://img.shields.io/badge/docs-release--1.x-red.svg)](https://absaoss.github.io/enceladus/docs/1.0.0/components)
___

<!-- toc -->
- [What is Enceladus?](#what-is-enceladus)
- [REST API](#rest-api)
- [Menas](#menas)
- [Standardization](#standardization)
- [Conformance](#conformance)
Expand All @@ -32,20 +40,26 @@ ___
- [Plugins](#plugins)
- [Built-in Plugins](#built-in-plugins)
- [How to contribute](#how-to-contribute)
- [Documentation](#documentation)
<!-- tocstop -->

## What is Enceladus?
**Enceladus** is a **Dynamic Conformance Engine** which allows data from different formats to be standardized to parquet and conformed to a group-accepted common reference (e.g. a country designation that is **DE** in one source system and **Deutschland** in another can be conformed to **Germany**).

-The project is comprised of three main components:
-### Menas
-This is the user-facing web client, used to **specify the standardization schema**, and **define the steps required to conform** a dataset.
-There are three models used to do this:
+The project comprises four main components:
+
+### REST API
+The REST API exposes the Enceladus endpoints for creating, reading, updating and deleting the models, as well as other functionality.
+The three main models are:
 - **Dataset**: Specifies where the dataset will be read from on HDFS (**RAW**), the conformance rules that will be applied to it, and where it will land on HDFS once it is conformed (**PUBLISH**)
 - **Schema**: Specifies the schema towards which the dataset will be standardized
 - **Mapping Table**: Specifies where tables with master reference data can be found (parquet on HDFS), which are used when applying Mapping conformance rules (e.g. the dataset uses **Germany**, which maps to the master reference **DE** in the mapping table)
+
+The REST API exposes a Swagger Documentation UI that documents all exposed HTTP endpoints; it can be found at **REST_API_HOST/swagger-ui.html**.
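For a quick programmatic check, the JSON spec behind the Swagger UI can be fetched directly. A minimal sketch, assuming the common springfox default path `/v2/api-docs` (the host is a placeholder; verify the path against your running instance):

```scala
// Hedged sketch: fetch the machine-readable spec that backs the Swagger UI.
// "/v2/api-docs" is the usual springfox default and is an assumption here.
val specUrl = "http://REST_API_HOST/v2/api-docs"
val swaggerSpec = scala.io.Source.fromURL(specUrl).mkString
println(swaggerSpec.take(200)) // peek at the beginning of the JSON spec
```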

+### Menas
+This is the user-facing web client, used to **specify the standardization schema** and **define the steps required to conform** a dataset.
+It is built on the REST API, which it calls to retrieve the entities it needs.

### Standardization
This is a Spark job which reads an input dataset in any of the supported formats and **produces a parquet dataset with the Menas-specified schema** as output.

Expand All @@ -69,43 +83,48 @@ Ensure the properties there fit your environment.
- Without tests: `mvn clean package -DskipTests `
- With unit tests: `mvn clean package`
- With integration tests: `mvn clean package -Pintegration`
- With component preload file generated: `mvn clean package -PgenerateComponentPreload`

#### Test coverage:
- Test coverage: `mvn scoverage:report`

The coverage reports are written in each module's `target` directory and aggregated in the root `target` directory.

## How to run
-#### Menas requirements:
+#### REST API requirements:
- [**Tomcat 8.5/9.0** installation](https://tomcat.apache.org/download-90.cgi)
- [**MongoDB 4.0** installation](https://docs.mongodb.com/manual/administration/install-community/)
- [**Spline UI deployment**](https://absaoss.github.io/spline/) - place the [spline.war](https://search.maven.org/remotecontent?filepath=za/co/absa/spline/spline-web/0.3.9/spline-web-0.3.9.war)
in your Tomcat webapps directory (rename after downloading to _spline.war_); NB! don't forget to set up the `spline.mongodb.url` configuration for the _war_
- **HADOOP_CONF_DIR** environment variable, pointing to the configuration directory of your Hadoop installation

-The _Spline UI_ can be omitted; in such case the **Menas** `spline.urlTemplate` setting should be set to empty string.
+The _Spline UI_ can be omitted; in that case the **REST API** `spline.urlTemplate` setting should be set to an empty string.

+#### Deploying REST API
+Simply copy the **rest-api.war** file produced when building the project into Tomcat's webapps directory.
+Alternatively, build the Docker image from the existing Dockerfile and deploy it as a container.
+
 #### Deploying Menas
-Simply copy the **menas.war** file produced when building the project into Tomcat's webapps directory.
+There are several ways of deploying Menas:
+- Tomcat deployment: copy the **menas.war** file produced when building the project into Tomcat's webapps directory. The **"apiUrl"** value in package.json should be set either before building, or after building by modifying the artifact in place
+- Docker deployment: build the Docker image from the existing Dockerfile and deploy it as a container. The **API_URL** environment variable should be provided when running the container
+- CDN deployment: copy the built contents of the **dist** directory to your preferred CDN server. The **"apiUrl"** value in package.json in the **dist** directory should be set

-#### Speed up initial loading time of menas
-- Build the project with the generateComponentPreload profile. Component preload will greatly reduce the number of HTTP requests required for the initial load of Menas
+#### Speed up initial loading time of REST API
 - Enable HTTP compression
-- Configure `spring.resources.cache.cachecontrol.max-age` in `application.properties` of Menas for caching of static resources
+- Configure `spring.resources.cache.cachecontrol.max-age` in `application.properties` of the REST API for caching of static resources

#### Standardization and Conformance requirements:
- [**Spark 2.4.4 (Scala 2.11)** installation](https://spark.apache.org/downloads.html)
- [**Hadoop 2.7** installation](https://hadoop.apache.org/releases.html)
-- **Menas** running instance
+- **REST API** running instance
- **Menas Credentials File** in your home directory or on HDFS (a configuration file for authenticating the Spark jobs with Menas)
- **Use with in-memory authentication**
e.g. `~/menas-credential.properties`:
```
username=user
password=changeme
```
-- **Menas Keytab File** in your home directory or on HDFS
+- **REST API Keytab File** in your home directory or on HDFS
- **Use with kerberos authentication**, see [link](https://kb.iu.edu/d/aumh) for details on creating keytab files
- **Directory structure** for the **RAW** dataset should follow the convention of `<path_to_dataset_in_menas>/<year>/<month>/<day>/v<dataset_version>`. This date is specified with the `--report-date` option when running the **Standardization** and **Conformance** jobs.
- **_INFO file** must be present along with the **RAW** data on HDFS as per the above directory structure. This is a file tracking control measures via [Atum](https://github.com/AbsaOSS/atum), an example can be found [here](examples/data/input/_INFO).
@@ -131,7 +150,7 @@ password=changeme
--row-tag <tag>
```
* Here `row-tag` is a specific option for `raw-format` of type `XML`. For more options for different types please see our WIKI.
-* In case Menas is configured for in-memory authentication (e.g. in dev environments), replace `--menas-auth-keytab` with `--menas-credentials-file`
+* In case the REST API is configured for in-memory authentication (e.g. in dev environments), replace `--menas-auth-keytab` with `--menas-credentials-file`

#### Running Conformance
```
@@ -175,7 +194,7 @@ password=changeme
--row-tag <tag>
```

-* In case Menas is configured for in-memory authentication (e.g. in dev environments), replace `--menas-auth-keytab` with `--menas-credentials-file`
+* In case the REST API is configured for in-memory authentication (e.g. in dev environments), replace `--menas-auth-keytab` with `--menas-credentials-file`

#### Helper scripts for running Standardization, Conformance or both together

@@ -272,8 +291,8 @@ The list of all options for running Standardization, Conformance and the combine…

 | Option | Description |
 |---------------------------------------|-------------|
-| --menas-auth-keytab **filename** | A keytab file used for Kerberized authentication to Menas. Cannot be used together with `--menas-credentials-file`. |
-| --menas-credentials-file **filename** | A credentials file containing a login and a password used to authenticate to Menas. Cannot be used together with `--menas-auth-keytab`. |
+| --menas-auth-keytab **filename** | A keytab file used for Kerberized authentication to the REST API. Cannot be used together with `--menas-credentials-file`. |
+| --menas-credentials-file **filename** | A credentials file containing a login and a password used to authenticate to the REST API. Cannot be used together with `--menas-auth-keytab`. |
| --dataset-name **name** | A dataset name to be standardized or conformed. |
| --dataset-version **version** | A version of a dataset to be standardized or conformed. |
| --report-date **YYYY-mm-dd** | A date specifying a day for which a raw data is landed. |
@@ -336,6 +355,3 @@ A module containing [examples](examples/README.md) of the project usage.

## How to contribute
Please see our [**Contribution Guidelines**](CONTRIBUTING.md).

-## Documentation
-Please see the [documentation pages](https://absaoss.github.io/enceladus/).
2 changes: 1 addition & 1 deletion dao/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>za.co.absa.enceladus</groupId>
     <artifactId>parent</artifactId>
-    <version>2.23.0</version>
+    <version>3.0.0-SNAPSHOT</version>
   </parent>

   <properties>
@@ -71,14 +71,14 @@ sealed abstract class AuthClient(username: String, restTemplate: RestTemplate, a…

   private def getAuthHeaders(response: ResponseEntity[String]): HttpHeaders = {
     val headers = response.getHeaders
-    val sessionCookie = headers.get("set-cookie").asScala.head
+    val jwt = headers.get("JWT").asScala.head
     val csrfToken = headers.get("X-CSRF-TOKEN").asScala.head

-    log.info(s"Session Cookie: $sessionCookie")
+    log.info(s"JWT: $jwt")
     log.info(s"CSRF Token: $csrfToken")

     val resultHeaders = new HttpHeaders()
-    resultHeaders.add("cookie", sessionCookie)
+    resultHeaders.add("JWT", jwt)
     resultHeaders.add("X-CSRF-TOKEN", csrfToken)
     resultHeaders
   }
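The change above replaces the session cookie with a `JWT` header while keeping the CSRF token. A minimal sketch of how a caller might replay these headers on a follow-up request; the `authenticate` accessor and the endpoint path are assumptions, not shown in this diff:

```scala
import org.springframework.http.{HttpEntity, HttpMethod}
import org.springframework.web.client.RestTemplate

// Assumption: authClient exposes the HttpHeaders built by getAuthHeaders above.
val authHeaders = authClient.authenticate() // hypothetical accessor
val request = new HttpEntity[String](authHeaders) // headers-only request entity

// Replay the JWT and X-CSRF-TOKEN headers on an authenticated call.
val response = new RestTemplate().exchange(
  s"$apiBaseUrl/api/dataset/detail/MyDataset/1", // illustrative endpoint
  HttpMethod.GET,
  request,
  classOf[String]
)
```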
@@ -18,43 +18,75 @@ package za.co.absa.enceladus.dao.rest

 import org.apache.commons.lang.exception.ExceptionUtils
 import org.slf4j.LoggerFactory
 import org.springframework.web.client.{ResourceAccessException, RestClientException}
+import za.co.absa.enceladus.dao.rest.CrossHostApiCaller.logger
 import za.co.absa.enceladus.dao.{DaoException, RetryableException}

+import scala.annotation.tailrec
 import scala.util.{Failure, Random, Try}

-protected object CrossHostApiCaller {
+object CrossHostApiCaller {

-  def apply(apiBaseUrls: List[String]): CrossHostApiCaller = {
-    new CrossHostApiCaller(apiBaseUrls, Random.nextInt(apiBaseUrls.size))
-  }
+  private val logger = LoggerFactory.getLogger(classOf[CrossHostApiCaller])
+
+  final val DefaultUrlsRetryCount: Int = 0
+
+  private def createInstance(apiBaseUrls: Seq[String], urlsRetryCount: Int, startWith: Option[Int]): CrossHostApiCaller = {
+    val maxTryCount: Int = (if (urlsRetryCount < 0) {
+      logger.warn(s"Urls retry count cannot be negative ($urlsRetryCount). Using default number of retries instead ($DefaultUrlsRetryCount).") //scalastyle:ignore maxLineLength
+      DefaultUrlsRetryCount
+    } else {
+      urlsRetryCount
+    }) + 1
+    val currentHostIndex = startWith.getOrElse(Random.nextInt(Math.max(apiBaseUrls.size, 1)))
+    new CrossHostApiCaller(apiBaseUrls.toVector, maxTryCount, currentHostIndex)
+  }
+
+  def apply(apiBaseUrls: Seq[String], urlsRetryCount: Int = DefaultUrlsRetryCount, startWith: Option[Int] = None): CrossHostApiCaller = {
+    createInstance(apiBaseUrls, urlsRetryCount, startWith)
+  }
 }

-protected class CrossHostApiCaller(apiBaseUrls: List[String], var currentHostIndex: Int) extends ApiCaller {
-  private val logger = LoggerFactory.getLogger(this.getClass)
+protected class CrossHostApiCaller private(apiBaseUrls: Vector[String], maxTryCount: Int, private var currentHostIndex: Int)
+  extends ApiCaller {
+
+  def baseUrlsCount: Int = apiBaseUrls.size
+
+  def currentBaseUrl: String = apiBaseUrls(currentHostIndex)
+
+  def nextBaseUrl(): String = {
+    currentHostIndex = (currentHostIndex + 1) % baseUrlsCount
+    currentBaseUrl
+  }

-  private val maxAttempts = apiBaseUrls.size - 1

   def call[T](fn: String => T): T = {
+    def logFailure(error: Throwable, url: String, attemptNumber: Int, nextUrl: Option[String]): Unit = {
+      val rootCause = ExceptionUtils.getRootCauseMessage(error)
+      val switching = nextUrl.map(s => s", switching host to $s").getOrElse("")
+      logger.warn(s"Request failed on host $url (attempt $attemptNumber of $maxTryCount)$switching - $rootCause")
+    }

-    def attempt(index: Int, attemptCount: Int = 0): Try[T] = {
-      currentHostIndex = index
-      val currentBaseUrl = apiBaseUrls(index)
-      Try {
-        fn(currentBaseUrl)
+    @tailrec
+    def attempt(url: String, attemptNumber: Int, urlsTried: Int): Try[T] = {
+      val result = Try {
+        fn(url)
       }.recoverWith {
         case e @ (_: ResourceAccessException | _: RestClientException) => Failure(DaoException("Server non-responsive", e))
-      }.recoverWith {
-        case e: RetryableException if attemptCount < maxAttempts =>
-          val nextIndex = (index + 1) % apiBaseUrls.size
-          val nextBaseUrl = apiBaseUrls(nextIndex)
-          val rootCause = ExceptionUtils.getRootCauseMessage(e)
-          logger.warn(s"Request failed on host $currentBaseUrl, switching host to $nextBaseUrl - $rootCause")
-          attempt(nextIndex, attemptCount + 1)
       }
+      // using match instead of recoverWith to keep the function @tailrec
+      result match {
+        case Failure(e: RetryableException) if attemptNumber < maxTryCount =>
+          logFailure(e, url, attemptNumber, None)
+          attempt(url, attemptNumber + 1, urlsTried)
+        case Failure(e: RetryableException) if urlsTried < baseUrlsCount =>
+          val nextUrl = nextBaseUrl()
+          logFailure(e, url, attemptNumber, Option(nextUrl))
+          attempt(nextUrl, 1, urlsTried + 1)
+        case _ => result
+      }
     }

-    attempt(currentHostIndex).get
+    attempt(currentBaseUrl, 1, 1).get
   }

 }
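In the new flow each URL gets up to `maxTryCount` attempts (`urlsRetryCount + 1`) before the caller rotates to the next host, and it gives up only once every URL has been tried. A usage sketch against the `apply` shown above; host names and the endpoint are placeholders:

```scala
// Two candidate hosts; urlsRetryCount = 2 gives each host up to 3 attempts.
// startWith = Some(0) pins the first host instead of a random pick.
val caller = CrossHostApiCaller(
  Seq("http://menas-a:8080/rest_api", "http://menas-b:8080/rest_api"),
  urlsRetryCount = 2,
  startWith = Some(0)
)

// fn receives the current base URL. In the real DAO, fn issues RestTemplate
// calls whose connection errors become a retryable DaoException; here a plain
// HTTP read stands in for illustration.
val schemaJson: String = caller.call { baseUrl =>
  scala.io.Source.fromURL(s"$baseUrl/api/schema/detail/MySchema/1").mkString
}
```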
@@ -29,7 +29,7 @@ object MenasConnectionStringParser {
         .replaceAll("/$", "")
         .replaceAll("/api$", "")
       )
-      .toSet
+      .distinct
       .toList
   }
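The switch from `.toSet` to `.distinct.toList` still deduplicates the parsed URLs but, unlike a `Set`, preserves their original order, which matters now that `CrossHostApiCaller` walks the URL list sequentially. A quick illustration:

```scala
val urls = List("http://host-b/api", "http://host-a/api", "http://host-b/api")

// distinct keeps first-occurrence order:
urls.distinct // List("http://host-b/api", "http://host-a/api")

// toSet also deduplicates, but iteration order is not guaranteed in general:
urls.toSet.toList // order is implementation-defined
```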
