Merge/merging release 2.25 into develop ver 3 #1963
Conversation
…1873)
* 1868 statistics with missing counts and datasets missing properties
* 1843 Home page with properties, side panel with missing counts, and a summary page for properties with a tab containing the datasets missing that particular property
* #1603 serde tests for CR and MT DataFrameFilters (mongo-bson-based serde tests for CR and MT DataFrameFilters, mongo-bson-based serde tests extended for CR with a blank mappingTableFilter)
Project config and management updates
* poc issue template
* CODEOWNERS update
* developers update
* Badges to README.md
1881 HyperConformance enceladus_info_version from payload
#1887 defaultTimestampTimeZone can be source type specific
* `DefaultsByFormat` extends the `Defaults` trait and is able to read defaults from configuration files
* `DefaultsByFormat` offers further granularity by first checking the format-specific setting and only then taking the global one
* Basic `GlobalDefaults` are no longer configuration dependent
* Standardization now uses `DefaultsByFormat` for its defaults, where rawFormat is used as the format parameter
* Configuration paths switched to `enceladus.defaultTimestampTimeZone.default` and `enceladus.defaultTimestampTimeZone.[rawFormat]` respectively
* `defaultTimestampTimeZone` is still supported/read as an obsolete fallback

Co-authored-by: Daniel K <dk1844@gmail.com>
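The configuration layout described in that commit might look like the following `reference.conf` sketch. The key names come from the commit message itself; the values and the `json` format key are illustrative assumptions only:

```hocon
enceladus {
  defaultTimestampTimeZone {
    # global fallback, consulted when no format-specific key exists
    default = "UTC"
    # hypothetical format-specific override for rawFormat = json, checked first
    json = "Africa/Johannesburg"
  }
}
```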
* Upgrade of Atum to 3.6.0
* Writing the default time zones for timestamps and dates into the _INFO file
* #1927 testing setup: set both Spline _LINEAGE and Atum _INFO to HDFS file permissions 733; the result on EMR HDFS was 711 (due to the 022 umask there) -> evidence of it working
* #1927 cleanup of the test settings of 733 FS permissions
* #1927 Atum final version 3.7.0 used instead of the snapshot (same code)
* #1927 comment change
* #1927 default 644 FS permissions for both _INFO and _LINEAGE files
* 1937 limit output file size
* `ADDITIONAL_JVM_EXECUTOR_CONF`
* Kerberos configuration
* Trust store configuration
* kinit execution option
* `--min-processing-block-size` & `--max-processing-block-size`
* logo improvement
* `--max-processing-block-size` -> `--max-processing-partition-size`
* `ADDITIONAL_JVM_EXECUTOR_CONF`
* Kerberos configuration
* Trust store configuration
* kinit execution option
* `--min-processing-partition-size` & `--max-processing-partition-size`
* improved logo
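As a hedged illustration of the renamed partition-size options, an invocation of the run script might look like this (the script path appears in this PR's conflict list; the byte values are made up, and other required arguments are omitted):

```bash
./scripts/bash/run_enceladus.sh \
  --min-processing-partition-size 67108864 \
  --max-processing-partition-size 134217728
```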
* `menas.rest.retryCount` - configuration of how many times a URL should be retried if it fails with a retry-able error
* `menas.rest.availability.setup` - configuration of how the URL list should be handled
* _Standardization_, _Conformance_ and _HyperConformance_ changed to provide the retry count and availability setup to the DAO, read from configuration
* `ConfigReader` enhanced and unified to read configurations more easily and universally
* Mockito upgraded to 1.16.42

Co-authored-by: Daniel K <dk1844@gmail.com>
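A minimal sketch of these settings, assuming HOCON configuration (the key names come from the commit message; the values, including the availability-setup strategy name, are hypothetical placeholders):

```hocon
# how many times a URL is retried when it fails with a retry-able error
menas.rest.retryCount = 3
# how the URL list is handled (strategy name below is a made-up placeholder)
menas.rest.availability.setup = "roundrobin"
```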
* #1863 mapping CR & MT filter successfully reuses the same fragment (both using the same named model) - TODO: reuse validation, reuse manipulation methods
* #1863 FilterEdit.js allows reusing filterEdit TreeTable logic between mCR and MT editing
* #1863 mCT editing validation enabled (commons from FilterEdit.js)
* #1863 mCT datatype hinting enabled (commons from DataTypeUtils.js)
* #1863 mCR/MT edit dialog default width=950px, some cleanup
* #1863 bugfixes: directly creating MT with filter (fix on accepting the field), UI fix for MT filter model initialization
* #1863 npm audit fix
* #1863 bugfix: adding a new mCR (when no edit mCR dialog has been opened yet) did not work - fixed
* #1863 selecting mapping column from MT schema works (for all schema levels) for edit; TODO = schema type support
* #1863 mCR - schema-based columns suggested for filter, value types filled in silently during submit, too
* #1863 bugfix: empty MT - schema may be empty
* #1863 bugfix: removing a filter left a null node - cleanup was needed (otherwise the view would fail); logging cleanup
* #1863 select list item now shows valueType as additionalText, cleanup
* #1863 nonEmptyAndNonNullFilled - map->filter bug fixed
* #1863 typo for null filter

Co-authored-by: David Benedeki <14905969+benedeki@users.noreply.github.com>
Conflicts:
- dao/pom.xml
- data-model/pom.xml
- examples/pom.xml
- menas/pom.xml
- menas/src/main/scala/za/co/absa/enceladus/menas/controllers/LandingPageController.scala
- menas/ui/css/style.css
- menas/ui/index.html
- menas/ui/npm-shrinkwrap.json
- menas/ui/package.json
- menas/ui/service/RestDAO.js
- migrations-cli/pom.xml
- migrations/pom.xml
- plugins-api/pom.xml
- plugins-builtin/pom.xml
- pom.xml
- rest-api/src/main/scala/za/co/absa/enceladus/rest_api/controllers/StatisticsController.scala
- rest-api/src/main/scala/za/co/absa/enceladus/rest_api/services/DatasetService.scala
- rest-api/src/main/scala/za/co/absa/enceladus/rest_api/services/PropertyDefinitionService.scala
- rest-api/src/main/scala/za/co/absa/enceladus/rest_api/services/StatisticsService.scala
- rest-api/src/main/scala/za/co/absa/enceladus/rest_api/utils/implicits/package.scala
- rest-api/src/test/scala/za/co/absa/enceladus/rest_api/integration/controllers/DatasetApiIntegrationSuite.scala
- rest-api/src/test/scala/za/co/absa/enceladus/rest_api/integration/repositories/DatasetRepositoryIntegrationSuite.scala
- rest-api/src/test/scala/za/co/absa/enceladus/rest_api/integration/repositories/StatisticsIntegrationSuite.scala
- scripts/bash/run_enceladus.sh
- spark-jobs/pom.xml
- spark-jobs/src/main/resources/reference.conf
- spark-jobs/src/main/resources/spline.properties.template
- spark-jobs/src/main/scala/za/co/absa/enceladus/conformance/HyperConformance.scala
- spark-jobs/src/main/scala/za/co/absa/enceladus/conformance/HyperConformanceAttributes.scala
- spark-jobs/src/main/scala/za/co/absa/enceladus/standardization/StandardizationJob.scala
- spark-jobs/src/main/scala/za/co/absa/enceladus/standardization/interpreter/StandardizationInterpreter.scala
- spark-jobs/src/main/scala/za/co/absa/enceladus/standardization_conformance/StandardizationAndConformanceJob.scala
- utils/pom.xml
- utils/src/main/scala/za/co/absa/enceladus/utils/types/DefaultsByFormat.scala
Just some small things
Kudos, SonarCloud Quality Gate passed! 0 Bugs. No Coverage information.
Tested locally, looks good.
Test run with `-Dspline.mode=DISABLED`, but since there haven't been any Spline updates, it shouldn't be a problem.
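The reviewer's test run disabled Spline via a JVM system property; one way such a property might be passed (a sketch, not the exact command used, and the jar name is a placeholder) is through `spark-submit` driver options:

```bash
spark-submit \
  --driver-java-options "-Dspline.mode=DISABLED" \
  --class za.co.absa.enceladus.standardization.StandardizationJob \
  spark-jobs.jar
```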