feat(ingest): grafana connector #10891

Merged: 15 commits merged into master on Jul 15, 2024

Conversation

@anshbansal anshbansal (Collaborator) commented Jul 11, 2024

Barebones Grafana connector that ingests dashboards. It lets teams document Grafana dashboards in DataHub, which helps on-call engineers discover them. Charts would be great to have as well, but that will take longer, so this PR ingests dashboards only. This makes DataHub usable for on-call DevOps members to document and discover dashboards.
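
As a quick illustration, here is a minimal recipe sketch for the new source. The config keys ("url", "service_account_token") are assumed from the PR summary rather than copied from the connector, and the file sink is only for demonstration:

    from datahub.ingestion.run.pipeline import Pipeline

    # Hedged sketch: config keys follow the PR description, not verbatim source code.
    pipeline = Pipeline.create(
        {
            "source": {
                "type": "grafana",
                "config": {
                    "url": "http://localhost:3000",
                    "service_account_token": "<grafana-token>",
                },
            },
            "sink": {"type": "file", "config": {"filename": "grafana_mcps.json"}},
        }
    )
    pipeline.run()
    pipeline.raise_from_status()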

Checklist

  • The PR conforms to DataHub's Contributing Guideline (particularly Commit Message Format)
  • Links to related issues (if applicable)
  • Tests for the changes have been added/updated (if applicable)
  • Docs related to the changes have been added/updated (if applicable). If a new feature has been added, a usage guide has been added for it.
  • For any breaking change/potential downtime/deprecation/big changes an entry has been made in Updating DataHub

Summary by CodeRabbit

  • New Features

    • Added Grafana integration for ingesting dashboards, including configuration for Grafana URL and service account token.
    • Introduced default Grafana dashboard configuration and provisioning setup via YAML files (API keys, dashboards, datasources, service accounts).
    • Added Docker Compose setup for Grafana with PostgreSQL.
  • Tests

    • Implemented test suite for Grafana integration, including tests for service account creation, API key handling, and metadata ingestion verification.

coderabbitai bot (Contributor) commented Jul 11, 2024

Walkthrough

This update introduces a new experimental Grafana source for the metadata ingestion project. It includes the configuration, classes, and methods needed to ingest Grafana dashboards and interact with the Grafana API. The changes also include comprehensive testing and provisioning setups, along with improvements to pipeline handling and context management.
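
To make the shape of the new source concrete, here is a rough sketch of the configuration class as described in this review. Field names are inferred from the summary; the real class also wires into DataHub's stateful-ingestion machinery:

    from typing import Optional
    from pydantic import BaseModel, Field, SecretStr

    # Illustrative stand-in, not the PR's exact class.
    class GrafanaSourceConfig(BaseModel):
        url: str = Field(description="Grafana base URL, e.g. http://localhost:3000")
        service_account_token: SecretStr = Field(
            description="Service account token for the Grafana API"
        )
        platform_instance: Optional[str] = Field(
            default=None, description="Optional platform instance qualifier"
        )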

Changes

File | Change Summary
metadata-ingestion/setup.py Added new Grafana dependency and source.
.../grafana/grafana_source.py Introduced GrafanaSource class and related methods for dashboard ingestion.
.../grafana/default-dashboard.json Added default Grafana dashboard configuration.
.../grafana/grafana_mcps_golden.json Added JSON payload representing an "UPSERT" operation on a Grafana dashboard entity.
.../grafana/provisioning/api-keys/api_keys.yaml Added configuration for API keys.
.../grafana/provisioning/dashboards/dashboard.yaml Added configuration for dashboard provisioning.
.../grafana/provisioning/datasources/datasource.yaml Added configuration for PostgreSQL datasource.
.../grafana/provisioning/service_accounts/service_accounts.yaml Added YAML configuration for provisioning service accounts.
.../grafana/test_grafana.py Introduced tests for Grafana integration and metadata ingestion.
.../pipeline.py Improved context management, exception handling, and method ordering within the pipeline run method.
.../grafana/docker-compose.yml Added Docker Compose configuration for Grafana with PostgreSQL database.

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant GrafanaSource
    participant GrafanaAPI
    participant Pipeline

    User ->> GrafanaSource: create(config, context)
    GrafanaSource ->> GrafanaAPI: Fetch Dashboards
    GrafanaAPI -->> GrafanaSource: Dashboards Data
    GrafanaSource ->> Pipeline: Ingest Dashboards
    Pipeline ->> User: Ingestion Complete

Poem

In the land of dashboards bright,
Grafana's source takes flight! 🌟
With tokens, keys, and JSON clear,
Data flows without a fear.
Pipelines hum in harmony,
A rabbit's joy in code we see! 🐰



@github-actions github-actions bot added the ingestion PR or Issue related to the ingestion of metadata label Jul 11, 2024
@coderabbitai coderabbitai bot (Contributor) left a comment

Actionable comments posted: 2

Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 623b6f9 and 9ad2ed3.

Files selected for processing (2)
  • metadata-ingestion/setup.py (2 hunks)
  • metadata-ingestion/src/datahub/ingestion/source/grafana/grafana_source.py (1 hunks)
Additional comments not posted (9)
metadata-ingestion/src/datahub/ingestion/source/grafana/grafana_source.py (7)

1-32: Imports look good.

The import statements are appropriate and required for the functionality provided in the file.


34-41: Configuration class looks good.

The GrafanaSourceConfig class is correctly defined with appropriate fields for the URL and service account token.


44-45: Report class looks good.

The GrafanaReport class correctly inherits from StaleEntityRemovalSourceReport and does not require additional fields or methods.


57-62: Constructor looks good.

The constructor correctly initializes the GrafanaSource class attributes.


70-76: Work unit processors method looks good.

The get_workunit_processors method correctly returns a list of work unit processors, including the stale entity removal handler.


78-79: Report method looks good.

The get_report method correctly returns the report attribute.


52-55: Docstring looks good.

The docstring accurately reflects the experimental status and limitations of the GrafanaSource class.

metadata-ingestion/setup.py (2)

348-348: Dependency addition looks good.

The requests dependency is correctly added for the Grafana plugin.


638-638: Entry point addition looks good.

The entry point for the Grafana source plugin is correctly added.
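
For reference, DataHub ingestion plugins are registered in setup.py roughly like this. This is an illustrative excerpt following the repo's existing conventions, not the PR's exact lines:

    # metadata-ingestion/setup.py (sketch)
    plugins = {
        # ...
        "grafana": {"requests"},  # dependency set for the new plugin
    }

    entry_points = {
        "datahub.ingestion.source.plugins": [
            # ...
            "grafana = datahub.ingestion.source.grafana.grafana_source:GrafanaSource",
        ],
    }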

@anshbansal anshbansal changed the title feat(ingest): grafana connector which ingests dashboards feat(ingest): grafana connector Jul 11, 2024
@coderabbitai coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 9ad2ed3 and 7e889ae.

Files selected for processing (1)
  • metadata-ingestion/src/datahub/ingestion/source/grafana/grafana_source.py (1 hunks)
Files skipped from review as they are similar to previous changes (1)
  • metadata-ingestion/src/datahub/ingestion/source/grafana/grafana_source.py

@coderabbitai coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 7e889ae and 2355bbf.

Files selected for processing (1)
  • metadata-ingestion/src/datahub/ingestion/source/grafana/grafana_source.py (1 hunks)
Files skipped from review as they are similar to previous changes (1)
  • metadata-ingestion/src/datahub/ingestion/source/grafana/grafana_source.py

@coderabbitai coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 2355bbf and f5b30e5.

Files selected for processing (2)
  • metadata-ingestion/setup.py (2 hunks)
  • metadata-ingestion/src/datahub/ingestion/source/grafana/grafana_source.py (1 hunks)
Files skipped from review as they are similar to previous changes (2)
  • metadata-ingestion/setup.py
  • metadata-ingestion/src/datahub/ingestion/source/grafana/grafana_source.py

@coderabbitai coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between f5b30e5 and 3312b10.

Files selected for processing (1)
  • metadata-ingestion/src/datahub/ingestion/source/grafana/grafana_source.py (1 hunks)
Files skipped from review as they are similar to previous changes (1)
  • metadata-ingestion/src/datahub/ingestion/source/grafana/grafana_source.py

@hsheth2 hsheth2 (Collaborator) left a comment

Small nits, but otherwise looks good

)
response.raise_for_status()
except requests.exceptions.RequestException as e:
self.report.report_failure(f"Failed to fetch dashboards: {str(e)}")

Suggested change
self.report.report_failure(f"Failed to fetch dashboards: {str(e)}")
self.report.failure("Failed to fetch dashboards", exc=e)

try:
response = requests.get(
f"{self.source_config.url}/api/search", headers=headers
)

general best practice is to create a requests.Session and then use that everywhere - should be ok here though since it only makes one request
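
A minimal sketch of that pattern (class and method names are illustrative, not the connector's actual code):

    import requests

    class GrafanaAPIClient:
        def __init__(self, base_url: str, token: str) -> None:
            self.base_url = base_url
            # One Session gives connection pooling and shared auth headers
            # across all Grafana API calls.
            self.session = requests.Session()
            self.session.headers.update({"Authorization": f"Bearer {token}"})

        def search_dashboards(self) -> list:
            response = self.session.get(f"{self.base_url}/api/search")
            response.raise_for_status()
            return response.json()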

name=_uid,
platform_instance=self.source_config.platform_instance,
)
yield from [

use the auto_workunit helper here
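
A hedged sketch of what that refactor could look like (_dashboard_mcps is a hypothetical helper standing in for the MCP-building code above):

    from typing import Iterable

    from datahub.ingestion.api.source_helpers import auto_workunit
    from datahub.ingestion.api.workunit import MetadataWorkUnit

    def get_workunits_internal(self) -> Iterable[MetadataWorkUnit]:
        # auto_workunit wraps each MetadataChangeProposalWrapper in a
        # MetadataWorkUnit, replacing the manual `yield from [...]` list.
        yield from auto_workunit(self._dashboard_mcps())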

return
res_json = response.json()
for item in res_json:
_uid = item["uid"]

the underscore prefix isn't necessary here

@coderabbitai coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 3312b10 and 2e05fb0.

Files selected for processing (7)
  • metadata-ingestion/src/datahub/ingestion/source/grafana/grafana_source.py (1 hunks)
  • metadata-ingestion/tests/integration/grafana/default-dashboard.json (1 hunks)
  • metadata-ingestion/tests/integration/grafana/grafana_mcps_golden.json (1 hunks)
  • metadata-ingestion/tests/integration/grafana/provisioning/api-keys/api_keys.yaml (1 hunks)
  • metadata-ingestion/tests/integration/grafana/provisioning/dashboards/dashboard.yaml (1 hunks)
  • metadata-ingestion/tests/integration/grafana/provisioning/datasources/datasource.yaml (1 hunks)
  • metadata-ingestion/tests/integration/grafana/provisioning/service_accounts/service_accounts.yaml (1 hunks)
Files skipped from review due to trivial changes (4)
  • metadata-ingestion/tests/integration/grafana/default-dashboard.json
  • metadata-ingestion/tests/integration/grafana/grafana_mcps_golden.json
  • metadata-ingestion/tests/integration/grafana/provisioning/dashboards/dashboard.yaml
  • metadata-ingestion/tests/integration/grafana/provisioning/datasources/datasource.yaml
Files skipped from review as they are similar to previous changes (1)
  • metadata-ingestion/src/datahub/ingestion/source/grafana/grafana_source.py
Additional comments not posted (2)
metadata-ingestion/tests/integration/grafana/provisioning/api-keys/api_keys.yaml (1)

1-3: Ensure secure handling of API keys.

The API key is defined correctly, but ensure that the actual key values are not hardcoded or exposed in the codebase. Consider using environment variables or a secrets management service.

metadata-ingestion/tests/integration/grafana/provisioning/service_accounts/service_accounts.yaml (1)

1-6: Ensure secure handling of service account credentials.

The service account and associated API key are defined correctly, but ensure that the actual key values are not hardcoded or exposed in the codebase. Consider using environment variables or a secrets management service.
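
A minimal sketch of the suggested practice, assuming a hypothetical GRAFANA_SERVICE_ACCOUNT_TOKEN environment variable:

    import os

    # Fail fast if the variable is missing instead of falling back to a
    # hardcoded key in the YAML.
    service_account_token = os.environ["GRAFANA_SERVICE_ACCOUNT_TOKEN"]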

@coderabbitai coderabbitai bot (Contributor) left a comment

Actionable comments posted: 5

Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 2e05fb0 and 532b655.

Files selected for processing (1)
  • metadata-ingestion/tests/integration/grafana/test_grafana.py (1 hunks)
Additional comments not posted (5)
metadata-ingestion/tests/integration/grafana/test_grafana.py (5)

21-28: Ensure sensitive information is handled securely.

The Authorization header encodes the admin user and password in base64. Ensure that sensitive information is not logged or exposed.

Do you have measures in place to avoid logging sensitive headers or payloads?
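
For reference, the Basic-auth header construction being discussed looks roughly like this (the credentials are the test defaults, not production secrets):

    import base64

    admin_user, admin_password = "admin", "admin"
    token = base64.b64encode(f"{admin_user}:{admin_password}".encode()).decode()
    headers = {"Authorization": f"Basic {token}"}  # avoid logging this header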


30-43: Handle None return value in calling code.

The create_service_account method logs an error and returns None on failure. Ensure that calling code handles the None return value appropriately.

Do the calling functions handle the None return value correctly?


45-58: Handle None return value in calling code.

The create_api_key method logs an error and returns None on failure. Ensure that calling code handles the None return value appropriately.

Do the calling functions handle the None return value correctly?


101-103: LGTM!

The fixture correctly returns the path to the test resources directory.


137-151: LGTM!

The fixture correctly sets up and tears down the Grafana instance.

f.write(test_api_key)
logger.info(f"API key is {test_api_key}")

breakpoint()

Remove breakpoint in production code.

Remove the breakpoint() statement before merging the code.

-    breakpoint()

Comment on lines +128 to +134
if api_key:
print("Service Account API Key:", api_key)
return api_key
else:
print("Failed to create API key for the service account")
else:
print("Failed to create service account")

Handle None return value from create_api_key.

Ensure that the fixture properly handles the case when create_api_key returns None.

-        if api_key:
+        if api_key is not None:

Comment on lines +154 to +166
@freeze_time(FROZEN_TIME)
def test_grafana_dashboard(loaded_grafana, pytestconfig, tmp_path, test_resources_dir):
# Wait for Grafana to be up and running
url = "http://localhost:3000/api/health"
for i in range(30):
logging.info("waiting for Grafana to start...")
time.sleep(5)
resp = requests.get(url)
if resp.status_code == 200:
logging.info(f"Grafana started after waiting {i*5} seconds")
break
else:
pytest.fail("Grafana did not start in time")

Avoid busy-waiting and use exponential backoff.

The loop for waiting for Grafana to start uses a fixed sleep interval. Consider using exponential backoff to optimize the waiting time.

-    for i in range(30):
-        logging.info("waiting for Grafana to start...")
-        time.sleep(5)
+    for i in range(10):
+        wait_time = min(60, 2 ** i)
+        logging.info(f"waiting for Grafana to start, sleeping for {wait_time} seconds...")
+        time.sleep(wait_time)

Comment on lines +185 to +199
@freeze_time(FROZEN_TIME)
def test_grafana_ingest(
loaded_grafana, pytestconfig, tmp_path, test_resources_dir, test_api_key
):
# Wait for Grafana to be up and running
url = "http://localhost:3000/api/health"
for i in range(30):
logging.info("waiting for Grafana to start...")
time.sleep(5)
resp = requests.get(url)
if resp.status_code == 200:
logging.info(f"Grafana started after waiting {i*5} seconds")
break
else:
pytest.fail("Grafana did not start in time")

Avoid busy-waiting and use exponential backoff.

The loop for waiting for Grafana to start uses a fixed sleep interval. Consider using exponential backoff to optimize the waiting time.

-    for i in range(30):
-        logging.info("waiting for Grafana to start...")
-        time.sleep(5)
+    for i in range(10):
+        wait_time = min(60, 2 ** i)
+        logging.info(f"waiting for Grafana to start, sleeping for {wait_time} seconds...")
+        time.sleep(wait_time)

Comment on lines +107 to +134
def test_api_key():
# Example usage:
url = "http://localhost:3000"
admin_user = "admin"
admin_password = "admin"

grafana_client = GrafanaClient(url, admin_user, admin_password)

# Step 1: Create the service account
service_account = grafana_client.create_service_account(
name="example-service-account", role="Viewer"
)
if service_account:
print(f"Service Account Created: {service_account}")

# Step 2: Create the API key for the service account
api_key = grafana_client.create_api_key(
service_account_id=service_account["id"],
key_name="example-api-key",
role="Admin",
)
if api_key:
print("Service Account API Key:", api_key)
return api_key
else:
print("Failed to create API key for the service account")
else:
print("Failed to create service account")

Handle None return value from create_service_account.

Ensure that the fixture properly handles the case when create_service_account returns None.

-    if service_account:
+    if service_account is not None:

@coderabbitai coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 532b655 and 853bbc1.

Files selected for processing (4)
  • metadata-ingestion/src/datahub/ingestion/run/pipeline.py (4 hunks)
  • metadata-ingestion/src/datahub/ingestion/source/grafana/grafana_source.py (1 hunks)
  • metadata-ingestion/tests/integration/grafana/docker-compose.yml (1 hunks)
  • metadata-ingestion/tests/integration/grafana/test_grafana.py (1 hunks)
Files skipped from review due to trivial changes (1)
  • metadata-ingestion/tests/integration/grafana/docker-compose.yml
Files skipped from review as they are similar to previous changes (2)
  • metadata-ingestion/src/datahub/ingestion/source/grafana/grafana_source.py
  • metadata-ingestion/tests/integration/grafana/test_grafana.py
Additional comments not posted (8)
metadata-ingestion/src/datahub/ingestion/run/pipeline.py (8)

439-440: Good practice: Context management for the sink.

The addition of context management for the sink ensures proper resource management and cleanup.


465-469: Clear comments: Materializing the generator for exception reporting.

The comments explaining the reasoning behind materializing the extractor's generator into a list are helpful for understanding the code.


470-474: Improved error handling: Reporting source failures.

Adding exception handling for the source's get_report().failure() method ensures proper reporting of errors during metadata production.


499-499: Resource management: Moved extractor close method.

Moving the extractor's close() method call ensures proper resource release after processing is complete.


499-499: Notification: Handle work unit end for the sink.

Including the sink's handle_work_unit_end method call ensures proper notification when a work unit is processed.


499-499: State management: Ensure transformers produce additional records.

Ensuring that transformers produce additional records after data processing is complete allows for any remaining state to be processed.


499-499: Control event handling: EndOfStream notification.

Handling the EndOfStream control event ensures that the sink can flush any remaining records, which is essential for proper data processing.


499-499: Context management: Removed sink close method.

Removing the sink's close() method call from the run method ensures proper management by the context manager.
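
Taken together, these comments describe a run-loop pattern roughly like the following. This is an illustrative sketch with assumed names (self.callback, get_records, and so on), not the pipeline's actual internals:

    import contextlib

    def run(self):
        # Manage the sink with a context manager so it is closed even on error.
        with contextlib.closing(self.sink) as sink:
            for wu in self.source.get_workunits():
                try:
                    # Materialize the extractor's generator so exceptions
                    # surface here, where they can be reported on the source.
                    records = list(self.extractor.get_records(wu))
                except Exception as e:
                    self.source.get_report().failure(
                        "Failed to extract some records", exc=e
                    )
                    continue
                for record in records:
                    sink.write_record_async(record, self.callback)
                sink.handle_work_unit_end(wu)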

@coderabbitai coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 853bbc1 and 478dfd1.

Files selected for processing (1)
  • metadata-ingestion/src/datahub/ingestion/source/grafana/grafana_source.py (1 hunks)
Files skipped from review as they are similar to previous changes (1)
  • metadata-ingestion/src/datahub/ingestion/source/grafana/grafana_source.py

@coderabbitai coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 478dfd1 and b3f0564.

Files selected for processing (2)
  • metadata-ingestion/src/datahub/ingestion/run/pipeline.py (6 hunks)
  • metadata-ingestion/tests/integration/grafana/grafana_mcps_golden.json (1 hunks)
Files skipped from review as they are similar to previous changes (2)
  • metadata-ingestion/src/datahub/ingestion/run/pipeline.py
  • metadata-ingestion/tests/integration/grafana/grafana_mcps_golden.json

@coderabbitai coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between b3f0564 and c084fae.

Files selected for processing (1)
  • metadata-ingestion/tests/integration/grafana/grafana_mcps_golden.json (1 hunks)
Files skipped from review as they are similar to previous changes (1)
  • metadata-ingestion/tests/integration/grafana/grafana_mcps_golden.json

@hsheth2 hsheth2 added the merge-pending-ci A PR that has passed review and should be merged once CI is green. label Jul 13, 2024
@hsheth2 hsheth2 merged commit 437bacb into master Jul 15, 2024
58 of 82 checks passed
@hsheth2 hsheth2 deleted the ab-grafana-source branch July 15, 2024 21:12
yoonhyejin pushed a commit that referenced this pull request Jul 16, 2024
Co-authored-by: Shirshanka Das <shirshanka@apache.org>
Co-authored-by: Harshal Sheth <hsheth2@gmail.com>
aviv-julienjehannet pushed a commit to aviv-julienjehannet/datahub that referenced this pull request Jul 25, 2024
Co-authored-by: Shirshanka Das <shirshanka@apache.org>
Co-authored-by: Harshal Sheth <hsheth2@gmail.com>
arosanda added a commit to infobip/datahub that referenced this pull request Sep 23, 2024
* feat(forms) Handle deleting forms references when hard deleting forms (datahub-project#10820)

* refactor(ui): Misc improvements to the setup ingestion flow (ingest uplift 1/2)  (datahub-project#10764)

Co-authored-by: John Joyce <john@Johns-MBP.lan>
Co-authored-by: John Joyce <john@ip-192-168-1-200.us-west-2.compute.internal>

* fix(ingestion/airflow-plugin): pipeline tasks discoverable in search (datahub-project#10819)

* feat(ingest/transformer): tags to terms transformer (datahub-project#10758)

Co-authored-by: Aseem Bansal <asmbansal2@gmail.com>

* fix(ingestion/unity-catalog): fixed issue with profiling with GE turned on (datahub-project#10752)

Co-authored-by: Aseem Bansal <asmbansal2@gmail.com>

* feat(forms) Add java SDK for form entity PATCH + CRUD examples (datahub-project#10822)

* feat(SDK) Add java SDK for structuredProperty entity PATCH + CRUD examples (datahub-project#10823)

* feat(SDK) Add StructuredPropertyPatchBuilder in python sdk and provide sample CRUD files (datahub-project#10824)

* feat(forms) Add CRUD endpoints to GraphQL for Form entities (datahub-project#10825)

* add flag for includeSoftDeleted in scroll entities API (datahub-project#10831)

* feat(deprecation) Return actor entity with deprecation aspect (datahub-project#10832)

* feat(structuredProperties) Add CRUD graphql APIs for structured property entities (datahub-project#10826)

* add scroll parameters to openapi v3 spec (datahub-project#10833)

* fix(ingest): correct profile_day_of_week implementation (datahub-project#10818)

* feat(ingest/glue): allow ingestion of empty databases from Glue (datahub-project#10666)

Co-authored-by: Harshal Sheth <hsheth2@gmail.com>

* feat(cli): add more details to get cli (datahub-project#10815)

* fix(ingestion/glue): ensure date formatting works on all platforms for aws glue (datahub-project#10836)

* fix(ingestion): fix datajob patcher (datahub-project#10827)

* fix(smoke-test): add suffix in temp file creation (datahub-project#10841)

* feat(ingest/glue): add helper method to permit user or group ownership (datahub-project#10784)

* feat(): Show data platform instances in policy modal if they are set on the policy (datahub-project#10645)

Co-authored-by: Hendrik Richert <hendrik.richert@swisscom.com>

* docs(patch): add patch documentation for how implementation works (datahub-project#10010)

Co-authored-by: John Joyce <john@acryl.io>

* fix(jar): add missing custom-plugin-jar task (datahub-project#10847)

* fix(): also check exceptions/stack trace when filtering log messages (datahub-project#10391)

Co-authored-by: John Joyce <john@acryl.io>

* docs(): Update posts.md (datahub-project#9893)

Co-authored-by: Hyejin Yoon <0327jane@gmail.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* chore(ingest): update acryl-datahub-classify version (datahub-project#10844)

* refactor(ingest): Refactor structured logging to support infos, warnings, and failures structured reporting to UI (datahub-project#10828)

Co-authored-by: John Joyce <john@Johns-MBP.lan>
Co-authored-by: Harshal Sheth <hsheth2@gmail.com>

* fix(restli): log aspect-not-found as a warning rather than as an error (datahub-project#10834)

* fix(ingest/nifi): remove duplicate upstream jobs (datahub-project#10849)

* fix(smoke-test): test access to create/revoke personal access tokens (datahub-project#10848)

* fix(smoke-test): missing test for move domain (datahub-project#10837)

* ci: update usernames to not considered for community (datahub-project#10851)

* env: change defaults for data contract visibility (datahub-project#10854)

* fix(ingest/tableau): quote special characters in external URL (datahub-project#10842)

* fix(smoke-test): fix flakiness of auto complete test

* ci(ingest): pin dask dependency for feast (datahub-project#10865)

* fix(ingestion/lookml): liquid template resolution and view-to-view cll (datahub-project#10542)

* feat(ingest/audit): add client id and version in system metadata props (datahub-project#10829)

* chore(ingest): Mypy 1.10.1 pin (datahub-project#10867)

* docs: use acryl-datahub-actions as expected python package to install (datahub-project#10852)

* docs: add new js snippet (datahub-project#10846)

* refactor(ingestion): remove company domain for security reason (datahub-project#10839)

* fix(ingestion/spark): Platform instance and column level lineage fix (datahub-project#10843)

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* feat(ingestion/tableau): optionally ingest multiple sites and create site containers (datahub-project#10498)

Co-authored-by: Yanik Häni <Yanik.Haeni1@swisscom.com>

* fix(ingestion/looker): Add sqlglot dependency and remove unused sqlparser (datahub-project#10874)

* fix(manage-tokens): fix manage access token policy (datahub-project#10853)

* Batch get entity endpoints (datahub-project#10880)

* feat(system): support conditional write semantics (datahub-project#10868)

* fix(build): upgrade vercel builds to Node 20.x (datahub-project#10890)

* feat(ingest/lookml): shallow clone repos (datahub-project#10888)

* fix(ingest/looker): add missing dependency (datahub-project#10876)

* fix(ingest): only populate audit stamps where accurate (datahub-project#10604)

* fix(ingest/dbt): always encode tag urns (datahub-project#10799)

* fix(ingest/redshift): handle multiline alter table commands (datahub-project#10727)

* fix(ingestion/looker): column name missing in explore (datahub-project#10892)

* fix(lineage) Fix lineage source/dest filtering with explored per hop limit (datahub-project#10879)

* feat(conditional-writes): misc updates and fixes (datahub-project#10901)

* feat(ci): update outdated action (datahub-project#10899)

* feat(rest-emitter): adding async flag to rest emitter (datahub-project#10902)

Co-authored-by: Gabe Lyons <gabe.lyons@acryl.io>

* feat(ingest): add snowflake-queries source (datahub-project#10835)

* fix(ingest): improve `auto_materialize_referenced_tags_terms` error handling (datahub-project#10906)

* docs: add new company to adoption list (datahub-project#10909)

* refactor(redshift): Improve redshift error handling with new structured reporting system (datahub-project#10870)

Co-authored-by: John Joyce <john@Johns-MBP.lan>
Co-authored-by: Harshal Sheth <hsheth2@gmail.com>

* feat(ui) Finalize support for all entity types on forms (datahub-project#10915)

* Index ExecutionRequestResults status field (datahub-project#10811)

* feat(ingest): grafana connector (datahub-project#10891)

Co-authored-by: Shirshanka Das <shirshanka@apache.org>
Co-authored-by: Harshal Sheth <hsheth2@gmail.com>

* fix(gms) Add Form entity type to EntityTypeMapper (datahub-project#10916)

* feat(dataset): add support for external url in Dataset (datahub-project#10877)

* docs(saas-overview) added missing features to observe section (datahub-project#10913)

Co-authored-by: John Joyce <john@acryl.io>

* fix(ingest/spark): Fixing Micrometer warning (datahub-project#10882)

* fix(structured properties): allow application of structured properties without schema file (datahub-project#10918)

* fix(data-contracts-web) handle other schedule types (datahub-project#10919)

* fix(ingestion/tableau): human-readable message for PERMISSIONS_MODE_SWITCHED error (datahub-project#10866)

Co-authored-by: Harshal Sheth <hsheth2@gmail.com>

* Add feature flag for view defintions (datahub-project#10914)

Co-authored-by: Ethan Cartwright <ethan.cartwright@acryl.io>

* feat(ingest/BigQuery): refactor+parallelize dataset metadata extraction (datahub-project#10884)

* fix(airflow): add error handling around render_template() (datahub-project#10907)

* feat(ingestion/sqlglot): add optional `default_dialect` parameter to sqlglot lineage (datahub-project#10830)

* feat(mcp-mutator): new mcp mutator plugin (datahub-project#10904)

* fix(ingest/bigquery): changes helper function to decode unicode scape sequences (datahub-project#10845)

* feat(ingest/postgres): fetch table sizes for profile (datahub-project#10864)

* feat(ingest/abs): Adding azure blob storage ingestion source (datahub-project#10813)

* fix(ingest/redshift): reduce severity of SQL parsing issues (datahub-project#10924)

* fix(build): fix lint fix web react (datahub-project#10896)

* fix(ingest/bigquery): handle quota exceeded for project.list requests (datahub-project#10912)

* feat(ingest): report extractor failures more loudly (datahub-project#10908)

* feat(ingest/snowflake): integrate snowflake-queries into main source (datahub-project#10905)

* fix(ingest): fix docs build (datahub-project#10926)

* fix(ingest/snowflake): fix test connection (datahub-project#10927)

* fix(ingest/lookml): add view load failures to cache (datahub-project#10923)

* docs(slack) overhauled setup instructions and screenshots (datahub-project#10922)

Co-authored-by: John Joyce <john@acryl.io>

* fix(airflow): Add comma parsing of owners to DataJobs (datahub-project#10903)

* fix(entityservice): fix merging sideeffects (datahub-project#10937)

* feat(ingest): Support System Ingestion Sources, Show and hide system ingestion sources with Command-S (datahub-project#10938)

Co-authored-by: John Joyce <john@Johns-MBP.lan>

* chore() Set a default lineage filtering end time on backend when a start time is present (datahub-project#10925)

Co-authored-by: John Joyce <john@ip-192-168-1-200.us-west-2.compute.internal>
Co-authored-by: John Joyce <john@Johns-MBP.lan>

* Added relationships APIs to V3. Added these generic APIs to V3 swagger doc. (datahub-project#10939)

* docs: add learning center to docs (datahub-project#10921)

* doc: Update hubspot form id (datahub-project#10943)

* chore(airflow): add python 3.11 w/ Airflow 2.9 to CI (datahub-project#10941)

* fix(ingest/Glue): column upstream lineage between S3 and Glue (datahub-project#10895)

* fix(ingest/abs): split abs utils into multiple files (datahub-project#10945)

* doc(ingest/looker): fix doc for sql parsing documentation (datahub-project#10883)

Co-authored-by: Harshal Sheth <hsheth2@gmail.com>

* fix(ingest/bigquery): Adding missing BigQuery types (datahub-project#10950)

* fix(ingest/setup): feast and abs source setup (datahub-project#10951)

* fix(connections) Harden adding /gms to connections in backend (datahub-project#10942)

* feat(siblings) Add flag to prevent combining siblings in the UI (datahub-project#10952)

* fix(docs): make graphql doc gen more automated (datahub-project#10953)

* feat(ingest/athena): Add option for Athena partitioned profiling (datahub-project#10723)

* fix(spark-lineage): default timeout for future responses (datahub-project#10947)

* feat(datajob/flow): add environment filter using info aspects (datahub-project#10814)

* fix(ui/ingest): correct privilege used to show tab (datahub-project#10483)

Co-authored-by: Kunal-kankriya <127090035+Kunal-kankriya@users.noreply.github.com>

* feat(ingest/looker): include dashboard urns in browse v2 (datahub-project#10955)

* add a structured type to batchGet in OpenAPI V3 spec (datahub-project#10956)

* fix(ui): scroll on the domain sidebar to show all domains (datahub-project#10966)

* fix(ingest/sagemaker): resolve incorrect variable assignment for SageMaker API call (datahub-project#10965)

* fix(airflow/build): Pinning mypy (datahub-project#10972)

* Fixed a bug where the OpenAPI V3 spec was incorrect. The bug was introduced in datahub-project#10939. (datahub-project#10974)

* fix(ingest/test): Fix for mssql integration tests (datahub-project#10978)

* fix(entity-service) exist check correctly extracts status (datahub-project#10973)

* fix(structuredProps) casing bug in StructuredPropertiesValidator (datahub-project#10982)

* bugfix: use anyOf instead of allOf when creating references in openapi v3 spec (datahub-project#10986)

* fix(ui): Remove ant less imports (datahub-project#10988)

* feat(ingest/graph): Add get_results_by_filter to DataHubGraph (datahub-project#10987)

* feat(ingest/cli): init does not actually support environment variables (datahub-project#10989)

* fix(ingest/graph): Update get_results_by_filter graphql query (datahub-project#10991)

* feat(ingest/spark): Promote beta plugin (datahub-project#10881)

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* feat(ingest): support domains in meta -> "datahub" section (datahub-project#10967)

* feat(ingest): add `check server-config` command (datahub-project#10990)

* feat(cli): Make consistent use of DataHubGraphClientConfig (datahub-project#10466)

Deprecates get_url_and_token() in favor of a more complete option: load_graph_config() that returns a full DatahubClientConfig.
This change was then propagated across previous usages of get_url_and_token so that connections to DataHub server from the client respect the full breadth of configuration specified by DatahubClientConfig.

I.e: You can now specify disable_ssl_verification: true in your ~/.datahubenv file so that all cli functions to the server work when ssl certification is disabled.

Fixes datahub-project#9705

* fix(ingest/s3): Fixing container creation when there is no folder in path (datahub-project#10993)

* fix(ingest/looker): support platform instance for dashboards & charts (datahub-project#10771)

* feat(ingest/bigquery): improve handling of information schema in sql parser (datahub-project#10985)

* feat(ingest): improve `ingest deploy` command (datahub-project#10944)

* fix(backend): allow excluding soft-deleted entities in relationship-queries; exclude soft-deleted members of groups (datahub-project#10920)

- allow excluding soft-deleted entities in relationship-queries
- exclude soft-deleted members of groups

* fix(ingest/looker): downgrade missing chart type log level (datahub-project#10996)

* doc(acryl-cloud): release docs for 0.3.4.x (datahub-project#10984)

Co-authored-by: John Joyce <john@acryl.io>
Co-authored-by: RyanHolstien <RyanHolstien@users.noreply.github.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Co-authored-by: Pedro Silva <pedro@acryl.io>

* fix(protobuf/build): Fix protobuf check jar script (datahub-project#11006)

* fix(ui/ingest): Support invalid cron jobs (datahub-project#10998)

* fix(ingest): fix graph config loading (datahub-project#11002)

Co-authored-by: Pedro Silva <pedro@acryl.io>

* feat(docs): Document __DATAHUB_TO_FILE_ directive (datahub-project#10968)

Co-authored-by: Harshal Sheth <hsheth2@gmail.com>

* fix(graphql/upsertIngestionSource): Validate cron schedule; parse error in CLI (datahub-project#11011)

* feat(ece): support custom ownership type urns in ECE generation (datahub-project#10999)

* feat(assertion-v2): changed Validation tab to Quality and created new Governance tab (datahub-project#10935)

* fix(ingestion/glue): Add support for missing config options for profiling in Glue (datahub-project#10858)

* feat(propagation): Add models for schema field docs, tags, terms (datahub-project#2959) (datahub-project#11016)

Co-authored-by: Chris Collins <chriscollins3456@gmail.com>

* docs: standardize terminology to DataHub Cloud (datahub-project#11003)

* fix(ingestion/transformer): replace the externalUrl container (datahub-project#11013)

* docs(slack) troubleshoot docs (datahub-project#11014)

* feat(propagation): Add graphql API (datahub-project#11030)

Co-authored-by: Chris Collins <chriscollins3456@gmail.com>

* feat(propagation):  Add models for Action feature settings (datahub-project#11029)

* docs(custom properties): Remove duplicate from sidebar (datahub-project#11033)

* feat(models): Introducing Dataset Partitions Aspect (datahub-project#10997)

Co-authored-by: John Joyce <john@Johns-MBP.lan>
Co-authored-by: John Joyce <john@ip-192-168-1-200.us-west-2.compute.internal>

* feat(propagation): Add Documentation Propagation Settings (datahub-project#11038)

* fix(models): chart schema fields mapping, add dataHubAction entity, t… (datahub-project#11040)

* fix(ci): smoke test lint failures (datahub-project#11044)

* docs: fix learning center color scheme & typo (datahub-project#11043)

* feat: add cloud main page (datahub-project#11017)

Co-authored-by: Jay <159848059+jayacryl@users.noreply.github.com>

* feat(restore-indices): add additional step to also clear system metadata service (datahub-project#10662)

Co-authored-by: John Joyce <john@acryl.io>

* docs: fix typo (datahub-project#11046)

* fix(lint): apply spotless (datahub-project#11050)

* docs(airflow): example query to get datajobs for a dataflow (datahub-project#11034)

* feat(cli): Add run-id option to put sub-command (datahub-project#11023)

Adds an option to assign run-id to a given put command execution. 
This is useful when transformers do not exist for a given ingestion payload, we can follow up with custom metadata and assign it to an ingestion pipeline.

* fix(ingest): improve sql error reporting calls (datahub-project#11025)

* fix(airflow): fix CI setup (datahub-project#11031)

* feat(ingest/dbt): add experimental `prefer_sql_parser_lineage` flag (datahub-project#11039)

* fix(ingestion/lookml): enable stack-trace in lookml logs (datahub-project#10971)

* (chore): Linting fix (datahub-project#11015)

* chore(ci): update deprecated github actions (datahub-project#10977)

* Fix ALB configuration example (datahub-project#10981)

* chore(ingestion-base): bump base image packages (datahub-project#11053)

* feat(cli): Trim report of dataHubExecutionRequestResult to max GMS size (datahub-project#11051)

* fix(ingestion/lookml): emit dummy sql condition for lookml custom condition tag (datahub-project#11008)

Co-authored-by: Harshal Sheth <hsheth2@gmail.com>

* fix(ingestion/powerbi): fix issue with broken report lineage (datahub-project#10910)

* feat(ingest/tableau): add retry on timeout (datahub-project#10995)

* change generate kafka connect properties from env (datahub-project#10545)

Co-authored-by: david-leifker <114954101+david-leifker@users.noreply.github.com>

* fix(ingest): fix oracle cronjob ingestion (datahub-project#11001)

Co-authored-by: david-leifker <114954101+david-leifker@users.noreply.github.com>

* chore(ci): revert update deprecated github actions (datahub-project#10977) (datahub-project#11062)

* feat(ingest/dbt-cloud): update metadata_endpoint inference (datahub-project#11041)

* build: Reduce size of datahub-frontend-react image by 50-ish% (datahub-project#10878)

Co-authored-by: david-leifker <114954101+david-leifker@users.noreply.github.com>

* fix(ci): Fix lint issue in datahub_ingestion_run_summary_provider.py (datahub-project#11063)

* docs(ingest): update developing-a-transformer.md (datahub-project#11019)

* feat(search-test): update search tests from datahub-project#10408 (datahub-project#11056)

* feat(cli): add aspects parameter to DataHubGraph.get_entity_semityped (datahub-project#11009)

Co-authored-by: Harshal Sheth <hsheth2@gmail.com>

* docs(airflow): update min version for plugin v2 (datahub-project#11065)

* doc(ingestion/tableau): doc update for derived permission (datahub-project#11054)

Co-authored-by: Pedro Silva <pedro.cls93@gmail.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Co-authored-by: Harshal Sheth <hsheth2@gmail.com>

* fix(py): remove dep on types-pkg_resources (datahub-project#11076)

* feat(ingest/mode): add option to exclude restricted (datahub-project#11081)

* fix(ingest): set lastObserved in sdk when unset (datahub-project#11071)

* doc(ingest): Update capabilities (datahub-project#11072)

* chore(vulnerability): Log Injection (datahub-project#11090)

* chore(vulnerability): Information exposure through a stack trace (datahub-project#11091)

* chore(vulnerability): Comparison of narrow type with wide type in loop condition (datahub-project#11089)

* chore(vulnerability): Insertion of sensitive information into log files (datahub-project#11088)

* chore(vulnerability): Risky Cryptographic Algorithm (datahub-project#11059)

* chore(vulnerability): Overly permissive regex range (datahub-project#11061)

Co-authored-by: Harshal Sheth <hsheth2@gmail.com>

* fix: update customer data (datahub-project#11075)

* fix(models): fixing the datasetPartition models (datahub-project#11085)

Co-authored-by: John Joyce <john@ip-192-168-1-200.us-west-2.compute.internal>

* fix(ui): Adding view, forms GraphQL query, remove showing a fallback error message on unhandled GraphQL error (datahub-project#11084)

Co-authored-by: John Joyce <john@ip-192-168-1-200.us-west-2.compute.internal>

* feat(docs-site): hiding learn more from cloud page (datahub-project#11097)

* fix(docs): Add correct usage of orFilters in search API docs (datahub-project#11082)

Co-authored-by: Jay <159848059+jayacryl@users.noreply.github.com>

* fix(ingest/mode): Regexp in mode name matcher didn't allow underscore (datahub-project#11098)

* docs: Refactor customer stories section (datahub-project#10869)

Co-authored-by: Jeff Merrick <jeff@wireform.io>

* fix(release): fix full/slim suffix on tag (datahub-project#11087)

* feat(config): support alternate hashing algorithm for doc id (datahub-project#10423)

Co-authored-by: david-leifker <114954101+david-leifker@users.noreply.github.com>
Co-authored-by: John Joyce <john@acryl.io>

* fix(emitter): fix typo in get method of java kafka emitter (datahub-project#11007)

* fix(ingest): use correct native data type in all SQLAlchemy sources by compiling data type using dialect (datahub-project#10898)

Co-authored-by: Harshal Sheth <hsheth2@gmail.com>

* chore: Update contributors list in PR labeler (datahub-project#11105)

* feat(ingest): tweak stale entity removal messaging (datahub-project#11064)

* fix(ingestion): enforce lastObserved timestamps in SystemMetadata (datahub-project#11104)

* fix(ingest/powerbi): fix broken lineage between chart and dataset (datahub-project#11080)

* feat(ingest/lookml): CLL support for sql set in sql_table_name attribute of lookml view (datahub-project#11069)

* docs: update graphql docs on forms & structured properties (datahub-project#11100)

* test(search): search openAPI v3 test (datahub-project#11049)

* fix(ingest/tableau): prevent empty site content urls (datahub-project#11057)

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* feat(entity-client): implement client batch interface (datahub-project#11106)

* fix(snowflake): avoid reporting warnings/info for sys tables (datahub-project#11114)

* fix(ingest): downgrade column type mapping warning to info (datahub-project#11115)

* feat(api): add AuditStamp to the V3 API entity/aspect response (datahub-project#11118)

* fix(ingest/redshift): replace r'\n' with '\n' to avoid token error redshift serverless… (datahub-project#11111)

* fix(entiy-client): handle null entityUrn case for restli (datahub-project#11122)

* fix(sql-parser): prevent bad urns from alter table lineage (datahub-project#11092)

* fix(ingest/bigquery): use small batch size if use_tables_list_query_v2 is set (datahub-project#11121)

* fix(graphql): add missing entities to EntityTypeMapper and EntityTypeUrnMapper (datahub-project#10366)

* feat(ui): Changes to allow editable dataset name (datahub-project#10608)

Co-authored-by: Jay Kadambi <jayasimhan_venkatadri@optum.com>

* fix: remove saxo (datahub-project#11127)

* feat(mcl-processor): Update mcl processor hooks (datahub-project#11134)

* fix(openapi): fix openapi v2 endpoints & v3 documentation update

* Revert "fix(openapi): fix openapi v2 endpoints & v3 documentation update"

This reverts commit 573c1cb.

* docs(policies): updates to policies documentation (datahub-project#11073)

* fix(openapi): fix openapi v2 and v3 docs update (datahub-project#11139)

* feat(auth): grant type and acr values custom oidc parameters support (datahub-project#11116)

* fix(mutator): mutator hook fixes (datahub-project#11140)

* feat(search): support sorting on multiple fields (datahub-project#10775)

* feat(ingest): various logging improvements (datahub-project#11126)

* fix(ingestion/lookml): fix for sql parsing error (datahub-project#11079)

Co-authored-by: Harshal Sheth <hsheth2@gmail.com>

* feat(docs-site) cloud page spacing and content polishes (datahub-project#11141)

* feat(ui) Enable editing structured props on fields (datahub-project#11042)

* feat(tests): add md5 and last computed to testResult model (datahub-project#11117)

* test(openapi): openapi regression smoke tests (datahub-project#11143)

* fix(airflow): fix tox tests + update docs (datahub-project#11125)

* docs: add chime to adoption stories (datahub-project#11142)

* fix(ingest/databricks): Updating code to work with Databricks sdk 0.30 (datahub-project#11158)

* fix(kafka-setup): add missing script to image (datahub-project#11190)

* fix(config): fix hash algo config (datahub-project#11191)

* test(smoke-test): updates to smoke-tests (datahub-project#11152)

* fix(elasticsearch): refactor idHashAlgo setting (datahub-project#11193)

* chore(kafka): kafka version bump (datahub-project#11211)

* readd UsageStatsWorkUnit

* fix merge problems

* change logo

---------

Co-authored-by: Chris Collins <chriscollins3456@gmail.com>
Co-authored-by: John Joyce <john@acryl.io>
Co-authored-by: John Joyce <john@Johns-MBP.lan>
Co-authored-by: John Joyce <john@ip-192-168-1-200.us-west-2.compute.internal>
Co-authored-by: dushayntAW <158567391+dushayntAW@users.noreply.github.com>
Co-authored-by: sagar-salvi-apptware <159135491+sagar-salvi-apptware@users.noreply.github.com>
Co-authored-by: Aseem Bansal <asmbansal2@gmail.com>
Co-authored-by: Kevin Chun <kevin1chun@gmail.com>
Co-authored-by: jordanjeremy <72943478+jordanjeremy@users.noreply.github.com>
Co-authored-by: skrydal <piotr.skrydalewicz@gmail.com>
Co-authored-by: Harshal Sheth <hsheth2@gmail.com>
Co-authored-by: david-leifker <114954101+david-leifker@users.noreply.github.com>
Co-authored-by: sid-acryl <155424659+sid-acryl@users.noreply.github.com>
Co-authored-by: Julien Jehannet <80408664+aviv-julienjehannet@users.noreply.github.com>
Co-authored-by: Hendrik Richert <github@richert.li>
Co-authored-by: Hendrik Richert <hendrik.richert@swisscom.com>
Co-authored-by: RyanHolstien <RyanHolstien@users.noreply.github.com>
Co-authored-by: Felix Lüdin <13187726+Masterchen09@users.noreply.github.com>
Co-authored-by: Pirry <158024088+chardaway@users.noreply.github.com>
Co-authored-by: Hyejin Yoon <0327jane@gmail.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Co-authored-by: cburroughs <chris.burroughs@gmail.com>
Co-authored-by: ksrinath <ksrinath@users.noreply.github.com>
Co-authored-by: Mayuri Nehate <33225191+mayurinehate@users.noreply.github.com>
Co-authored-by: Kunal-kankriya <127090035+Kunal-kankriya@users.noreply.github.com>
Co-authored-by: Shirshanka Das <shirshanka@apache.org>
Co-authored-by: ipolding-cais <155455744+ipolding-cais@users.noreply.github.com>
Co-authored-by: Tamas Nemeth <treff7es@gmail.com>
Co-authored-by: Shubham Jagtap <132359390+shubhamjagtap639@users.noreply.github.com>
Co-authored-by: haeniya <yanik.haeni@gmail.com>
Co-authored-by: Yanik Häni <Yanik.Haeni1@swisscom.com>
Co-authored-by: Gabe Lyons <itsgabelyons@gmail.com>
Co-authored-by: Gabe Lyons <gabe.lyons@acryl.io>
Co-authored-by: 808OVADOZE <52988741+shtephlee@users.noreply.github.com>
Co-authored-by: noggi <anton.kuraev@acryl.io>
Co-authored-by: Nicholas Pena <npena@foursquare.com>
Co-authored-by: Jay <159848059+jayacryl@users.noreply.github.com>
Co-authored-by: ethan-cartwright <ethan.cartwright.m@gmail.com>
Co-authored-by: Ethan Cartwright <ethan.cartwright@acryl.io>
Co-authored-by: Nadav Gross <33874964+nadavgross@users.noreply.github.com>
Co-authored-by: Patrick Franco Braz <patrickfbraz@poli.ufrj.br>
Co-authored-by: pie1nthesky <39328908+pie1nthesky@users.noreply.github.com>
Co-authored-by: Joel Pinto Mata (KPN-DSH-DEX team) <130968841+joelmataKPN@users.noreply.github.com>
Co-authored-by: Ellie O'Neil <110510035+eboneil@users.noreply.github.com>
Co-authored-by: Ajoy Majumdar <ajoymajumdar@hotmail.com>
Co-authored-by: deepgarg-visa <149145061+deepgarg-visa@users.noreply.github.com>
Co-authored-by: Tristan Heisler <tristankheisler@gmail.com>
Co-authored-by: Andrew Sikowitz <andrew.sikowitz@acryl.io>
Co-authored-by: Davi Arnaut <davi.arnaut@acryl.io>
Co-authored-by: Pedro Silva <pedro@acryl.io>
Co-authored-by: amit-apptware <132869468+amit-apptware@users.noreply.github.com>
Co-authored-by: Sam Black <sam.black@acryl.io>
Co-authored-by: Raj Tekal <varadaraj_tekal@optum.com>
Co-authored-by: Steffen Grohsschmiedt <gitbhub@steffeng.eu>
Co-authored-by: jaegwon.seo <162448493+wornjs@users.noreply.github.com>
Co-authored-by: Renan F. Lima <51028757+lima-renan@users.noreply.github.com>
Co-authored-by: Matt Exchange <xkollar@users.noreply.github.com>
Co-authored-by: Jonny Dixon <45681293+acrylJonny@users.noreply.github.com>
Co-authored-by: Pedro Silva <pedro.cls93@gmail.com>
Co-authored-by: Pinaki Bhattacharjee <pinakipb2@gmail.com>
Co-authored-by: Jeff Merrick <jeff@wireform.io>
Co-authored-by: skrydal <piotr.skrydalewicz@acryl.io>
Co-authored-by: AndreasHegerNuritas <163423418+AndreasHegerNuritas@users.noreply.github.com>
Co-authored-by: jayasimhankv <145704974+jayasimhankv@users.noreply.github.com>
Co-authored-by: Jay Kadambi <jayasimhan_venkatadri@optum.com>
Co-authored-by: David Leifker <david.leifker@acryl.io>