MINOR: Fix Typos #14448

Merged · 3 commits · Dec 20, 2023
Changes from 1 commit
@@ -30,7 +30,7 @@ To deploy OpenMetadata, check the Deployment guides.
To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with
custom Airflow plugins to handle the workflow deployment.

-**Note:** For metadata ingestion, kindly make sure add alteast `dashboard` scopes to the clientId provided.
+**Note:** For metadata ingestion, kindly make sure add atleast `dashboard` scopes to the clientId provided.
Question related to scopes, click [here](https://developer.domo.com/portal/1845fc11bbe5d-api-authentication).
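For a quick pre-flight check that the clientId/secret pair actually carries the `dashboard` scope, a small sketch along the lines below can be used. It assumes Domo's standard OAuth client-credentials token endpoint described in the linked documentation; treat the exact HTTP method, parameters, and response fields as assumptions and verify them against the Domo docs.

```python
import requests

# Illustrative only: request a token with the `dashboard` scope using the
# clientId/clientSecret pair and confirm the scope is actually granted.
# Endpoint, method, and response fields are assumptions based on Domo's
# documented client-credentials flow; check the Domo API docs linked above.
response = requests.get(
    "https://api.domo.com/oauth/token",
    params={"grant_type": "client_credentials", "scope": "dashboard"},
    auth=("<clientId>", "<clientSecret>"),  # placeholders for your credentials
)
response.raise_for_status()
print(response.json().get("scope"))  # expected to include "dashboard"
```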

### Python Requirements
@@ -30,7 +30,7 @@ To deploy OpenMetadata, check the Deployment guides.
To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with
custom Airflow plugins to handle the workflow deployment.

-**Note:** For metadata ingestion, kindly make sure add alteast `dashboard` scopes to the clientId provided.
+**Note:** For metadata ingestion, kindly make sure add atleast `dashboard` scopes to the clientId provided.
Question related to scopes, click [here](https://developer.domo.com/portal/1845fc11bbe5d-api-authentication).

### Python Requirements
@@ -48,7 +48,7 @@ To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Co
custom Airflow plugins to handle the workflow deployment.

{% note noteType="Warning" %}
-For metadata ingestion, kindly make sure add alteast `dashboard` scopes to the clientId provided.
+For metadata ingestion, kindly make sure add atleast `dashboard` scopes to the clientId provided.
Question related to scopes, click [here](https://developer.domo.com/portal/1845fc11bbe5d-api-authentication).
{% /note %}

@@ -36,7 +36,7 @@ The ingestion also works with Superset 2.0.0 🎉

**API Connection**: To extract metadata from Superset via API, user must have at least `can read on Chart` & `can read on Dashboard` permissions.

-**Database Connection**: To extract metadata from Superset via MySQL or Postgres database, database user must have at least `SELECT` priviledge on `dashboards` & `slices` tables within superset schema.
+**Database Connection**: To extract metadata from Superset via MySQL or Postgres database, database user must have at least `SELECT` privilege on `dashboards` & `slices` tables within superset schema.

### Python Requirements

@@ -36,7 +36,7 @@ The ingestion also works with Superset 2.0.0 🎉

**API Connection**: To extract metadata from Superset via API, user must have at least `can read on Chart` & `can read on Dashboard` permissions.

-**Database Connection**: To extract metadata from Superset via MySQL or Postgres database, database user must have at least `SELECT` priviledge on `dashboards` & `slices` tables within superset schema.
+**Database Connection**: To extract metadata from Superset via MySQL or Postgres database, database user must have at least `SELECT` privilege on `dashboards` & `slices` tables within superset schema.

### Python Requirements

@@ -54,7 +54,7 @@ The ingestion also works with Superset 2.0.0 🎉

**API Connection**: To extract metadata from Superset via API, user must have at least `can read on Chart` & `can read on Dashboard` permissions.

-**Database Connection**: To extract metadata from Superset via MySQL or Postgres database, database user must have at least `SELECT` priviledge on `dashboards` & `slices` tables within superset schema.
+**Database Connection**: To extract metadata from Superset via MySQL or Postgres database, database user must have at least `SELECT` privilege on `dashboards` & `slices` tables within superset schema.

## Metadata Ingestion

@@ -23,7 +23,7 @@ Configure and schedule Tableau metadata and profiler workflows from the OpenMeta

## Requirements

-To ingest tableau metadata, minimum `Site Role: Viewer` is requried for the tableau user.
+To ingest tableau metadata, minimum `Site Role: Viewer` is required for the tableau user.

{%inlineCallout icon="description" bold="OpenMetadata 0.12 or later" href="/deployment"%}
To deploy OpenMetadata, check the Deployment guides.
@@ -23,7 +23,7 @@ Configure and schedule Tableau metadata and profiler workflows from the OpenMeta

## Requirements

-To ingest tableau metadata, minimum `Site Role: Viewer` is requried for the tableau user.
+To ingest tableau metadata, minimum `Site Role: Viewer` is required for the tableau user.

{%inlineCallout icon="description" bold="OpenMetadata 0.12 or later" href="/deployment"%}
To deploy OpenMetadata, check the Deployment guides.
@@ -41,7 +41,7 @@ the following docs to connect using Airflow SDK or with the CLI.

## Requirements

-To ingest tableau metadata, minimum `Site Role: Viewer` is requried for the tableau user.
+To ingest tableau metadata, minimum `Site Role: Viewer` is required for the tableau user.

{%inlineCallout icon="description" bold="OpenMetadata 0.12 or later" href="/deployment"%}
To deploy OpenMetadata, check the Deployment guides.
@@ -51,10 +51,12 @@ You can search for the required permissions in the filter box and add them accor
| 5 | resourcemanager.projects.get | Metadata Ingestion |
| 6 | bigquery.jobs.create | Metadata Ingestion |
| 7 | bigquery.jobs.listAll | Metadata Ingestion |
-| 8 | datacatalog.taxonomies.get | Fetch Policy Tags |
-| 9 | datacatalog.taxonomies.list | Fetch Policy Tags |
-| 10 | bigquery.readsessions.create | Bigquery Usage & Lineage Workflow |
-| 11 | bigquery.readsessions.getData | Bigquery Usage & Lineage Workflow |
+| 8 | bigquery.routines.get | Stored Procedure |
+| 9 | bigquery.routines.list | Stored Procedure |
+| 10 | datacatalog.taxonomies.get | Fetch Policy Tags |
+| 11 | datacatalog.taxonomies.list | Fetch Policy Tags |
+| 12 | bigquery.readsessions.create | Bigquery Usage & Lineage Workflow |
+| 13 | bigquery.readsessions.getData | Bigquery Usage & Lineage Workflow |
Review comment from the author on lines +54 to +59: Add additional bigquery related roles for stored procedure

{% image
src="/images/v1.0/connectors/bigquery/create-role-4.png"
@@ -46,7 +46,7 @@ custom Airflow plugins to handle the workflow deployment.

**Note:**

-For metadata ingestion, kindly make sure add alteast `data` scopes to the clientId provided.
+For metadata ingestion, kindly make sure add atleast `data` scopes to the clientId provided.
Question related to scopes, click [here](https://developer.domo.com/portal/1845fc11bbe5d-api-authentication).

### Python Requirements
@@ -46,7 +46,7 @@ custom Airflow plugins to handle the workflow deployment.

**Note:**

-For metadata ingestion, kindly make sure add alteast `data` scopes to the clientId provided.
+For metadata ingestion, kindly make sure add atleast `data` scopes to the clientId provided.
Question related to scopes, click [here](https://developer.domo.com/portal/1845fc11bbe5d-api-authentication).


@@ -62,7 +62,7 @@ pip3 install "openmetadata-ingestion[domo]"

All connectors are defined as JSON Schemas.
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/athenaConnection.json)
-you can find the structure to create a connection to DomoDatbase.
+you can find the structure to create a connection to DomoDatabase.

In order to create and run a Metadata Ingestion workflow, we will follow
the steps to create a YAML configuration able to connect to the source,
@@ -65,7 +65,7 @@ custom Airflow plugins to handle the workflow deployment.

**Note:**

-For metadata ingestion, kindly make sure add alteast `data` scopes to the clientId provided.
+For metadata ingestion, kindly make sure add atleast `data` scopes to the clientId provided.
Question related to scopes, click [here](https://developer.domo.com/portal/1845fc11bbe5d-api-authentication).

## Metadata Ingestion
@@ -67,7 +67,7 @@ custom Airflow plugins to handle the workflow deployment.
Note that We support MySQL (version 8.0.0 or greater) and the user should have access to the `INFORMATION_SCHEMA` table. By default a user can see only the rows in the `INFORMATION_SCHEMA` that correspond to objects for which the user has the proper access privileges.

```SQL
--- Create user. If <hostName> is ommited, defaults to '%'
+-- Create user. If <hostName> is omitted, defaults to '%'
-- More details https://dev.mysql.com/doc/refman/8.0/en/create-user.html
CREATE USER '<username>'[@'<hostName>'] IDENTIFIED BY '<password>';

@@ -13,7 +13,7 @@ connection to server at \"<host>:<port>\" (@IP),
does not match host name \"<host>:<port>\"
```

-If you get this error that time plese pass `{'sslmode': 'verify-ca'}` in the connection arguments.
+If you get this error that time please pass `{'sslmode': 'verify-ca'}` in the connection arguments.
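To see roughly what this setting amounts to under the hood, the ingestion framework forwards connection arguments to the database driver when building the SQLAlchemy engine, along the lines of the sketch below (host, credentials, and database are placeholders, and the exact plumbing inside OpenMetadata may differ):

```python
from sqlalchemy import create_engine

# Rough equivalent of adding sslmode=verify-ca under "Connection Arguments":
# extra key/value pairs end up in connect_args and are handed to the driver.
# The redshift+psycopg2 dialect comes from the sqlalchemy-redshift package,
# which ships with the Redshift ingestion extras.
engine = create_engine(
    "redshift+psycopg2://username:password@cluster-host.example.com:5439/dev",
    connect_args={"sslmode": "verify-ca"},
)
```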

{% image
src="/images/v1.0/connectors/redshift/service-connection-arguments.png"
@@ -70,7 +70,7 @@ information received in the shape of an `IngestionPipeline` Entity, and the spec

After creating a new workflow from the UI or when editing it, there are two calls happening:
- `POST` or `PUT` call to update the `Ingestion Pipeline Entity`,
-- `/deploy` HTTP call to the `IngestionPipelienResource` to trigger the deployment of the new or updated DAG in the Orchestrator.
+- `/deploy` HTTP call to the `IngestionPipelineResource` to trigger the deployment of the new or updated DAG in the Orchestrator.
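As a rough sketch of what those two calls look like from a client's point of view (the endpoint paths, payload fields, and auth header below are assumptions for illustration; the OpenMetadata API reference has the exact contract):

```python
import requests

BASE = "http://openmetadata:8585/api/v1"
HEADERS = {"Authorization": "Bearer <jwt-token>", "Content-Type": "application/json"}

# 1. Create or update the Ingestion Pipeline entity (fields are illustrative).
pipeline = {
    "name": "prod_redshift_metadata",
    "pipelineType": "metadata",
    "service": {"id": "<service-uuid>", "type": "databaseService"},
    "airflowConfig": {"scheduleInterval": "0 * * * *"},
}
created = requests.put(f"{BASE}/services/ingestionPipelines", json=pipeline, headers=HEADERS)
pipeline_id = created.json()["id"]

# 2. Ask the IngestionPipelineResource to deploy the new or updated DAG in the Orchestrator.
requests.post(f"{BASE}/services/ingestionPipelines/deploy/{pipeline_id}", headers=HEADERS)
```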

{% image
src="/images/v1.0/features/ingestion/ingestion-pipeline/ingestion-pipeline-software-system.drawio.png"
@@ -236,7 +236,7 @@ configurations specified above.

## How to Visualize Test Results
### From the Test Suite View
-From the home page click on the Test Suite menu in the left pannel.
+From the home page click on the Test Suite menu in the left panel.

{% image
src="/images/v1.0/features/ingestion/workflows/data-quality/test-suite-home-page.png"
@@ -266,7 +266,7 @@ From there you can select a Test Suite and visualize the results associated with
### From a Table Entity
Navigate to your table and click on the `profiler` tab. From there you'll be able to see test results at the table or column level.
#### Table Level Test Results
-In the top pannel, click on the white background `Data Quality` button. This will bring you to a summary of all your quality tests at the table level
+In the top panel, click on the white background `Data Quality` button. This will bring you to a summary of all your quality tests at the table level

{% image
src="/images/v1.0/features/ingestion/workflows/data-quality/table-results-entity.png"
@@ -499,7 +499,7 @@ This test allows us to specify how many values in a column we expect that will m
- mariaDB
- sqlite
- clickhouse
-- snowfalke
+- snowflake

The other databases will fall back to the `LIKE` expression

@@ -546,7 +546,7 @@ This test allows us to specify values in a column we expect that will not match
- mariaDB
- sqlite
- clickhouse
-- snowfalke
+- snowflake

The other databases will fall back to the `LIKE` expression

@@ -7,15 +7,15 @@ slug: /connectors/ingestion/workflows/dbt/dbt-troubleshooting

### 1. dbt tab not displaying in the UI

-After the dbt workflow is finished, check the logs to see if the dbt files were successfuly validated or not. Any missing keys in the manifest.json or catalog.json files will displayed in the logs and those keys are needed to be added.
+After the dbt workflow is finished, check the logs to see if the dbt files were successfully validated or not. Any missing keys in the manifest.json or catalog.json files will displayed in the logs and those keys are needed to be added.

The dbt workflow requires the below keys to be present in the node of a manifest.json file:
- resource_type (required)
- alias/name (any one of them required)
- schema (required)
- description (required if description needs to be updated)
- compiled_code/compiled_sql (required if the dbt model query is to be shown in dbt tab and for query lineage)
-- depends_on (required if lineage information needs to exctracted)
+- depends_on (required if lineage information needs to extracted)
- columns (required if column description is to be processed)
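A small, self-contained way to pre-check a manifest.json against this list before running the workflow is sketched below; the always-required keys are flagged as errors, while the feature-specific keys are only reported:

```python
import json

REQUIRED = ["resource_type", "schema"]      # always needed
NAME_KEYS = ("alias", "name")               # any one of them is enough
CONDITIONAL = [                             # needed only for specific features
    "description", "compiled_code", "compiled_sql", "depends_on", "columns",
]

with open("manifest.json") as fp:
    manifest = json.load(fp)

for node_name, node in manifest.get("nodes", {}).items():
    missing = [key for key in REQUIRED if key not in node]
    if not any(key in node for key in NAME_KEYS):
        missing.append("alias/name")
    if missing:
        print(f"{node_name}: missing required keys {missing}")
    absent = [key for key in CONDITIONAL if key not in node]
    if absent:
        print(f"{node_name}: feature-specific keys not present {absent}")
```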

{% note %}
@@ -24,7 +24,7 @@ We can create a workflow that will obtain the dbt information from the dbt files
### 1. Create the workflow configuration

Configure the dbt.yaml file according keeping only one of the required source (local, http, gcs, s3).
-The dbt files should be present on the source mentioned and should have the necssary permissions to be able to access the files.
+The dbt files should be present on the source mentioned and should have the necessary permissions to be able to access the files.

Enter the name of your database service from OpenMetadata in the `serviceName` key in the yaml

@@ -7,7 +7,7 @@ slug: /connectors/ingestion/workflows/dbt/ingest-dbt-lineage

Ingest the lineage information from dbt `manifest.json` file into OpenMetadata.

-OpenMetadata exctracts the lineage information from the `depends_on` and `compiled_query/compiled_code` keys from the manifest file.
+OpenMetadata extracts the lineage information from the `depends_on` and `compiled_query/compiled_code` keys from the manifest file.

### 1. Lineage information from dbt "depends_on" key
Openmetadata fetches the lineage information from the `manifest.json` file. Below is a sample `manifest.json` file node containing lineage information under `node_name->depends_on->nodes`.
@@ -135,7 +135,7 @@ Once you have picked the `Interval Type` you will need to define the configurati
- `YEAR`

`COLUMN-VALUE`
-- `Value`: a list of value to use for the partitionning logic
+- `Value`: a list of value to use for the partitioning logic

`INTEGER-RANGE`
- `Start Range`: the start of the range (inclusive)
@@ -376,7 +376,7 @@ Profiling all the tables in your data platform might not be the most optimized a

When setting up a profiler workflow, you have the possibility to filter out/in certain databases, schemas, or tables. Using this feature will greatly help you narrow down which table you want to profile.

-### 2. Sampling and Partitionning your Tables
+### 2. Sampling and Partitioning your Tables
On a table asset, you have the possibility to add a sample percentage/rows and a partitioning logic. Doing so will significantly reduce the amount of data scanned and the computing power required to perform the different operations.

For sampling, you can set a sampling percentage at the workflow level.
@@ -21,7 +21,7 @@ To deploy OpenMetadata, check the Deployment guides.
To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with
custom Airflow plugins to handle the workflow deployment.

-**Note:** For metadata ingestion, kindly make sure add alteast `data` scopes to the clientId provided.
+**Note:** For metadata ingestion, kindly make sure add atleast `data` scopes to the clientId provided.
Question related to scopes, click [here](https://developer.domo.com/portal/1845fc11bbe5d-api-authentication).

### Python Requirements
@@ -21,7 +21,7 @@ To deploy OpenMetadata, check the Deployment guides.
To run the Ingestion via the UI you'll need to use the OpenMetadata Ingestion Container, which comes shipped with
custom Airflow plugins to handle the workflow deployment.

-**Note:** For metadata ingestion, kindly make sure add alteast `data` scopes to the clientId provided.
+**Note:** For metadata ingestion, kindly make sure add atleast `data` scopes to the clientId provided.
Question related to scopes, click [here](https://developer.domo.com/portal/1845fc11bbe5d-api-authentication).

### Python Requirements
@@ -40,7 +40,7 @@ custom Airflow plugins to handle the workflow deployment.



-**Note:** For metadata ingestion, kindly make sure add alteast `data` scopes to the clientId provided.
+**Note:** For metadata ingestion, kindly make sure add atleast `data` scopes to the clientId provided.
Question related to scopes, click [here](https://developer.domo.com/portal/1845fc11bbe5d-api-authentication).

## Metadata Ingestion
4 changes: 2 additions & 2 deletions openmetadata-docs/content/v1.0.x/deployment/docker/volumes.md
@@ -56,7 +56,7 @@ services:
...
```
## Volumes for ingestion container
-Following are the changes we have to do while mounting the directory for ingestion in OpenMetadata. Here we will maintaing different directory for dag_generated_configs, dags and secrets.
+Following are the changes we have to do while mounting the directory for ingestion in OpenMetadata. Here we will maintaining different directory for dag_generated_configs, dags and secrets.
- Remove the below section from the docker-compose.yml file.
Open the file `docker-compose.yml` downloaded from the Release page [Link](https://github.com/open-metadata/OpenMetadata/releases/download/0.13.0-release/docker-compose.yml) .

@@ -81,7 +81,7 @@ services:
...
```

-Once these changes are done in the docker-compose.yml file It should look simlarly in the below format
+Once these changes are done in the docker-compose.yml file It should look similarly in the below format

```commandline
version: "3.9"
@@ -110,7 +110,7 @@ kubectl create -f nfs-server-deployment.yml
kubectl create -f nfs-cluster-ip-service.yml
```

-We create a CluserIP Service for pods to access NFS within the cluster at a fixed IP/DNS.
+We create a ClusterIP Service for pods to access NFS within the cluster at a fixed IP/DNS.

### Provision NFS backed PV and PVC for Airflow DAGs and Airflow Logs

@@ -20,7 +20,7 @@ It is important to leave the publicKeys configuration to have both Amazon Cognit
3. Important to update the URLs documented in below configuration. The below config reflects a setup where all dependencies are hosted in a single host. Example openmetadata:8585 might not be the same domain you may be using in your installation.
4. OpenMetadata ships default public/private key, These must be changed in your production deployment to avoid any security issues.

-For more details, follow [Enabling JWT Authenticaiton](deployment/security/enable-jwt-tokens)
+For more details, follow [Enabling JWT Authentication](deployment/security/enable-jwt-tokens)

{% /note %}

@@ -24,7 +24,7 @@ It is important to leave the publicKeys configuration to have both Amazon Cognit
3. Important to update the URLs documented in below configuration. The below config reflects a setup where all dependencies are hosted in a single host. Example openmetadata:8585 might not be the same domain you may be using in your installation.
4. OpenMetadata ships default public/private key, These must be changed in your production deployment to avoid any security issues.

-For more details, follow [Enabling JWT Authenticaiton](deployment/security/enable-jwt-tokens)
+For more details, follow [Enabling JWT Authentication](deployment/security/enable-jwt-tokens)

{% /note %}

@@ -24,7 +24,7 @@ It is important to leave the publicKeys configuration to have both Amazon Cognit
3. Important to update the URLs documented in below configuration. The below config reflects a setup where all dependencies are hosted in a single host. Example openmetadata:8585 might not be the same domain you may be using in your installation.
4. OpenMetadata ships default public/private key, These must be changed in your production deployment to avoid any security issues.

-For more details, follow [Enabling JWT Authenticaiton](deployment/security/enable-jwt-tokens)
+For more details, follow [Enabling JWT Authentication](deployment/security/enable-jwt-tokens)

{% /note %}

@@ -29,7 +29,7 @@ It is important to leave the publicKeys configuration to have both Auth0 public
3. Important to update the URLs documented in below configuration. The below config reflects a setup where all dependencies are hosted in a single host. Example openmetadata:8585 might not be the same domain you may be using in your installation.
4. OpenMetadata ships default public/private key, These must be changed in your production deployment to avoid any security issues.

-For more details, follow [Enabling JWT Authenticaiton](deployment/security/enable-jwt-tokens)
+For more details, follow [Enabling JWT Authentication](deployment/security/enable-jwt-tokens)

{% /note %}
