MINOR: docs links fix #17125

Merged · 3 commits · Jul 22, 2024
@@ -22,12 +22,12 @@ We can create a workflow that will obtain the dbt information from the dbt files
## 1. Define the YAML Config

Select the yaml config from one of the below sources:
-- [AWS S3 Buckets](#1.aws-s3-buckets)
-- [Google Cloud Storage Buckets](#2.google-cloud-storage-buckets)
-- [Azure Storage Buckets](#3.azure-storage-buckets)
-- [Local Storage](#4.local-storage)
-- [File Server](#5.file-server)
-- [dbt Cloud](#6.dbt-cloud)
+- [AWS S3 Buckets](#1.-aws-s3-buckets)
+- [Google Cloud Storage Buckets](#2.-google-cloud-storage-buckets)
+- [Azure Storage Buckets](#3.-azure-storage-buckets)
+- [Local Storage](#4.-local-storage)
+- [File Server](#5.-file-server)
+- [dbt Cloud](#6.-dbt-cloud)


The dbt files should be present on the source mentioned and should have the necessary permissions to be able to access the files.
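
The corrected anchors (e.g. `#1.-aws-s3-buckets`) match the slug the docs site generates for a numbered heading such as `## 1. AWS S3 Buckets`: lowercase the text and turn each run of spaces into a hyphen, keeping the trailing period of the number. A minimal sketch of that assumed convention (the helper name is hypothetical):

```python
import re

def heading_to_anchor(heading: str) -> str:
    """Slugify a heading like '1. AWS S3 Buckets' into '1.-aws-s3-buckets'.

    Assumed convention: lowercase everything, collapse each run of
    whitespace into a single hyphen, and keep periods as-is.
    """
    slug = heading.strip().lower()
    slug = re.sub(r"\s+", "-", slug)  # every space run becomes one hyphen
    return slug

print(heading_to_anchor("1. AWS S3 Buckets"))  # -> 1.-aws-s3-buckets
```

Note the period survives slugification here, which is why the old anchors (`#1.aws-s3-buckets`, missing the hyphen after the period) failed to resolve.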
@@ -34,7 +34,7 @@ Check the following docs to run the Ingestion Framework in any orchestrator externally

## Requirements

-Follow the official documentation to generate an API Access Token from [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis)
+Follow the official documentation to generate an API Access Token from [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis#create-an-api-access-token)
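
Once generated, the API Access Token is passed in a `TOKEN` request header when calling Alation's REST APIs. A small sketch of building such a request (the host, token value, and endpoint path below are illustrative, not taken from this PR):

```python
from urllib.request import Request

# Hypothetical values for illustration only
base_url = "https://alation.example.com"
access_token = "<generated-access-token>"

# Alation REST calls authenticate with the API Access Token in a TOKEN header
req = Request(
    f"{base_url}/integration/v1/datasource/",
    headers={"TOKEN": access_token},
)

# urllib normalizes header names, so look it up as "Token"
print(req.get_header("Token"))
```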

## Data Mapping and Assumptions

@@ -23,7 +23,7 @@ Configure and schedule Alation metadata and profiler workflows from the OpenMetadata UI:

## Requirements

-Follow the official documentation to generate an API Access Token from [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis)
+Follow the official documentation to generate an API Access Token from [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis#create-an-api-access-token)

## Data Mapping and Assumptions

@@ -99,7 +99,7 @@ We'll use the user credentials to generate the access token required to authenticate
- password: Password of the user.

2. Access Token Authentication:
-The access token created using the steps mentioned [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis#create-via-ui) can directly be entered. We'll use that directly to authenticate the Alation APIs
+The access token created using the steps mentioned [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis#create-an-api-access-token) can directly be entered. We'll use that directly to authenticate the Alation APIs
- accessToken: Generated access token

{% /codeInfo %}
@@ -95,7 +95,7 @@ src="/images/v1.4/connectors/bigquery/bq-create-service-account-1.png"
alt="Create Service Accounts"
caption="Create Service Accounts" /%}

-Grant a role to the service account which has all the required permissions to ingest BigQuery metadata in OpenMetadata; check out [this](/connectors/database/bigquery/roles) documentation for details on how to create a custom role with the required permissions.
+Grant a role to the service account which has all the required permissions to ingest BigQuery metadata in OpenMetadata.

{% image
src="/images/v1.4/connectors/bigquery/bq-service-account-grant-role.png"
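
For reference, a custom role for metadata ingestion can be created and bound to the service account with `gcloud` roughly as follows. The role ID, project, service-account email, and permission list here are placeholders; the linked docs page is the authoritative source for the exact permissions:

```shell
# Illustrative only: role ID, project, and permission list are placeholders
gcloud iam roles create customBigQueryMetadataRole \
  --project=my-project \
  --permissions=bigquery.datasets.get,bigquery.tables.get,bigquery.tables.list,bigquery.jobs.create

# Bind the custom role to the ingestion service account
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:ingest-sa@my-project.iam.gserviceaccount.com" \
  --role="projects/my-project/roles/customBigQueryMetadataRole"
```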
@@ -71,7 +71,7 @@ If the user has `External Tables`, please attach relevant permissions needed for
icon="manage_accounts"
title="Create Custom GCP Role"
description="Checkout this documentation on how to create a custom role and assign it to the service account."
-link="/connectors/database/bigquery/roles"
+link="/connectors/database/bigquery/create-credentials"
/ %}
{% /tilesContainer %}

@@ -22,12 +22,12 @@ We can create a workflow that will obtain the dbt information from the dbt files
## 1. Define the YAML Config

Select the yaml config from one of the below sources:
-- [AWS S3 Buckets](#1.aws-s3-buckets)
-- [Google Cloud Storage Buckets](#2.google-cloud-storage-buckets)
-- [Azure Storage Buckets](#3.azure-storage-buckets)
-- [Local Storage](#4.local-storage)
-- [File Server](#5.file-server)
-- [dbt Cloud](#6.dbt-cloud)
+- [AWS S3 Buckets](#1.-aws-s3-buckets)
+- [Google Cloud Storage Buckets](#2.-google-cloud-storage-buckets)
+- [Azure Storage Buckets](#3.-azure-storage-buckets)
+- [Local Storage](#4.-local-storage)
+- [File Server](#5.-file-server)
+- [dbt Cloud](#6.-dbt-cloud)


The dbt files should be present on the source mentioned and should have the necessary permissions to be able to access the files.
@@ -26,7 +26,7 @@ Configure and schedule Alation metadata workflow from the OpenMetadata UI:

## Requirements

-Follow the official documentation to generate an API Access Token from [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis)
+Follow the official documentation to generate an API Access Token from [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis#create-an-api-access-token)

## Data Mapping and Assumptions

@@ -84,7 +84,7 @@ We'll use the user credentials to generate the access token required to authenticate
- **password**: Password of the user.

2. **Access Token Authentication**:
-The access token created using the steps mentioned [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis#create-via-ui) can directly be entered. We'll use that directly to authenticate the Alation APIs
+The access token created using the steps mentioned [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis#create-an-api-access-token) can directly be entered. We'll use that directly to authenticate the Alation APIs
- **accessToken**: Generated access token

#### For Alation backend database Connection:
@@ -23,7 +23,7 @@ Configure and schedule Alation metadata and profiler workflows from the OpenMetadata UI:

## Requirements

-Follow the official documentation to generate an API Access Token from [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis)
+Follow the official documentation to generate an API Access Token from [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis#create-an-api-access-token)

## Data Mapping and Assumptions

@@ -99,7 +99,7 @@ We'll use the user credentials to generate the access token required to authenticate
- password: Password of the user.

2. Access Token Authentication:
-The access token created using the steps mentioned [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis#create-via-ui) can directly be entered. We'll use that directly to authenticate the Alation APIs
+The access token created using the steps mentioned [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis#create-an-api-access-token) can directly be entered. We'll use that directly to authenticate the Alation APIs
- accessToken: Generated access token

{% /codeInfo %}
@@ -95,7 +95,7 @@ src="/images/v1.5/connectors/bigquery/bq-create-service-account-1.png"
alt="Create Service Accounts"
caption="Create Service Accounts" /%}

-Grant a role to the service account which has all the required permissions to ingest BigQuery metadata in OpenMetadata; check out [this](/connectors/database/bigquery/roles) documentation for details on how to create a custom role with the required permissions.
+Grant a role to the service account which has all the required permissions to ingest BigQuery metadata in OpenMetadata.

{% image
src="/images/v1.5/connectors/bigquery/bq-service-account-grant-role.png"
@@ -38,7 +38,7 @@ You need to create a service account in order to ingest metadata from BigQuery
icon="manage_accounts"
title="Create Custom GCP Role"
description="Check out this documentation on how to create a custom role and assign it to the service account."
-link="/connectors/database/bigquery/roles"
+link="/connectors/database/bigquery/create-credentials"
/ %}
{% /tilesContainer %}

@@ -71,7 +71,7 @@ If the user has `External Tables`, please attach relevant permissions needed for
icon="manage_accounts"
title="Create Custom GCP Role"
description="Checkout this documentation on how to create a custom role and assign it to the service account."
-link="/connectors/database/bigquery/roles"
+link="/connectors/database/bigquery/create-credentials"
/ %}
{% /tilesContainer %}

@@ -22,12 +22,12 @@ We can create a workflow that will obtain the dbt information from the dbt files
## 1. Define the YAML Config

Select the yaml config from one of the below sources:
-- [AWS S3 Buckets](#1.aws-s3-buckets)
-- [Google Cloud Storage Buckets](#2.google-cloud-storage-buckets)
-- [Azure Storage Buckets](#3.azure-storage-buckets)
-- [Local Storage](#4.local-storage)
-- [File Server](#5.file-server)
-- [dbt Cloud](#6.dbt-cloud)
+- [AWS S3 Buckets](#1.-aws-s3-buckets)
+- [Google Cloud Storage Buckets](#2.-google-cloud-storage-buckets)
+- [Azure Storage Buckets](#3.-azure-storage-buckets)
+- [Local Storage](#4.-local-storage)
+- [File Server](#5.-file-server)
+- [dbt Cloud](#6.-dbt-cloud)


The dbt files should be present on the source mentioned and should have the necessary permissions to be able to access the files.
@@ -26,7 +26,7 @@ Configure and schedule Alation metadata workflow from the OpenMetadata UI:

## Requirements

-Follow the official documentation to generate an API Access Token from [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis)
+Follow the official documentation to generate an API Access Token from [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis#create-an-api-access-token)

## Data Mapping and Assumptions

@@ -84,7 +84,7 @@ We'll use the user credentials to generate the access token required to authenticate
- **password**: Password of the user.

2. **Access Token Authentication**:
-The access token created using the steps mentioned [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis#create-via-ui) can directly be entered. We'll use that directly to authenticate the Alation APIs
+The access token created using the steps mentioned [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis#create-an-api-access-token) can directly be entered. We'll use that directly to authenticate the Alation APIs
- **accessToken**: Generated access token

#### For Alation backend database Connection:
@@ -23,7 +23,7 @@ Configure and schedule Alation metadata and profiler workflows from the OpenMetadata UI:

## Requirements

-Follow the official documentation to generate an API Access Token from [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis)
+Follow the official documentation to generate an API Access Token from [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis#create-an-api-access-token)

## Data Mapping and Assumptions

@@ -99,7 +99,7 @@ We'll use the user credentials to generate the access token required to authenticate
- password: Password of the user.

2. Access Token Authentication:
-The access token created using the steps mentioned [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis#create-via-ui) can directly be entered. We'll use that directly to authenticate the Alation APIs
+The access token created using the steps mentioned [here](https://developer.alation.com/dev/docs/authentication-into-alation-apis#create-an-api-access-token) can directly be entered. We'll use that directly to authenticate the Alation APIs
- accessToken: Generated access token

{% /codeInfo %}