Expand ENV abbreviation in the docs (#1846)
burnash authored Sep 30, 2024
1 parent 93cd5a6 commit 854905f
Showing 5 changed files with 35 additions and 35 deletions.
@@ -145,7 +145,7 @@ There are multiple ways to pass the custom destination function to the `dlt` pipeline
)
)
```
- Via a fully qualified string to function location (can be used from `config.toml` or ENV vars). The destination function should be located in another file.
- Via a fully qualified string to function location (this can be set in `config.toml` or through environment variables). The destination function should be located in another file.
```py
# File my_pipeline.py
# (remaining lines collapsed in the diff view)
```
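The hunk above documents passing the destination as a fully qualified string to the function's location. The resolution mechanics behind such strings can be sketched with the standard library alone (this illustrates the general pattern only, not dlt's internal code):

```python
import importlib

def resolve_qualified_name(qualified_name: str):
    """Resolve a 'package.module.attribute' string to the object it names."""
    module_name, _, attribute = qualified_name.rpartition(".")
    module = importlib.import_module(module_name)
    return getattr(module, attribute)

# Demonstrate with a stdlib function standing in for a destination callable.
dumps = resolve_qualified_name("json.dumps")
print(dumps({"resolved": True}))  # prints {"resolved": true}
```

This is why the destination function must live in an importable module ("another file"): the string is only useful if the module on its left-hand side can be imported by name.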
@@ -16,7 +16,7 @@ info = pipeline.run(some_source(), loader_file_format="{props.file_type}")
loader_file_format="{props.file_type}"
</pre>

3. You can set the `loader_file_format` via ENV variable:
3. You can set the `loader_file_format` via environment variable:

<pre language="sh">
export NORMALIZE__LOADER_FILE_FORMAT="{props.file_type}"
</pre>
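The shell export above is an ordinary environment variable, so the same setting can be applied from Python before the pipeline runs, which is handy in notebooks or orchestration code. The concrete value below is only an example; use any loader file format your destination supports:

```python
import os

# Equivalent to `export NORMALIZE__LOADER_FILE_FORMAT=...` in the shell.
# dlt reads this variable during normalization, so set it before pipeline.run().
os.environ["NORMALIZE__LOADER_FILE_FORMAT"] = "parquet"  # example value

print(os.environ["NORMALIZE__LOADER_FILE_FORMAT"])  # prints parquet
```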
30 changes: 15 additions & 15 deletions docs/website/docs/general-usage/credentials/setup.md
@@ -45,8 +45,8 @@ The most specific possible path for **sources** looks like:
groupId="config-provider-type"
defaultValue="toml"
values={[
{"label": "Toml config provider", "value": "toml"},
{"label": "ENV variables", "value": "env"},
{"label": "TOML config provider", "value": "toml"},
{"label": "Environment variables", "value": "env"},
{"label": "In the code", "value": "code"},
]}>
<TabItem value="toml">
@@ -78,8 +78,8 @@ The most specific possible path for **destinations** looks like:
groupId="config-provider-type"
defaultValue="toml"
values={[
{"label": "Toml config provider", "value": "toml"},
{"label": "ENV variables", "value": "env"},
{"label": "TOML config provider", "value": "toml"},
{"label": "Environment variables", "value": "env"},
{"label": "In the code", "value": "code"},
]}>
<TabItem value="toml">
@@ -285,8 +285,8 @@ Let's assume we have a [notion](../../dlt-ecosystem/verified-sources/notion) source.
groupId="config-provider-type"
defaultValue="toml"
values={[
{"label": "Toml config provider", "value": "toml"},
{"label": "ENV variables", "value": "env"},
{"label": "TOML config provider", "value": "toml"},
{"label": "Environment variables", "value": "env"},
{"label": "In the code", "value": "code"},
]}>

@@ -319,7 +319,7 @@ aws_secret_access_key = "1234567890_access_key" # copy the secret access key here
<TabItem value="env">

```sh
# ENV vars are set up the same way both for configs and secrets
# Environment variables are set up the same way both for configs and secrets
export RUNTIME__LOG_LEVEL="INFO"
export DESTINATION__FILESYSTEM__BUCKET_URL="s3://[your_bucket_name]"
export NORMALIZE__DATA_WRITER__DISABLE_COMPRESSION="true"
```

@@ -376,8 +376,8 @@ Let's assume we use the `bigquery` destination and the `google_sheets` source.
groupId="config-provider-type"
defaultValue="toml"
values={[
{"label": "Toml config provider", "value": "toml"},
{"label": "ENV variables", "value": "env"},
{"label": "TOML config provider", "value": "toml"},
{"label": "Environment variables", "value": "env"},
{"label": "In the code", "value": "code"},
]}>

@@ -424,8 +424,8 @@ os.environ["CREDENTIALS__PROJECT_ID"] = os.environ.get("GOOGLE_PROJECT_ID")
groupId="config-provider-type"
defaultValue="toml"
values={[
{"label": "Toml config provider", "value": "toml"},
{"label": "ENV variables", "value": "env"},
{"label": "TOML config provider", "value": "toml"},
{"label": "Environment variables", "value": "env"},
{"label": "In the code", "value": "code"},
]}>

@@ -506,8 +506,8 @@ Let's assume we have several different Google sources and destinations.
groupId="config-provider-type"
defaultValue="toml"
values={[
{"label": "Toml config provider", "value": "toml"},
{"label": "ENV variables", "value": "env"},
{"label": "TOML config provider", "value": "toml"},
{"label": "Environment variables", "value": "env"},
{"label": "In the code", "value": "code"},
]}>

@@ -590,8 +590,8 @@ Let's assume we have several sources of the same type. How can we separate them?
groupId="config-provider-type"
defaultValue="toml"
values={[
{"label": "Toml config provider", "value": "toml"},
{"label": "ENV variables", "value": "env"},
{"label": "TOML config provider", "value": "toml"},
{"label": "Environment variables", "value": "env"},
{"label": "In the code", "value": "code"},
]}>

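The tabs in this file repeatedly contrast the TOML config provider with environment variables. The mapping between the two is mechanical: a dotted config path becomes an upper-case name with sections joined by double underscores, as in `DESTINATION__FILESYSTEM__BUCKET_URL` earlier in this diff. A small illustrative sketch of that convention (not dlt's actual implementation):

```python
def to_env_key(config_path: str) -> str:
    """Map a dotted config path to a dlt-style environment variable name."""
    return "__".join(part.upper() for part in config_path.split("."))

print(to_env_key("destination.filesystem.bucket_url"))
# prints DESTINATION__FILESYSTEM__BUCKET_URL
```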
4 changes: 2 additions & 2 deletions docs/website/docs/tutorial/filesystem.md
@@ -112,8 +112,8 @@ Let's specify the bucket URL and credentials.
groupId="config-provider-type"
defaultValue="toml"
values={[
{"label": "Toml config provider", "value": "toml"},
{"label": "ENV variables", "value": "env"},
{"label": "TOML config provider", "value": "toml"},
{"label": "Environment variables", "value": "env"},
{"label": "In the code", "value": "code"},
]}>

@@ -184,17 +184,17 @@ For a complete picture of Dagster's integration with dlt, please refer to their documentation.
### Frequently Asked Questions
- **Can I remove the generated `.dlt` folder with `secrets.toml` and `config.toml` files?**
Yes. Since dlt is compatible with ENV variables, you can use this for secrets required by both Dagster and dlt.
Yes. Since dlt is compatible with environment variables, you can use this for secrets required by both Dagster and dlt.
- **I'm working with several sources – how can I best group these assets?**

To effectively group assets in Dagster when working with multiple sources, use the `group_name` parameter in your `@dlt_assets` decorator. This helps organize and visualize assets related to a particular source or theme in the Dagster UI. Here’s a simplified example:

```py
import dlt
from dagster_embedded_elt.dlt import dlt_assets
from dlt_sources.google_analytics import google_analytics

# Define assets for the first Google Analytics source
@dlt_assets(
    dlt_source=google_analytics(),
    # @@ -207,7 +207,7 @@ (collapsed lines)
)
def google_analytics_assets_1(context, dlt):
    yield from dlt.run(context=context)

# Define assets for the second Google Analytics source
@dlt_assets(
    dlt_source=google_analytics(),
    # @@ -222,18 +222,18 @@ (collapsed lines, including the closing
    # parenthesis and the second asset function's definition)
    yield from dlt.run(context=context)
```



- **How can I use `bigquery_adapter` with `@dlt_assets` in Dagster for partitioned tables?**
To use `bigquery_adapter` with `@dlt_assets` in Dagster for partitioned tables, modify your resource setup to include `bigquery_adapter` with the partition parameter. Here's a quick example:
```py
import dlt
from google.analytics import BetaAnalyticsDataClient
from dlt.destinations.adapters import bigquery_adapter
from dagster import dlt_asset

@dlt_asset
def google_analytics_asset(context):
    # Configuration (replace with your actual values or parameters)
    # @@ -244,20 +244,20 @@ (collapsed lines)
    start_date = "2024-01-01"
    rows_per_page = 1000
    credentials = your_credentials

    # Initialize Google Analytics client
    client = BetaAnalyticsDataClient(credentials=credentials.to_native_credentials())

    # Fetch metadata
    metadata = get_metadata(client=client, property_id=property_id)
    resource_list = [metadata | metrics_table, metadata | dimensions_table]

    # Configure and add resources to the list
    for query in queries:
        dimensions = query["dimensions"]
        if "date" not in dimensions:
            dimensions.append("date")

        resource_name = query["resource_name"]
        resource_list.append(
            bigquery_adapter(
                # @@ -274,7 +274,7 @@ (collapsed lines)
                partition="date"
            )
        )

    return resource_list
```