feature: add example links #258

Merged 1 commit on Jul 28, 2023

24 changes: 22 additions & 2 deletions packages/doc/docs/extensions/dbt.mdx
@@ -34,9 +34,25 @@ We need to install an additional package to integrate with dbt:

:::

## Using models in VulcanSQL
3. Set up `profiles.yaml` (if you are using DuckDB as the data source for your dbt project)

Using models of dbt is extremely easy, you only need to use the following syntax.
Add `persistent-path` to the `profiles.yaml` in the root of your VulcanSQL project like the following:

```yaml
- name: duckdb
  type: duckdb
  connection:
    persistent-path: [duckdb db file path of your dbt project]
  allow: "*"
```
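
For instance, assuming your dbt project writes its DuckDB file to `./dbt/jaffle_shop.duckdb` (a hypothetical path), the profile might look like this:

```yaml
- name: duckdb
  type: duckdb
  connection:
    # Hypothetical path; point this at the .duckdb file produced by your dbt project.
    persistent-path: ./dbt/jaffle_shop.duckdb
  allow: "*"
```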

## Setting up your dbt project

Please refer to the [dbt Quickstarts tutorials](https://docs.getdbt.com/quickstarts).

## Using the dbt extension

Using dbt models is extremely easy; you only need the following syntax in your VulcanSQL project:

```sql
{% dbt "model.<project-name>.<model-name>" %}
```

@@ -47,3 +63,7 @@ For example, to query all data from model `my_first_dbt_model` in the project `demo`:
```sql
select * from {% dbt "model.demo.my_first_dbt_model" %}
```
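
As a further sketch (the `status` column is hypothetical), the dbt tag behaves like a regular table reference, so the usual SQL clauses apply:

```sql
-- "status" is a hypothetical column on my_first_dbt_model.
SELECT status, count(*) AS total
FROM {% dbt "model.demo.my_first_dbt_model" %}
WHERE status IS NOT NULL
GROUP BY status
```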

## Examples

You can check out this [dbt-jaffle-shop](https://github.com/Canner/vulcan-sql-examples/tree/main/dbt-jaffle-shop) example for further details!

@@ -2,7 +2,7 @@

The [Table Question Answering](https://huggingface.co/docs/api-inference/detailed_parameters#table-question-answering-task) is one of the Natural Language Processing tasks supported by Hugging Face.

Using the `huggingface_table_question_answering` filter.
## Using the `huggingface_table_question_answering` filter

The result from `huggingface_table_question_answering` is converted to a JSON string. You can parse the JSON string and use the result yourself.
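
As a rough sketch (the `products` request block and the question are hypothetical), the filter can be applied to the result of another query:

```sql
-- Hypothetical request block; any result exposed via .value() should work.
{% req products %}
SELECT product_name, price FROM products LIMIT 100
{% endreq %}

-- The filter sends the tabular result to the Hugging Face Inference API
-- and returns the answer as a JSON string.
SELECT {{ products.value() | huggingface_table_question_answering(query="What is the highest price?") }} AS answer
```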

@@ -63,7 +63,7 @@ SELECT {{ products.value() | huggingface_table_question_answering(query=question
]
```

### Arguments
## Arguments

Please check [Table Question Answering](https://huggingface.co/docs/api-inference/detailed_parameters#table-question-answering-task) for further information.

@@ -73,3 +73,8 @@ Please check [Table Question Answering](https://huggingface.co/docs/api-inferenc
| model | N | google/tapas-base-finetuned-wtq | The model id of a pretrained model hosted inside a model repo on huggingface.co. See: https://huggingface.co/models?pipeline_tag=table-question-answering |
| use_cache | N | true | There is a cache layer on the inference API to speed up requests we have already seen |
| wait_for_model | N | false | If the model is not ready, wait for it instead of receiving 503. It limits the number of requests required to get your inference done |
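
A minimal sketch of passing the optional arguments (the `products` request block is hypothetical; argument names follow the table above):

```sql
SELECT {{ products.value() | huggingface_table_question_answering(
  query="How many products are there?",
  model="google/tapas-base-finetuned-wtq",
  wait_for_model=true
) }} AS answer
```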


## Examples

You can check out this [table-question-answering](https://github.com/Canner/vulcan-sql-examples/tree/main/huggingface/table-question-answering) example for further details!