docs: add temp_dataset_name to community.BigQueryVectorStore docs (#735)
cmenon12 authored Feb 17, 2025
1 parent 22a5e6a commit 6091071
Showing 1 changed file with 5 additions and 0 deletions.
@@ -36,6 +36,9 @@ class BigQueryVectorStore(BaseBigQueryVectorStore):
`batch_search` method.
Optionally, this class can leverage a Vertex AI Feature Store for online serving
through the `to_vertex_fs_vector_store` method.
Note that the `bigquery.datasets.create` permission is required even if the
dataset already exists. This can be avoided by specifying `temp_dataset_name` as
the name of an existing dataset.
Attributes:
embedding: Embedding model for generating and comparing embeddings.
@@ -46,6 +49,8 @@ class BigQueryVectorStore(BaseBigQueryVectorStore):
content_field: Name of the column storing document content (default: "content").
embedding_field: Name of the column storing text embeddings (default:
"embedding").
temp_dataset_name: Name of the BigQuery dataset to be used to upload temporary
BQ tables. If None, will default to "{dataset_name}_temp".
doc_id_field: Name of the column storing document IDs (default: "doc_id").
credentials: Optional Google Cloud credentials object.
embedding_dimension: Dimension of the embedding vectors (inferred if not
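For context, a rough usage sketch of the documented behaviour follows. It assumes the usual BigQueryVectorStore constructor arguments (project_id, dataset_name, table_name, location) and a Vertex AI embedding model; the project, dataset, and table names are placeholders, not values from this commit.

    from langchain_google_community import BigQueryVectorStore
    from langchain_google_vertexai import VertexAIEmbeddings

    # Placeholder embedding model; any LangChain Embeddings implementation works.
    embedding = VertexAIEmbeddings(model_name="text-embedding-005")

    store = BigQueryVectorStore(
        project_id="my-project",
        dataset_name="my_dataset",
        table_name="my_table",
        location="us-central1",
        embedding=embedding,
        # Reuse an existing dataset for temporary BQ tables. Without this, the
        # default "my_dataset_temp" dataset is used, which requires the
        # bigquery.datasets.create permission even if it already exists.
        temp_dataset_name="existing_temp_dataset",
    )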
