Fix typo
pankajastro committed Dec 11, 2023
1 parent dd9822e commit 7eb8158
Showing 4 changed files with 4 additions and 4 deletions.
2 changes: 1 addition & 1 deletion python-sdk/docs/astro/sql/operators/dataframe.rst
@@ -7,7 +7,7 @@
When to use the ``dataframe`` operator
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

-The ``dataframe`` operator allows you to run Python transformations in Airflow. Behind the scenes, the ``dataframe`` function automatically coverts the source SQL table into a Pandas dataframe, and makes any dataframes resulting from the transformation available to downstream ``astro.sql`` functions. This means you can seamlessly transition between Python and SQL for data transformations without writing any code to explicitly do so. To use the ``dataframe`` operator, you simply provide a Python function that takes a dataframe as one of its inputs, and specify a ``Table`` object as the input SQL table. If you want the resulting dataframe to be converted back to SQL, you can specify an ``output_table`` object.
+The ``dataframe`` operator allows you to run Python transformations in Airflow. Behind the scenes, the ``dataframe`` function automatically converts the source SQL table into a Pandas dataframe, and makes any dataframes resulting from the transformation available to downstream ``astro.sql`` functions. This means you can seamlessly transition between Python and SQL for data transformations without writing any code to explicitly do so. To use the ``dataframe`` operator, you simply provide a Python function that takes a dataframe as one of its inputs, and specify a ``Table`` object as the input SQL table. If you want the resulting dataframe to be converted back to SQL, you can specify an ``output_table`` object.

There are two main uses for the ``dataframe`` operator.

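The docs text in this hunk describes a round trip: a source SQL table is converted to a Pandas dataframe, transformed in Python, and optionally written back to SQL. A minimal sketch of that round trip using only ``sqlite3`` and ``pandas`` (this is an illustration of the mechanism, not the astro-sdk API; the table and column names are invented):

```python
import sqlite3

import pandas as pd

# An in-memory SQL table standing in for the source ``Table`` object.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])

# Step 1: the operator reads the SQL table into a Pandas dataframe.
df = pd.read_sql("SELECT * FROM orders", conn)

# Step 2: an arbitrary Python transformation on the dataframe.
df["amount_with_tax"] = df["amount"] * 1.1

# Step 3: the ``output_table`` step -- write the result back to SQL.
df.to_sql("orders_with_tax", conn, index=False)
result = conn.execute("SELECT COUNT(*) FROM orders_with_tax").fetchone()[0]
```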
2 changes: 1 addition & 1 deletion python-sdk/example_dags/example_load_file.py
@@ -3,7 +3,7 @@
- Install dependencies for Astro Python SDK with Google, refer to README.md
- You can either specify a service account key file and set `GOOGLE_APPLICATION_CREDENTIALS`
with the file path to the service account.
-- In the connection we need to specfiy the scopes.
+- In the connection we need to specify the scopes.
Connection variable is ``extra__google_cloud_default__scope``
or in Airflow Connections UI ``Scopes (comma separated)``
For ex:- https://www.googleapis.com/auth/drive.readonly
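The docstring above says the scope is set in the connection extra ``extra__google_cloud_default__scope``. Airflow can also pick connections up from ``AIRFLOW_CONN_<CONN_ID>`` environment variables; a hedged sketch of what that might look like (the URI shape and query-parameter encoding of the extra are assumptions, not taken from this repository):

```shell
# Hedged example: define the google_cloud_default connection via an env var,
# passing the scope from the docstring as a URL-encoded extra field.
export AIRFLOW_CONN_GOOGLE_CLOUD_DEFAULT='google-cloud-platform://?extra__google_cloud_default__scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdrive.readonly'
```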
2 changes: 1 addition & 1 deletion python-sdk/src/astro/databases/base.py
@@ -844,7 +844,7 @@ def row_count(self, table: BaseTable):

def parameterize_variable(self, variable: str):
"""
-While most databases use sqlalchemy, we want to open up how we paramaterize variables for databases
+While most databases use sqlalchemy, we want to open up how we parameterize variables for databases
that a) do not use sqlalchemy and b) have different parameterization schemes (namely delta).
:param variable: The variable to parameterize.
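The docstring in this hunk motivates a per-database hook for variable parameterization. A toy sketch of that pattern (the class names echo the files in this commit, but both placeholder schemes below are purely illustrative, not the SDK's actual values):

```python
class BaseDatabase:
    """Default behaviour: a sqlalchemy-style named bind parameter."""

    def parameterize_variable(self, variable: str) -> str:
        return ":" + variable


class DeltaDatabase(BaseDatabase):
    """Delta does not go through sqlalchemy, so it overrides the hook.

    The ``param.`` prefix here is an invented example of "a different
    parameterization scheme", not the real Delta syntax.
    """

    def parameterize_variable(self, variable: str) -> str:
        return "param." + variable
```

Callers build SQL against whichever database object they hold, and the right placeholder style falls out of dynamic dispatch.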
2 changes: 1 addition & 1 deletion python-sdk/src/astro/databases/snowflake.py
@@ -1106,7 +1106,7 @@ def is_valid_snow_identifier(name: str) -> bool:
The following method ensures that a string follows the expected identifier syntax.
.. seealso::
-`Snowflake official documentation on indentifiers syntax
+`Snowflake official documentation on identifiers syntax
<https://docs.snowflake.com/en/sql-reference/identifiers-syntax.html>`_
"""
