
Feat: Support Snowflake's Time Travel #414

Open
wants to merge 1 commit into base: main

Conversation

@yohannj yohannj commented Oct 7, 2024

Summary

Update sql_processor to allow appending "custom" extra clauses at the end of the CREATE TABLE statement.
Update Snowflake processor to add an extra clause DATA_RETENTION_TIME_IN_DAYS to configure Time Travel.


This is a follow-up to a conversation on Slack to reach feature parity with Airbyte (PR).
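
A rough, self-contained sketch of the mechanism, for illustration only: the standalone functions below are stand-ins for SqlConfig.get_create_table_extra_clauses and the statement assembly inside _create_table in airbyte/shared/sql_processor.py, with the schema name taken from the unit tests.

```python
def get_create_table_extra_clauses(data_retention_time_in_days: int | None) -> list[str]:
    # Emit the clause only when a retention time is configured (Snowflake override).
    if data_retention_time_in_days is None:
        return []
    return [f"DATA_RETENTION_TIME_IN_DAYS = {data_retention_time_in_days}"]


def build_create_table(table_name: str, column_definition_str: str, extra_clauses: list[str]) -> str:
    # Mirrors how _create_table appends the joined extra clauses after the column list.
    extra = "\n".join(extra_clauses)
    return (
        f'CREATE TABLE airbyte_raw."{table_name}" (\n'
        f"    {column_definition_str}\n"
        f")\n"
        f"{extra}"
    )


print(build_create_table("users", "col_name type", get_create_table_extra_clauses(1)))
# CREATE TABLE airbyte_raw."users" (
#     col_name type
# )
# DATA_RETENTION_TIME_IN_DAYS = 1
```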

Summary by CodeRabbit

  • New Features

    • Introduced an optional field for data retention time in Snowflake configurations.
    • Enhanced SQL table creation with the ability to include additional clauses and dynamic primary key definitions.
  • Bug Fixes

    • Improved validation of SQL command generation for data retention settings in Snowflake.
  • Tests

    • Added unit tests to verify data retention time handling in Snowflake table creation.

assert actual_cmd == expected_cmd


def _build_mocked_snowflake_processor(
yohannj (Author) commented:

Built it there for convenience and for code clarity. It removes some boilerplate from what the tests are actually doing.

@yohannj yohannj (Author) commented Oct 7, 2024

I didn't see the need to write integration tests on top of unit tests.
Let me know if you prefer to test this differently.

coderabbitai bot commented Oct 7, 2024

📝 Walkthrough

The changes introduce enhancements to the SnowflakeConfig and SqlConfig classes by adding a new optional field for data retention and methods to generate additional SQL clauses for table creation. The SnowflakeSqlProcessor class is tested to ensure correct SQL command generation based on these configurations. Unit tests validate both scenarios: with and without specified data retention time.

Changes

  • airbyte/_processors/sql/snowflake.py: Added field data_retention_time_in_days: int | None = None to SnowflakeConfig.
  • airbyte/shared/sql_processor.py: Added method get_create_table_extra_clauses(self) -> list[str] in SqlConfig. Updated method signature: _create_table(self, table_name: str, column_definition_str: str, primary_keys: list[str] | None = None).
  • tests/unit_tests/test_processors.py: Added functions test_snowflake_cache_config_data_retention_time_in_days, test_snowflake_cache_config_no_data_retention_time_in_days, and _build_mocked_snowflake_processor.

Possibly related PRs

  • Chore: Bump to Sqlalchemy 2.0 #396: The changes in airbyte/_processors/sql/snowflake.py regarding SQL command construction and execution are related to the main PR, as both involve modifications to the SnowflakeConfig class and its methods.

Wdyt?


@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

🧹 Outside diff range and nitpick comments (6)
tests/unit_tests/test_processors.py (4)

13-31: How about improving test isolation and SQL comparison?

The test looks good overall! A couple of suggestions to make it even better:

  1. Instead of using a global variable for actual_cmd, we could use a local variable and return it from the _execute_sql function. This would improve test isolation.

  2. The indentation in the expected_cmd string might cause comparison issues. We could use a heredoc-style string or textwrap.dedent() to make it cleaner.

What do you think about these changes? Here's a possible refactor:

import textwrap

def test_snowflake_cache_config_data_retention_time_in_days(
    mocker: pytest_mock.MockFixture,
):
    expected_cmd = textwrap.dedent("""
        CREATE TABLE airbyte_raw."table_name" (
            col_name type
        )
        DATA_RETENTION_TIME_IN_DAYS = 1
        """).strip()

    actual_cmd = None
    def _execute_sql(cmd):
        nonlocal actual_cmd
        actual_cmd = cmd
        
    mocker.patch.object(SnowflakeSqlProcessor, "_execute_sql", side_effect=_execute_sql)
    config = _build_mocked_snowflake_processor(mocker, data_retention_time_in_days=1)
    config._create_table(table_name="table_name", column_definition_str="col_name type")

    assert actual_cmd.strip() == expected_cmd

WDYT? This approach might make the test more robust and easier to maintain.


34-51: Shall we apply similar improvements here and address the extra newline?

Great to see a test for the case without data retention time! A couple of suggestions:

  1. We could apply the same improvements suggested for the previous test (using a local variable instead of global, and using textwrap.dedent()).

  2. There's an extra newline character at the end of the expected_cmd string. We might want to remove it to avoid potential whitespace comparison issues.

Here's a possible refactor:

import textwrap

def test_snowflake_cache_config_no_data_retention_time_in_days(
    mocker: pytest_mock.MockFixture,
):
    expected_cmd = textwrap.dedent("""
        CREATE TABLE airbyte_raw."table_name" (
            col_name type
        )
        """).strip()

    actual_cmd = None
    def _execute_sql(cmd):
        nonlocal actual_cmd
        actual_cmd = cmd
        
    mocker.patch.object(SnowflakeSqlProcessor, "_execute_sql", side_effect=_execute_sql)
    config = _build_mocked_snowflake_processor(mocker)
    config._create_table(table_name="table_name", column_definition_str="col_name type")

    assert actual_cmd.strip() == expected_cmd

What do you think about these changes? They should make the test more consistent with the previous one and avoid potential whitespace issues.


54-75: Looks good! How about adding a quick comment for the mock?

This helper function is well-structured and nicely parameterized. Good job on using SecretString for the password too!

One small suggestion: It might be helpful to add a quick comment explaining why we're mocking the _ensure_schema_exists method. Something like:

# Mock _ensure_schema_exists to isolate the test from actual schema creation
mocker.patch.object(
    SnowflakeSqlProcessor, "_ensure_schema_exists", return_value=None
)

What do you think? This could help future developers understand the test setup more quickly.


1-75: Great job on these tests! How about some additional scenarios?

These unit tests are well-structured and cover the main scenarios for the SnowflakeSqlProcessor's handling of data retention time. The use of a helper function for creating the mocked processor is a nice touch for code reuse.

To make the test suite even more robust, we might consider adding a few more test cases. Some ideas:

  1. Test with a large data retention time (e.g., 365 days)
  2. Test with a zero or negative data retention time (if these should be handled differently)
  3. Test error scenarios, like invalid data types for data_retention_time_in_days

What do you think about adding these? They could help catch edge cases and ensure the processor behaves correctly under various conditions.
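
For instance, a hedged sketch of a parameterized variant covering a few retention values (the test name and values are illustrative; it reuses the _build_mocked_snowflake_processor helper defined in this test module and assumes the clause is emitted for any non-None value):

import pytest
import pytest_mock

from airbyte._processors.sql.snowflake import SnowflakeSqlProcessor


@pytest.mark.parametrize("retention_days", [1, 90, 365])
def test_snowflake_cache_config_parametrized_retention(
    mocker: pytest_mock.MockFixture,
    retention_days: int,
):
    # Capture every SQL command the processor tries to execute.
    captured: list[str] = []
    mocker.patch.object(
        SnowflakeSqlProcessor, "_execute_sql", side_effect=captured.append
    )
    processor = _build_mocked_snowflake_processor(
        mocker, data_retention_time_in_days=retention_days
    )
    processor._create_table(
        table_name="table_name", column_definition_str="col_name type"
    )

    assert any(
        f"DATA_RETENTION_TIME_IN_DAYS = {retention_days}" in cmd for cmd in captured
    )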

airbyte/_processors/sql/snowflake.py (1)

47-55: New method looks great! Small suggestion for readability.

The get_create_table_extra_clauses method is well-implemented and aligns perfectly with the PR objective. It correctly adds the DATA_RETENTION_TIME_IN_DAYS clause when needed. Great job!

A tiny suggestion: wdyt about collapsing this into a single conditional expression for a more concise implementation? Something like:

def get_create_table_extra_clauses(self) -> list[str]:
    return [f"DATA_RETENTION_TIME_IN_DAYS = {self.data_retention_time_in_days}"] if self.data_retention_time_in_days is not None else []

This could make the method even more readable at a glance. What do you think? 🤔

airbyte/shared/sql_processor.py (1)

127-129: Would adding unit tests for the new method enhance test coverage?

Adding unit tests for get_create_table_extra_clauses and its integration within _create_table could help ensure that the extra clauses are handled correctly in different scenarios. Do you think it would be beneficial to include such tests? Wdyt?

Also applies to: 660-666

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📥 Commits

Files that changed from the base of the PR and between 53246a3 and 2f9de3e.

📒 Files selected for processing (3)
  • airbyte/_processors/sql/snowflake.py (1 hunks)
  • airbyte/shared/sql_processor.py (2 hunks)
  • tests/unit_tests/test_processors.py (1 hunks)
🧰 Additional context used
🔇 Additional comments (2)
airbyte/_processors/sql/snowflake.py (2)

45-45: LGTM! New field for data retention looks good.

The new data_retention_time_in_days field aligns perfectly with the PR objective to support Snowflake's Time Travel feature. The type and default value seem appropriate for an optional configuration. Nicely done! 👍


Line range hint 1-255: Verify usage of new configuration in SnowflakeSqlProcessor

The changes look good and are well-contained within the SnowflakeConfig class. However, I noticed that the SnowflakeSqlProcessor class doesn't seem to be using the new data_retention_time_in_days configuration yet.

Should we update the SnowflakeSqlProcessor to utilize this new configuration when creating tables? For example, in the _write_files_to_new_table method? What are your thoughts on this? 🤔

To help verify this, we can run a quick search over the package; this will help us confirm if and where we might need to update the SnowflakeSqlProcessor to use the new configuration.
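
A minimal, hedged stand-in for such a search (pure Python so it runs anywhere; the "airbyte" package root and the exclusion of snowflake.py are assumptions for illustration):

from pathlib import Path

# List occurrences of the new field outside snowflake.py to see whether
# SnowflakeSqlProcessor already consumes it.
for path in Path("airbyte").rglob("*.py"):
    if path.name == "snowflake.py":
        continue
    for lineno, line in enumerate(path.read_text(encoding="utf-8").splitlines(), start=1):
        if "data_retention_time_in_days" in line:
            print(f"{path}:{lineno}: {line.strip()}")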

Comment on lines +127 to +129
def get_create_table_extra_clauses(self) -> list[str]:
    """Return a list of clauses to append on CREATE TABLE statements."""
    return []

@coderabbitai coderabbitai bot Oct 7, 2024

⚠️ Potential issue

Should we validate the output of get_create_table_extra_clauses to prevent SQL injection?

Currently, get_create_table_extra_clauses returns a list of clauses that are directly appended to the SQL command. If any of these clauses are built from untrusted input, it might introduce SQL injection vulnerabilities. Would it be a good idea to validate or sanitize these clauses before including them in the CREATE TABLE statement? Wdyt?

@aaronsteers (Contributor) commented:

@coderabbitai - This is not free-form input, but it's good to call out anyway. Do you have any suggestions to sanitize/pre-check the SQL?

coderabbitai bot replied:

@aaronsteers, even though get_create_table_extra_clauses isn't accepting free-form input currently, it might be beneficial to add a validation step to ensure any clauses added are safe, especially if the method is extended in the future. For example, we could validate the clauses against a whitelist of allowed SQL statements:

def get_create_table_extra_clauses(self) -> list[str]:
    clauses = [...]  # Clauses to be added
    allowed_prefixes = ("DATA_RETENTION_TIME_IN_DAYS", "CLUSTER BY", "PARTITION BY")
    for clause in clauses:
        # Only accept clauses that start with a whitelisted keyword.
        if not clause.upper().startswith(allowed_prefixes):
            raise ValueError(f"Disallowed clause detected: {clause}")
    return clauses

This way, we can prevent any unexpected or unsafe SQL clauses from being appended to the CREATE TABLE statement. Would implementing this kind of validation make sense? Wdyt?

@yohannj yohannj (Author) commented Oct 7, 2024

Having a startswith check is not going to be enough to prevent SQL injection. If one of the databases accepts multi-statement queries, one could do DATA_RETENTION_TIME_IN_DAYS=1; INSERT INTO ... (SELECT ...); CREATE TABLE foo (...) or similar things. That way, even though there are multiple statements, the check is bypassed.

My take on this was to only let implementations of sql_processor specify these clauses themselves.

However, in a separate PR we could rework how queries are being sent. Even though this isn't free-form input, we could still improve the security of the codebase against malicious actors.

def _execute_sql(self, sql: str | TextClause | Executable) -> CursorResult

could be rewritten with the signature

def _execute_sql(self, sql: str | TextClause | Executable, parameters: dict[str, Any]) -> CursorResult

and then we update the few queries being sent in the code to use parameters instead of string concatenation (_add_column_to_table, _drop_temp_table, and _append_temp_table_to_final_table aren't safe if I'm not mistaken, especially because there is no normalizer or anything in _fully_qualified).

What do you think?

Actually, that's not gonna work for a CREATE TABLE statement.
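
For illustration, a minimal, hedged sketch of what the proposed parameterized _execute_sql could look like with SQLAlchemy bound parameters; the engine URL, table, and values are placeholders, and, as noted above, identifiers and DDL options such as DATA_RETENTION_TIME_IN_DAYS cannot be bound this way:

from __future__ import annotations

from typing import Any

from sqlalchemy import create_engine, text
from sqlalchemy.engine import CursorResult

# Stand-in engine for the example; PyAirbyte would use its own engine.
engine = create_engine("sqlite:///:memory:")


def _execute_sql(sql: str, parameters: dict[str, Any] | None = None) -> CursorResult:
    # Bound parameters cover literal values in DML; identifiers and DDL clauses
    # still have to be assembled and validated separately.
    with engine.begin() as conn:
        return conn.execute(text(sql), parameters or {})


_execute_sql("CREATE TABLE demo (id INTEGER, name VARCHAR)")
_execute_sql("INSERT INTO demo (id, name) VALUES (:id, :name)", {"id": 1, "name": "alice"})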

Comment on lines +660 to +666
extra_clauses = "\n".join(self.sql_config.get_create_table_extra_clauses())

cmd = f"""
CREATE TABLE {self._fully_qualified(table_name)} (
{column_definition_str}
)
{extra_clauses}

⚠️ Potential issue

Is there a risk of SQL injection when appending extra_clauses to the SQL command?

In the _create_table method, extra_clauses are concatenated directly into the SQL statement. Should we consider using parameterized queries or ensuring that extra_clauses are properly sanitized to prevent potential security issues? What do you think?
