
Remove incorrect deprecation warning in secrets backend #22326

Merged
dstandish merged 1 commit into apache:main on Mar 17, 2022

Conversation

dstandish (Contributor)

When no value was found by `get_conn_value`, the deprecation warning was being triggered even though `get_conn_value` was implemented and simply returned nothing (because there was no value to return).

Now the logic is a little tighter: we only raise the deprecation warning when `get_conn_value` is not implemented, which is what we intended in the first place.
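
To make the intended behaviour concrete, here is a minimal sketch of the check (illustrative only, not the actual Airflow code; the class bodies and warning text are assumptions): the warning fires only when a subclass has not overridden `get_conn_value`, not whenever a lookup returns nothing.

```python
# Sketch of the intended warning logic (illustrative, not the actual Airflow code).
# Idea: warn only when a subclass has NOT overridden get_conn_value, instead of
# whenever no value happens to be returned.
import warnings


class BaseSecretsBackend:
    def get_conn_value(self, conn_id):
        # Base implementation: nothing found.
        return None

    def get_connection(self, conn_id):
        # Warn only if the subclass still relies on the base get_conn_value,
        # i.e. it has not provided its own implementation. Returning None from
        # an implemented get_conn_value is not a reason to warn.
        if type(self).get_conn_value is BaseSecretsBackend.get_conn_value:
            warnings.warn(
                "get_conn_value is not implemented by this backend; "
                "falling back to the deprecated path.",
                PendingDeprecationWarning,
            )
        return self.get_conn_value(conn_id)


class MySecretsBackend(BaseSecretsBackend):
    def get_conn_value(self, conn_id):
        # Implemented, but the requested connection may simply not exist.
        return None


# No warning here: get_conn_value is implemented, it just found no value.
MySecretsBackend().get_connection("missing_conn")
```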
@eladkal eladkal (Contributor) commented Mar 17, 2022

I still get the warnings for the example in #19857 (comment)

[2022-03-17, 08:25:32 UTC] {logging_mixin.py:115} WARNING - /opt/***/***/secrets/base_secrets.py:95 PendingDeprecationWarning: This method is deprecated. Please use `***.secrets.environment_variables.EnvironmentVariablesBackend.get_conn_value`.
[2022-03-17, 08:25:32 UTC] {logging_mixin.py:115} WARNING - /opt/***/***/models/connection.py:420 PendingDeprecationWarning: Method `get_conn_uri` is deprecated. Please use `get_conn_value`.

@dstandish dstandish (Contributor, Author)

It seems you may be using an external secrets backend? I cannot repro. We still need to update the other secrets backends, but this fixes the case where `get_conn_value` is implemented and the requested connection simply isn't there.

@eladkal eladkal (Contributor) commented Mar 17, 2022

> It seems you may be using an external secrets backend? I cannot repro. We still need to update the other secrets backends, but this fixes the case where `get_conn_value` is implemented and the requested connection simply isn't there.

Using breeze: `./breeze start-airflow --backend postgres --db-reset`

@dstandish dstandish (Contributor, Author)

OK, will take a look.

@dstandish dstandish (Contributor, Author)

I ran in breeze... cannot repro. Maybe you are on the wrong branch or something?

root@ac116578f51e:/opt/airflow# python
Python 3.7.12 (default, Mar  2 2022, 05:37:59)
[GCC 10.2.1 20210110] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from airflow.providers.amazon.aws.hooks.s3 import S3Hook
/opt/airflow/airflow/configuration.py:377: FutureWarning: The 'dag_default_view' setting in [webserver] has the old default value of 'tree'. This value has been changed to 'grid' in the running config, but please update your config before Apache Airflow 3.0.
  FutureWarning,
/opt/airflow/airflow/configuration.py:377: FutureWarning: The 'log_filename_template' setting in [logging] has the old default value of '{{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log'. This value has been changed to 'dag_id={{ ti.dag_id }}/run_id={{ ti.run_id }}/task_id={{ ti.task_id }}/{%% if ti.map_index >= 0 %%}map_index={{ ti.map_index }}/{%% endif %%}attempt={{ try_number }}.log' in the running config, but please update your config before Apache Airflow 3.0.
  FutureWarning,
/opt/airflow/airflow/configuration.py:435: DeprecationWarning: The auth_backend option in [api] has been renamed to auth_backends - the old setting has been used, but please update your config.
  option = self._get_option_from_config_file(deprecated_key, deprecated_section, key, kwargs, section)
/opt/airflow/airflow/configuration.py:377: FutureWarning: The 'auth_backends' setting in [api] has the old default value of 'airflow.api.auth.backend.deny_all'. This value has been changed to 'airflow.api.auth.backend.session' in the running config, but please update your config before Apache Airflow 3.0.
  FutureWarning,
>>> s3_hook = S3Hook('hello')
>>> s3_hook.load_file(filename='bla', key='blaa', bucket_name='fff')
/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/relationships.py:3463 SAWarning: relationship 'RenderedTaskInstanceFields.dag_run' will copy column rendered_task_instance_fields.dag_id to column dag_run.dag_id, which conflicts with relationship(s): 'BaseXCom.dag_run' (copies xcom.dag_id to dag_run.dag_id). If this is not the intention, consider if these relationships should be linked with back_populates, or if viewonly=True should be applied to one or more if they are read-only. For the less common case that foreign key constraints are partially overlapping, the orm.foreign() annotation can be used to isolate the columns that should be written towards.   The 'overlaps' parameter may be used to remove this warning. (Background on this error at: http://sqlalche.me/e/14/qzyx)
/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/relationships.py:3463 SAWarning: relationship 'RenderedTaskInstanceFields.dag_run' will copy column rendered_task_instance_fields.run_id to column dag_run.run_id, which conflicts with relationship(s): 'BaseXCom.dag_run' (copies xcom.run_id to dag_run.run_id). If this is not the intention, consider if these relationships should be linked with back_populates, or if viewonly=True should be applied to one or more if they are read-only. For the less common case that foreign key constraints are partially overlapping, the orm.foreign() annotation can be used to isolate the columns that should be written towards.   The 'overlaps' parameter may be used to remove this warning. (Background on this error at: http://sqlalche.me/e/14/qzyx)
[2022-03-17 17:07:57,825] {base_aws.py:447} WARNING - Unable to use Airflow Connection for credentials.
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/airflow/airflow/providers/amazon/aws/hooks/s3.py", line 63, in wrapper
    return func(*bound_args.args, **bound_args.kwargs)
  File "/opt/airflow/airflow/providers/amazon/aws/hooks/s3.py", line 91, in wrapper
    return func(*bound_args.args, **bound_args.kwargs)
  File "/opt/airflow/airflow/providers/amazon/aws/hooks/s3.py", line 525, in load_file
    if not replace and self.check_for_key(key, bucket_name):
  File "/opt/airflow/airflow/providers/amazon/aws/hooks/s3.py", line 63, in wrapper
    return func(*bound_args.args, **bound_args.kwargs)
  File "/opt/airflow/airflow/providers/amazon/aws/hooks/s3.py", line 91, in wrapper
    return func(*bound_args.args, **bound_args.kwargs)
  File "/opt/airflow/airflow/providers/amazon/aws/hooks/s3.py", line 363, in check_for_key
    self.get_conn().head_object(Bucket=bucket_name, Key=key)
  File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 395, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 712, in _make_api_call
    operation_model, request_dict, request_context)
  File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 731, in _make_request
    return self._endpoint.make_request(operation_model, request_dict)
  File "/usr/local/lib/python3.7/site-packages/botocore/endpoint.py", line 107, in make_request
    return self._send_request(request_dict, operation_model)
  File "/usr/local/lib/python3.7/site-packages/botocore/endpoint.py", line 180, in _send_request
    request = self.create_request(request_dict, operation_model)
  File "/usr/local/lib/python3.7/site-packages/botocore/endpoint.py", line 121, in create_request
    operation_name=operation_model.name)
  File "/usr/local/lib/python3.7/site-packages/botocore/hooks.py", line 357, in emit
    return self._emitter.emit(aliased_event_name, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/hooks.py", line 228, in emit
    return self._emit(event_name, kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/hooks.py", line 211, in _emit
    response = handler(**kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/signers.py", line 93, in handler
    return self.sign(operation_name, request)
  File "/usr/local/lib/python3.7/site-packages/botocore/signers.py", line 165, in sign
    auth.add_auth(request)
  File "/usr/local/lib/python3.7/site-packages/botocore/auth.py", line 388, in add_auth
    raise NoCredentialsError()
botocore.exceptions.NoCredentialsError: Unable to locate credentials
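
For reference, a quick way to check the fixed behaviour without touching S3 at all is to look for the warning directly. This is a sketch, not part of this PR; it assumes a working local Airflow environment (e.g. inside breeze) and that `get_connection` is available on the `EnvironmentVariablesBackend` named in the warning above.

```python
# Sketch: confirm no PendingDeprecationWarning is emitted when the backend
# implements get_conn_value but the requested connection simply does not exist.
import warnings

from airflow.secrets.environment_variables import EnvironmentVariablesBackend

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    # This conn id is not defined anywhere, so get_conn_value returns None.
    EnvironmentVariablesBackend().get_connection("definitely_missing_conn")

# With the fix, no deprecation warning should have been recorded.
assert not any(issubclass(w.category, PendingDeprecationWarning) for w in caught)
```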

@eladkal eladkal (Contributor) left a comment

This must have been something in my env then. It works when I try from the Python shell.

Logic looks good to me.

@github-actions github-actions bot added the `full tests needed` label on Mar 17, 2022
@github-actions

The PR most likely needs to run the full matrix of tests because it modifies parts of the core of Airflow. However, committers might decide to merge it quickly and take the risk. If they don't merge it quickly, please rebase it to the latest `main` at your convenience, or amend the last commit of the PR, and push it with `--force-with-lease`.

@dstandish dstandish merged commit 4f1dcdd into apache:main Mar 17, 2022
@dstandish dstandish deleted the json-conn-overeager-dep-warning branch March 17, 2022 19:20
@ephraimbuddy ephraimbuddy added the `type:bug-fix` label on Apr 8, 2022
Labels
area:secrets · full tests needed (We need to run full set of tests for this PR to merge) · type:bug-fix (Changelog: Bug Fixes)