Support for Python 3.11 #700
Comments
Python 3.11 was released as scheduled on October 25, 2022, and this is the first attempt to see how far Airflow (mostly its dependencies) is from being ready to officially support 3.11. So far we had to exclude the following dependencies:

- [ ] Pyarrow dependency: apache/arrow#14499
- [ ] Google Provider: #27292 and googleapis/python-bigquery#1386
- [ ] Databricks Provider: databricks/databricks-sql-python#59
- [ ] Papermill Provider: nteract/papermill#700
- [ ] Azure Provider: Azure/azure-uamqp-python#334 and Azure/azure-sdk-for-python#27066
- [ ] Apache Beam Provider: apache/beam#23848
- [ ] Snowflake Provider: snowflakedb/snowflake-connector-python#1294
- [ ] JDBC Provider: jpype-project/jpype#1087
- [ ] Hive Provider: cloudera/python-sasl#30

We might eventually decide to release Airflow with 3.11 support with those providers disabled if they keep lagging behind, but for the moment we want to work with all the projects in concert so that we can release all providers (the Google Provider requires quite a lot of work, and likely the Google team stepping up and the community helping with migration to the latest Google Cloud libraries).
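For context (not part of the original issue text): one common way such an exclusion is expressed is with a PEP 508 environment marker on the requirement, so the affected provider simply isn't installed on 3.11 until its upstream library catches up. A minimal, illustrative sketch follows — the distribution name and extras layout here are hypothetical, not Airflow's actual setup.py:

```python
# Illustrative sketch only: excluding a provider on Python 3.11 with a
# PEP 508 environment marker until its upstream dependency supports 3.11.
# The distribution name and extras layout are hypothetical.
from setuptools import setup

setup(
    name="example-distribution",
    version="0.1.0",
    extras_require={
        # Installed only on Python < 3.11; skipped on 3.11 until ready.
        "papermill": [
            'apache-airflow-providers-papermill; python_version < "3.11"',
        ],
    },
)
```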
Any news? Pyarrow just released a 3.11-compatible version yesterday, so I guess papermill will soon become one of the last remaining deps for Airflow. We could also release 3.11 support without papermill, but it would be great to have it.
Hello @potiuk! I don't see
It's not. It's just showing that others are catching up |
I tried running tests on Python 3.11 and don't see any failures. @potiuk Do you see any errors with respect to Airflow tests using papermill?
PR to run CI with Python 3.11: #708
Just a bit of a warning here for Papermill maintainers. We are very close to having Python 3.11 support in Apache Airflow - we are just about to merge the Google Provider change upgrading ~20 client libraries, which was the biggest blocker, and Apache Beam released version 2.47.0 with Python 3.11 support. Similarly to what already happened with the Yandex provider, we are going to suspend the Papermill provider from our releases if Papermill holds us back. The process is described here: https://github.com/apache/airflow/blob/main/PROVIDERS.rst#suspending-releases-for-providers - you can also learn there what the consequences of being suspended are (in short: no new releases of the provider until the problem is removed). The first step of the process is to let the maintainers of the library that holds us back know, which is happening via this comment. Apparently there is already a PR in your repo to add 3.11 support, and there is about a week before we attempt to merge the 3.11 change with Papermill suspended - so in case you would like to avoid the suspension, there is about a week to get a Papermill release that supports Python 3.11.
Also announced on the Airflow devlist: https://lists.apache.org/thread/0dcvjj0f6bnjg3mk4zn32stjbxtprb5j - so in case you have something to add or comment on, feel free.
ploomber-engine is a drop-in replacement for papermill and it works on Python 3.11: `ploomber-engine input.ipynb output.ipynb`. @potiuk: I'm the author, so I'm happy to help with any integrations.
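For readers unfamiliar with the API being replaced, here is a minimal sketch of programmatic notebook execution with papermill; ploomber-engine advertises an equivalent workflow, but check its documentation for the exact import path and signature before relying on it. The notebook names and parameters below are purely illustrative:

```python
# Minimal sketch: execute a notebook programmatically with papermill.
# ploomber-engine is advertised as a drop-in replacement for this workflow.
import papermill as pm

pm.execute_notebook(
    "input.ipynb",                # source notebook (illustrative name)
    "output.ipynb",               # executed copy, with cell outputs saved
    parameters={"alpha": 0.1},    # injected into the cell tagged "parameters"
)
```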
Ah. Nice! Just thinking that maybe we should make a copy of the existing papermill provider and do a "ploomber" one. I do see that papermill is pretty inactive, so this would be a good idea. Can you please start a discussion about this ("replace the papermill provider with ploomber-engine") on the Airflow devlist (see https://lists.apache.org/list.html?dev@airflow.apache.org); https://airflow.apache.org/community/ also has more information on joining the list. We now have a pretty formal way of accepting new providers - especially if they are connected to existing services. So we would need to understand what the relation between ploomber.io and ploomber-engine is, whether the engine can be used with or only without the service, and how "open" the open-source engine is (BSD licensed I see, which is cool). I think we need a bit more context (maybe links to some blogs and an explanation of why you decided to develop it) and I think we can take it from there.
Just merged 3.11. FYI - even if official 3.11 support is not merged in Papermill, at least the build and tests work for it, so we are not suspending it for now.
@willingc I see tests running with py3.11 and py3.12, so I guess this has been resolved, right?
I believe so. We have no exclusions and no reports from users (though that might also be because it is not used much - it's not the most popular provider of ours; per pypistats it has ~1.5% of Airflow downloads), but for all practical purposes this can be closed.
🚀 Feature
I know it is early (Python 3.11 was just released yesterday), but in Apache Airflow we are hoping for a much faster cycle of adding new Python releases - especially since Python 3.11 introduces huge performance improvements (25% is the average number claimed) thanks to a very focused effort to increase single-threaded Python performance (the specializing interpreter being the core of it, but also many other improvements) without actually changing any of the Python code.
Motivation
Papermill is one of the dependencies of Airflow that need to support the newer Python version, so this issue is here to make you aware that we are eager (and willing to help if needed) to make it happen faster, possibly by talking to papermill's dependencies as well (which are often also direct dependencies of Airflow) and helping them too. I perfectly understand we need to bubble up support so that your dependencies support it first.
A nice summary of Py3.11 support is here: https://pyreadiness.org/3.11/ - it's not very green yet, obviously, but I hope it gets greener soon.
I'd appreciate it if someone on the nteract team attempted to migrate and kept a PR running (and possibly failing until all prerequisites are met), so that we can also track the progress and help each other solve any problems. I know PyArrow is blocking for now, but this is likely going to be solved soon with apache/arrow#14499, and just opening a failing PR might be a good start.
I just opened such PR in Apache Airflow yesterday and plan to keep it open until it gets green :)
apache/airflow#27264
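As a rough illustration of what such a migration PR typically touches on the packaging side (a hypothetical sketch, not papermill's actual setup - the project may use setup.cfg or pyproject.toml instead):

```python
# Hypothetical packaging excerpt: declaring Python 3.11 support usually
# means widening python_requires (if it is capped) and adding the trove
# classifier, alongside making CI run the new interpreter.
from setuptools import setup

setup(
    name="example-package",
    version="0.1.0",
    python_requires=">=3.8",
    classifiers=[
        "Programming Language :: Python :: 3.9",
        "Programming Language :: Python :: 3.10",
        "Programming Language :: Python :: 3.11",
    ],
)
```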
I think it would be fantastic if we, as the open-source community, could migrate to the new Python much faster.
Looking forward to cooperation on that one :)