
Support for Python 3.11 #59

Closed
potiuk opened this issue Oct 26, 2022 · 7 comments · Fixed by #60

Comments


potiuk commented Oct 26, 2022

I know it is early (Python 3.11 was released just yesterday), but in Apache Airflow we are hoping for a much faster cycle of adding support for new Python releases - especially since Python 3.11 introduces huge performance improvements (25% is the average number claimed) thanks to a very focused effort to increase single-threaded Python performance (the specializing adaptive interpreter being the core of it, but also many other improvements) without actually changing any of the Python code.

databricks-sql-python is one of the dependencies of Airflow that need to support the newer Python version, so this issue is here to make you aware that we are eager (and ready to help if needed) to make this happen faster, possibly by also talking to your own dependencies (which are often direct dependencies of Airflow as well) and helping them too. I perfectly understand that support needs to bubble up, i.e. your dependencies have to support 3.11 first.

A nice summary of Py3.11 support is here: https://pyreadiness.org/3.11/ - it's not very green yet, obviously, but I hope it gets greener soon.

I'd appreciate it if someone on the Databricks team attempted the migration and kept a PR running (possibly failing until all prerequisites are met), so that we can track the progress and help each other solve any problems. I know PyArrow is blocking for now, but this is likely going to be solved soon with apache/arrow#14499, and just opening a failing PR with it might be a good start.

I just opened such a PR in Apache Airflow yesterday and plan to keep it open until it gets green :)

apache/airflow#27264

I think it would be fantastic if we, as the open-source community, could migrate to the new Python much faster.

Looking forward to cooperation on that one :)

potiuk added a commit to apache/airflow that referenced this issue Oct 26, 2022
Python 3.11 was released as scheduled on October 25, 2022, and
this is the first attempt to see how far Airflow (mostly its
dependencies) is from being ready to officially support 3.11.

So far we had to exclude the following dependencies:

- [ ] Pyarrow dependency: apache/arrow#14499
- [ ] Google Provider: #27292
  and googleapis/python-bigquery#1386
- [ ] Databricks Provider:
  databricks/databricks-sql-python#59
- [ ] Papermill Provider: nteract/papermill#700
- [ ] Azure Provider: Azure/azure-uamqp-python#334
  and Azure/azure-sdk-for-python#27066
- [ ] Apache Beam Provider: apache/beam#23848
- [ ] Snowflake Provider:
  snowflakedb/snowflake-connector-python#1294
- [ ] JDBC Provider: jpype-project/jpype#1087
- [ ] Hive Provider: cloudera/python-sasl#30

We might eventually decide to release Airflow for 3.11 with those
providers disabled if they keep lagging behind, but for the moment
we want to work with all the projects in concert to be able to
release all providers (the Google Provider requires quite a lot of
work, and likely the Google team stepping up and the community
helping with the migration to the latest Google Cloud libraries).

susodapop commented Oct 26, 2022

Thanks @potiuk! I opened a pull request that will run our unit tests against Python 3.11. Assuming everything passes, we can cut a new release that incorporates this ASAP.
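
For reference, adding a new interpreter to a GitHub Actions test run usually amounts to extending the version matrix. A minimal sketch of what that might look like - the workflow name, job name, the existing set of Python versions, and the Poetry-based steps are assumptions, not necessarily this repo's actual configuration:

```yaml
# Hypothetical workflow sketch - names, versions, and tooling are illustrative.
name: Unit tests

on: [push, pull_request]

jobs:
  unit-tests:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # "3.11" added alongside the versions presumably already tested
        python-version: ["3.7", "3.8", "3.9", "3.10", "3.11"]
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
      # Assumes a Poetry-managed project; adjust to the repo's real tooling.
      - run: pip install poetry
      - run: poetry install
      - run: poetry run pytest tests/
```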


potiuk commented Oct 26, 2022

Cool!

@susodapop

@potiuk my PR is blocked until pyarrow has wheels for 3.11. I see you're working with them on making this work. I'll be keeping an eye out. Once pyarrow has wheels for 3.11, I believe we'll be unblocked and ready to support Py3.11.

Thanks again for bringing this to our attention and please feel free to ping me if need be.

potiuk added a commit to apache/airflow that referenced this issue Oct 27, 2022

potiuk added a commit to apache/airflow that referenced this issue Oct 27, 2022

potiuk added a commit to apache/airflow that referenced this issue Oct 31, 2022
@susodapop

Hey @potiuk, I attempted the Python 3.11 build again, but it still fails because the GH Actions runner can't build pyarrow wheels for Python 3.11 and a pre-made wheel isn't available for download. Do you think a 3.11 wheel is forthcoming for pyarrow? Or should we try to make the GH runner capable of building pyarrow from scratch?
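
One way to check whether the pyarrow wheel situation has changed, without waiting for a full CI run, is to ask pip for a pre-built wheel only and let it fail rather than fall back to a source build. A hypothetical diagnostic step (the step name and destination directory are illustrative):

```yaml
# Hypothetical diagnostic step: fails if PyPI does not serve a pre-built
# pyarrow wheel for CPython 3.11 (no source build is attempted).
- name: Check for a pyarrow 3.11 wheel
  run: |
    pip download pyarrow \
      --only-binary=:all: \
      --python-version 3.11 \
      --dest /tmp/pyarrow-wheels
```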

@susodapop

Actually, I can answer my own question: I think we can wait for the wheels, since the process for producing the pre-made ones is detailed in this thread: apache/arrow#14572


potiuk commented Nov 3, 2022

Yep. I am subscribed to the thread - they are working on it and making progress. I think half of the data science world holds its breath now :)


potiuk commented Nov 23, 2022

Pyarrow just released version 10.0.1 with pre-compiled binaries for 3.11. I think you should be good to go, @susodapop.
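
Once a 3.11-compatible pyarrow is on PyPI, a quick way to confirm the blocker is gone is to install it on a 3.11 runner while still refusing source builds. A hypothetical verification step (the >=10.0.1 floor comes from the comment above; everything else is illustrative):

```yaml
# Hypothetical verification step on a Python 3.11 runner: install pyarrow
# from a pre-built wheel only, then import it to confirm the binary loads.
- name: Verify pyarrow on Python 3.11
  run: |
    python -m pip install "pyarrow>=10.0.1" --only-binary=:all:
    python -c "import pyarrow; print(pyarrow.__version__)"
```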

potiuk added a commit to apache/airflow that referenced this issue Nov 24, 2022
potiuk added a commit to potiuk/airflow that referenced this issue Jan 19, 2023