Test Comment bot #16
Conversation
@ursabot --help
@ursabot crossbow --help
@ursabot crossbow submit -g wheel
@ursabot crossbow submit wheel-manylinux1-cp37m
Revision: deb857f Submitted crossbow builds: ursa-labs/crossbow @ build-744
@github-actions
@github-actions crossbow submit -g conda
Revision: a82b879 Submitted crossbow builds: ursa-labs/crossbow @ build-745
@github-actions crossbow submit wheel-manylinux1-cp37m
Revision: 33f65d4 Submitted crossbow builds: ursa-labs/crossbow @ build-746
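The `crossbow submit` comments above follow a simple command syntax: a group of tasks via `-g`, or individual task names as positional arguments. As a rough illustration only (not the actual archery/crossbow code), a click-based subcommand accepting both forms might look like this:

```python
# Hypothetical sketch of a "crossbow submit" style CLI, in the spirit of
# "crossbow submit -g wheel" / "crossbow submit wheel-manylinux1-cp37m".
# Names and options are illustrative, not the real archery implementation.
import click


@click.group()
def crossbow():
    """Trigger packaging builds on the crossbow queue."""


@crossbow.command()
@click.option("--group", "-g", "groups", multiple=True,
              help="Submit every task belonging to this task group.")
@click.argument("tasks", nargs=-1)
def submit(groups, tasks):
    """Submit the selected task groups and/or individual tasks."""
    selected = list(tasks) + [f"group:{g}" for g in groups]
    click.echo(f"Would submit: {', '.join(selected) or 'nothing'}")


if __name__ == "__main__":
    crossbow()
```

Invoked as `crossbow submit -g wheel` or `crossbow submit wheel-manylinux1-cp37m`, this mirrors the comment commands shown above.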
See it in action: kszucs#16 (comment)

The main drawback is that it is much slower than ursabot, but we can optimize it by:
- porting crossbow to depend only on pygithub instead of libgit2 (this will consume the rate limit, but should fit in)
- using caching or docker

Theoretically CROSSBOW_GITHUB_TOKEN is set as a GitHub Actions secret, see https://issues.apache.org/jira/browse/INFRA-19954. We can trigger a build once this is merged into master.

Closes #6571 from kszucs/master and squashes the following commits:

7a604a8 <Krisztián Szűcs> note that the license is BSD2
8586eb7 <Krisztián Szűcs> add license reference
def8724 <Krisztián Szűcs> RAT
a96e7e2 <Krisztián Szűcs> flake8
6f5da63 <Krisztián Szűcs> add requirements to docker whitelist
6678c2e <Krisztián Szűcs> update archery dependencies
33f65d4 <Krisztián Szűcs> revert removing the rest of the workflows
a82b879 <Krisztián Szűcs> test dep
06a7716 <Krisztián Szűcs> responses test dep
ba25229 <Krisztián Szűcs> fix archery workflow syntax
9352ee0 <Krisztián Szűcs> run archery unittests
deb857f <Krisztián Szűcs> checkout@v2 and fetch tags
215495a <Krisztián Szűcs> fix result path
748832f <Krisztián Szűcs> message formatter
ea1b7c8 <Krisztián Szűcs> no dry run
6c83b0c <Krisztián Szűcs> dry run
4789ac5 <Krisztián Szűcs> response ormatter
1b0b15d <Krisztián Szűcs> cleanup
2270a35 <Krisztián Szűcs> validate
035024f <Krisztián Szűcs> validate callback
e791c62 <Krisztián Szűcs> diag
641227f <Krisztián Szűcs> diab
b22b204 <Krisztián Szűcs> token
d95e86b <Krisztián Szűcs> path to event payload
3e9a279 <Krisztián Szűcs> pygithub
ca1592d <Krisztián Szűcs> typo
3c1358e <Krisztián Szűcs> triger event handler
55e65fa <Krisztián Szűcs> crossbow command
92568eb <Krisztián Szűcs> first draft of bot
99ea0c2 <Krisztián Szűcs> cat
3c0f16d <Krisztián Szűcs> remove all other workflows
1f8f21d <Krisztián Szűcs> diag event handling
2f613dd <Krisztián Szűcs> Check event handling (#15)

Authored-by: Krisztián Szűcs <szucs.krisztian@gmail.com>
Signed-off-by: Krisztián Szűcs <szucs.krisztian@gmail.com>
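Since the workflow relies on pygithub rather than libgit2, the comment-handling side of the bot can stay quite small. The following is a minimal sketch, not the actual bot: it assumes a token in the `CROSSBOW_GITHUB_TOKEN` environment variable and the `issue_comment` event payload that GitHub Actions exposes via `GITHUB_EVENT_PATH`; the function name and the reply text are illustrative.

```python
# Minimal sketch of a comment-triggered responder using PyGithub.
# GITHUB_EVENT_PATH points at the issue_comment payload inside a
# GitHub Actions job; everything else here is illustrative.
import json
import os

from github import Github  # pip install PyGithub


def handle_comment_event():
    with open(os.environ["GITHUB_EVENT_PATH"]) as f:
        event = json.load(f)

    body = event["comment"]["body"].strip()
    if not body.startswith("@github-actions crossbow"):
        return  # not a command addressed to the bot

    gh = Github(os.environ["CROSSBOW_GITHUB_TOKEN"])
    repo = gh.get_repo(event["repository"]["full_name"])
    issue = repo.get_issue(event["issue"]["number"])

    # A real bot would parse the command and submit crossbow tasks here;
    # this sketch only acknowledges the command on the pull request.
    issue.create_comment(f"Received command: `{body}`")


if __name__ == "__main__":
    handle_comment_event()
```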
Compare: 31aaa5b to 25e8c2b
This PR enables tests for `ARROW_COMPUTE`, `ARROW_DATASET`, `ARROW_FILESYSTEM`, `ARROW_HDFS`, `ARROW_ORC`, and `ARROW_IPC` (default on). apache#7131 enabled a minimal set of tests as a starting point. I confirmed that these tests pass locally with the current master. In the current TravisCI environment, we cannot see this result due to a lot of error messages in `arrow-utility-test`.

```
$ git log | head -1
commit ed5f534
% ctest
...
        Start  1: arrow-array-test
 1/51 Test  #1: arrow-array-test .....................   Passed    4.62 sec
        Start  2: arrow-buffer-test
 2/51 Test  #2: arrow-buffer-test ....................   Passed    0.14 sec
        Start  3: arrow-extension-type-test
 3/51 Test  #3: arrow-extension-type-test ............   Passed    0.12 sec
        Start  4: arrow-misc-test
 4/51 Test  #4: arrow-misc-test ......................   Passed    0.14 sec
        Start  5: arrow-public-api-test
 5/51 Test  #5: arrow-public-api-test ................   Passed    0.12 sec
        Start  6: arrow-scalar-test
 6/51 Test  #6: arrow-scalar-test ....................   Passed    0.13 sec
        Start  7: arrow-type-test
 7/51 Test  #7: arrow-type-test ......................   Passed    0.14 sec
        Start  8: arrow-table-test
 8/51 Test  #8: arrow-table-test .....................   Passed    0.13 sec
        Start  9: arrow-tensor-test
 9/51 Test  #9: arrow-tensor-test ....................   Passed    0.13 sec
        Start 10: arrow-sparse-tensor-test
10/51 Test #10: arrow-sparse-tensor-test .............   Passed    0.16 sec
        Start 11: arrow-stl-test
11/51 Test #11: arrow-stl-test .......................   Passed    0.12 sec
        Start 12: arrow-concatenate-test
12/51 Test #12: arrow-concatenate-test ...............   Passed    0.53 sec
        Start 13: arrow-diff-test
13/51 Test #13: arrow-diff-test ......................   Passed    1.45 sec
        Start 14: arrow-c-bridge-test
14/51 Test #14: arrow-c-bridge-test ..................   Passed    0.18 sec
        Start 15: arrow-io-buffered-test
15/51 Test #15: arrow-io-buffered-test ...............   Passed    0.20 sec
        Start 16: arrow-io-compressed-test
16/51 Test #16: arrow-io-compressed-test .............   Passed    3.48 sec
        Start 17: arrow-io-file-test
17/51 Test #17: arrow-io-file-test ...................   Passed    0.74 sec
        Start 18: arrow-io-hdfs-test
18/51 Test #18: arrow-io-hdfs-test ...................   Passed    0.12 sec
        Start 19: arrow-io-memory-test
19/51 Test #19: arrow-io-memory-test .................   Passed    2.77 sec
        Start 20: arrow-utility-test
20/51 Test #20: arrow-utility-test ...................***Failed    5.65 sec
        Start 21: arrow-threading-utility-test
21/51 Test #21: arrow-threading-utility-test .........   Passed    1.34 sec
        Start 22: arrow-compute-compute-test
22/51 Test #22: arrow-compute-compute-test ...........   Passed    0.13 sec
        Start 23: arrow-compute-boolean-test
23/51 Test #23: arrow-compute-boolean-test ...........   Passed    0.15 sec
        Start 24: arrow-compute-cast-test
24/51 Test #24: arrow-compute-cast-test ..............   Passed    0.22 sec
        Start 25: arrow-compute-hash-test
25/51 Test #25: arrow-compute-hash-test ..............   Passed    2.61 sec
        Start 26: arrow-compute-isin-test
26/51 Test #26: arrow-compute-isin-test ..............   Passed    0.81 sec
        Start 27: arrow-compute-match-test
27/51 Test #27: arrow-compute-match-test .............   Passed    0.40 sec
        Start 28: arrow-compute-sort-to-indices-test
28/51 Test #28: arrow-compute-sort-to-indices-test ...   Passed    3.33 sec
        Start 29: arrow-compute-nth-to-indices-test
29/51 Test #29: arrow-compute-nth-to-indices-test ....   Passed    1.51 sec
        Start 30: arrow-compute-util-internal-test
30/51 Test #30: arrow-compute-util-internal-test .....   Passed    0.13 sec
        Start 31: arrow-compute-add-test
31/51 Test #31: arrow-compute-add-test ...............   Passed    0.12 sec
        Start 32: arrow-compute-aggregate-test
32/51 Test #32: arrow-compute-aggregate-test .........   Passed   14.70 sec
        Start 33: arrow-compute-compare-test
33/51 Test #33: arrow-compute-compare-test ...........   Passed    7.96 sec
        Start 34: arrow-compute-take-test
34/51 Test #34: arrow-compute-take-test ..............   Passed    4.80 sec
        Start 35: arrow-compute-filter-test
35/51 Test #35: arrow-compute-filter-test ............   Passed    8.23 sec
        Start 36: arrow-dataset-dataset-test
36/51 Test #36: arrow-dataset-dataset-test ...........   Passed    0.25 sec
        Start 37: arrow-dataset-discovery-test
37/51 Test #37: arrow-dataset-discovery-test .........   Passed    0.13 sec
        Start 38: arrow-dataset-file-ipc-test
38/51 Test #38: arrow-dataset-file-ipc-test ..........   Passed    0.21 sec
        Start 39: arrow-dataset-file-test
39/51 Test #39: arrow-dataset-file-test ..............   Passed    0.12 sec
        Start 40: arrow-dataset-filter-test
40/51 Test #40: arrow-dataset-filter-test ............   Passed    0.16 sec
        Start 41: arrow-dataset-partition-test
41/51 Test #41: arrow-dataset-partition-test .........   Passed    0.13 sec
        Start 42: arrow-dataset-scanner-test
42/51 Test #42: arrow-dataset-scanner-test ...........   Passed    0.20 sec
        Start 43: arrow-filesystem-test
43/51 Test #43: arrow-filesystem-test ................   Passed    1.62 sec
        Start 44: arrow-hdfs-test
44/51 Test #44: arrow-hdfs-test ......................   Passed    0.13 sec
        Start 45: arrow-feather-test
45/51 Test #45: arrow-feather-test ...................   Passed    0.91 sec
        Start 46: arrow-ipc-read-write-test
46/51 Test #46: arrow-ipc-read-write-test ............   Passed    5.77 sec
        Start 47: arrow-ipc-json-simple-test
47/51 Test #47: arrow-ipc-json-simple-test ...........   Passed    0.16 sec
        Start 48: arrow-ipc-json-test
48/51 Test #48: arrow-ipc-json-test ..................   Passed    0.27 sec
        Start 49: arrow-json-integration-test
49/51 Test #49: arrow-json-integration-test ..........   Passed    0.13 sec
        Start 50: arrow-json-test
50/51 Test #50: arrow-json-test ......................   Passed    0.26 sec
        Start 51: arrow-orc-adapter-test
51/51 Test #51: arrow-orc-adapter-test ...............   Passed    1.92 sec

98% tests passed, 1 tests failed out of 51

Label Time Summary:
arrow-tests      =  27.38 sec (27 tests)
arrow_compute    =  45.11 sec (14 tests)
arrow_dataset    =   1.21 sec (7 tests)
arrow_ipc        =   6.20 sec (3 tests)
unittest         =  79.91 sec (51 tests)

Total Test time (real) =  79.99 sec

The following tests FAILED:
     20 - arrow-utility-test (Failed)
Errors while running CTest
```

Closes apache#7142 from kiszk/ARROW-8754

Authored-by: Kazuaki Ishizaki <ishizaki@jp.ibm.com>
Signed-off-by: Sutou Kouhei <kou@clear-code.com>
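To inspect only the failing suite locally, one option is to filter ctest by test name. This is a sketch under assumptions, not the CI invocation; `cpp/build` is an assumed local build directory.

```python
# Hedged sketch: re-run only arrow-utility-test with full output, using
# ctest's name filter (-R). "cpp/build" is an assumed build directory.
import subprocess

subprocess.run(
    ["ctest", "--output-on-failure", "-R", "arrow-utility-test"],
    cwd="cpp/build",
    check=False,  # we want to inspect the failure, not raise on it
)
```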
Compare: 2033b64 to 6103fe8
…n timezone (apache#45051)

### Rationale for this change

If the timezone database is present on the system but does not contain a timezone referenced in an ORC file, the ORC reader will crash with an uncaught C++ exception. This can happen, for example, on Ubuntu 24.04, where some timezone aliases have been moved from the main `tzdata` package to a `tzdata-legacy` package. If `tzdata-legacy` is not installed, trying to read an ORC file that references e.g. the "US/Pacific" timezone would crash. Here is a backtrace excerpt:

```
#12 0x00007f1a3ce23a55 in std::terminate() () from /lib/x86_64-linux-gnu/libstdc++.so.6
#13 0x00007f1a3ce39391 in __cxa_throw () from /lib/x86_64-linux-gnu/libstdc++.so.6
#14 0x00007f1a3f4accc4 in orc::loadTZDB(std::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) () from /tmp/arrow-HEAD.ArqTs/venv-wheel-3.12-manylinux_2_17_x86_64.manylinux2014_x86_64/lib/python3.12/site-packages/pyarrow/libarrow.so.1900
#15 0x00007f1a3f4ad392 in std::call_once<orc::LazyTimezone::getImpl() const::{lambda()#1}>(std::once_flag&, orc::LazyTimezone::getImpl() const::{lambda()#1}&&)::{lambda()#2}::_FUN() () from /tmp/arrow-HEAD.ArqTs/venv-wheel-3.12-manylinux_2_17_x86_64.manylinux2014_x86_64/lib/python3.12/site-packages/pyarrow/libarrow.so.1900
#16 0x00007f1a4298bec3 in __pthread_once_slow (once_control=0xa5ca7c8, init_routine=0x7f1a3ce69420 <__once_proxy>) at ./nptl/pthread_once.c:116
#17 0x00007f1a3f4a9ad0 in orc::LazyTimezone::getEpoch() const () from /tmp/arrow-HEAD.ArqTs/venv-wheel-3.12-manylinux_2_17_x86_64.manylinux2014_x86_64/lib/python3.12/site-packages/pyarrow/libarrow.so.1900
#18 0x00007f1a3f4e76b1 in orc::TimestampColumnReader::TimestampColumnReader(orc::Type const&, orc::StripeStreams&, bool) () from /tmp/arrow-HEAD.ArqTs/venv-wheel-3.12-manylinux_2_17_x86_64.manylinux2014_x86_64/lib/python3.12/site-packages/pyarrow/libarrow.so.1900
#19 0x00007f1a3f4e84ad in orc::buildReader(orc::Type const&, orc::StripeStreams&, bool, bool, bool) () from /tmp/arrow-HEAD.ArqTs/venv-wheel-3.12-manylinux_2_17_x86_64.manylinux2014_x86_64/lib/python3.12/site-packages/pyarrow/libarrow.so.1900
#20 0x00007f1a3f4e8dd7 in orc::StructColumnReader::StructColumnReader(orc::Type const&, orc::StripeStreams&, bool, bool) () from /tmp/arrow-HEAD.ArqTs/venv-wheel-3.12-manylinux_2_17_x86_64.manylinux2014_x86_64/lib/python3.12/site-packages/pyarrow/libarrow.so.1900
#21 0x00007f1a3f4e8532 in orc::buildReader(orc::Type const&, orc::StripeStreams&, bool, bool, bool) () from /tmp/arrow-HEAD.ArqTs/venv-wheel-3.12-manylinux_2_17_x86_64.manylinux2014_x86_64/lib/python3.12/site-packages/pyarrow/libarrow.so.1900
#22 0x00007f1a3f4925e9 in orc::RowReaderImpl::startNextStripe() () from /tmp/arrow-HEAD.ArqTs/venv-wheel-3.12-manylinux_2_17_x86_64.manylinux2014_x86_64/lib/python3.12/site-packages/pyarrow/libarrow.so.1900
#23 0x00007f1a3f492c9d in orc::RowReaderImpl::next(orc::ColumnVectorBatch&) () from /tmp/arrow-HEAD.ArqTs/venv-wheel-3.12-manylinux_2_17_x86_64.manylinux2014_x86_64/lib/python3.12/site-packages/pyarrow/libarrow.so.1900
#24 0x00007f1a3e6b251f in arrow::adapters::orc::ORCFileReader::Impl::ReadBatch(orc::RowReaderOptions const&, std::shared_ptr<arrow::Schema> const&, long) () from /tmp/arrow-HEAD.ArqTs/venv-wheel-3.12-manylinux_2_17_x86_64.manylinux2014_x86_64/lib/python3.12/site-packages/pyarrow/libarrow.so.1900
```

### What changes are included in this PR?

Catch C++ exceptions when iterating ORC batches instead of letting them slip through.

### Are these changes tested?

Yes.

### Are there any user-facing changes?

No.

* GitHub Issue: apache#40633

Authored-by: Antoine Pitrou <antoine@python.org>
Signed-off-by: Sutou Kouhei <kou@clear-code.com>
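From Python, the practical effect is that a missing timezone surfaces as a catchable error rather than terminating the process. The sketch below is illustrative only: the file name is hypothetical and stands for an ORC file whose stripes reference a timezone, such as "US/Pacific", that is absent from the system tzdata.

```python
# Hedged illustration: reading an ORC file that references a timezone
# missing from the system tzdata (e.g. "US/Pacific" on Ubuntu 24.04
# without tzdata-legacy installed). The file name is hypothetical.
import pyarrow as pa
import pyarrow.orc as orc

try:
    table = orc.ORCFile("us_pacific_timestamps.orc").read()
except pa.ArrowException as exc:
    # With this fix the ORC error propagates as an Arrow exception
    # instead of escaping as an uncaught C++ exception.
    print(f"Failed to read ORC file: {exc}")
```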