What happened?
Due to test configuration issues, these tests were not part of the tox unit test suite. Enabling them causes the following failures:
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_invalid_input_type - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_model_handler_large_model - AssertionError: True is not false
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_model_handler_sets_env_vars - AssertionError: True is not false
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_pipeline_gcs_model - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_pipeline_local_model_simple - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxTensorflowRunInferencePipelineTest::test_invalid_input_type - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxTensorflowRunInferencePipelineTest::test_pipeline_gcs_model - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxTensorflowRunInferencePipelineTest::test_pipeline_local_model_simple - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxSklearnRunInferencePipelineTest::test_invalid_input_type - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxSklearnRunInferencePipelineTest::test_pipeline_gcs_model - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxSklearnRunInferencePipelineTest::test_pipeline_local_model_simple - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
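Most of these failures point at a missing `_batching_kwargs` attribute on `TestOnnxModelHandler`. A common way this pattern arises is a test subclass that overrides `__init__` without delegating to the base handler's `__init__`, so attributes set there are never created. The sketch below is illustrative only; the class bodies are assumptions for demonstration, not Beam's actual source.

```python
# Illustrative sketch of the failure mode (not Beam's actual code).
class OnnxModelHandler:  # stand-in for the real ONNX model handler
  def __init__(self):
    # The real handler's __init__ is assumed to set this attribute.
    self._batching_kwargs = {}

  def batch_elements_kwargs(self):
    return self._batching_kwargs


class BrokenTestOnnxModelHandler(OnnxModelHandler):
  def __init__(self, model_uri):
    # Missing super().__init__(): _batching_kwargs is never set, so
    # batch_elements_kwargs() raises AttributeError at pipeline time.
    self._model_uri = model_uri


class FixedTestOnnxModelHandler(OnnxModelHandler):
  def __init__(self, model_uri):
    super().__init__()  # keeps the base handler's attributes intact
    self._model_uri = model_uri
```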
Issue Failure
Failure: Test is continually failing
Issue Priority
Priority: 1 (unhealthy code / failing or flaky postcommit so we cannot be sure the product is healthy)
Issue Components
Component: Python SDK
We should also enable onnx in the dependency compatibility test suite: #25796. Beam supports protobuf 3, so we should still be able to test onnx even if it doesn't support protobuf 4.
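For example, the pin could look roughly like the snippet below in a test-extras list; the package names and version bounds are assumptions for illustration, not Beam's actual dependency configuration.

```python
# Hypothetical test-extra pin keeping the onnx test dependencies
# resolvable on protobuf 3.x, which Beam supports.
onnx_test_requirements = [
    "onnx",
    "onnxruntime",
    "protobuf>=3.20.3,<4",  # assumed upper bound while onnx lacks protobuf 4 support
]
```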