fix: add some sentry emits to debug json error #901

Merged
merged 3 commits into main from joseph/json-decode-error on Nov 20, 2024

Conversation

joseph-sentry
Contributor

I want to find out if the reason the JSON decode errors are occurring is that we are re-processing uploads that have already been processed before. I also don't want to hard crash if we run into a JSON decode error.

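The shape of the change (it can also be seen in the quoted code in the test output below) is to wrap the json.loads call, attach the upload state and the first bytes of the payload to a Sentry scope, and emit a message instead of letting the task crash. A self-contained sketch, with an illustrative function name and parameters rather than the actual process_individual_arg signature:

    import json

    import sentry_sdk

    def load_payload(payload_bytes: bytes, upload_state: str) -> dict | None:
        # On invalid JSON, report to Sentry with enough context to test the
        # re-processing theory (the upload's state) instead of hard-crashing.
        try:
            return json.loads(payload_bytes)
        except json.JSONDecodeError:
            with sentry_sdk.new_scope() as scope:
                scope.set_extra("upload_state", upload_state)
                scope.set_extra("contents", payload_bytes[:10])
                sentry_sdk.capture_message("Upload payload is not valid JSON")
            return None

If the failing uploads were already processed, the captured upload_state extra should make that visible in Sentry.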

sentry-io bot commented Nov 18, 2024

🔍 Existing Issues For Review

Your pull request is modifying functions with the following pre-existing issues:

📄 File: tasks/test_results_processor.py

Function: process_individual_arg
Unhandled Issue: FileNotInStorageError: File test_results/v1/raw/2024-11-06/25D3451FB922A5B3C2F2E4A374E5B8F0/96a311895d56ba274a12a4d2cd4d... ...
Event Count: 1


@joseph-sentry requested a review from a team on November 18, 2024 at 15:41

codecov bot commented Nov 18, 2024

Codecov Report

Attention: Patch coverage is 57.89474% with 8 lines in your changes missing coverage. Please review.

Project coverage is 98.03%. Comparing base (03c73da) to head (38f2b3a).
Report is 5 commits behind head on main.

✅ All tests successful. No failed tests found.

Files with missing lines          Patch %   Lines
tasks/test_results_processor.py   57.89%    8 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #901      +/-   ##
==========================================
- Coverage   98.06%   98.03%   -0.03%     
==========================================
  Files         444      444              
  Lines       35474    35492      +18     
==========================================
+ Hits        34786    34796      +10     
- Misses        688      696       +8     
Flag          Coverage          Δ
integration   41.94% <57.89%>   (+<0.01%) ⬆️
unit          90.83% <5.26%>    (-0.05%) ⬇️

Flags with carried forward coverage won't be shown.


@codecov-notifications

codecov-notifications bot commented Nov 18, 2024

Codecov Report

Attention: Patch coverage is 57.89474% with 8 lines in your changes missing coverage. Please review.

✅ All tests successful. No failed tests found.

Files with missing lines          Patch %   Lines
tasks/test_results_processor.py   57.89%    8 Missing ⚠️


@codecov-qa

codecov-qa bot commented Nov 18, 2024

❌ 8 Tests Failed:

Tests completed   Failed   Passed   Skipped
1744              8        1736     1
View the top 3 failed tests by shortest run time
tasks.tests.unit.test_test_results_processor_task.TestUploadTestProcessorTask::test_upload_processor_task_call_daily_test_totals
Stack Traces | 0.074s run time
self = <worker.tasks.tests.unit.test_test_results_processor_task.TestUploadTestProcessorTask object at 0x7f62854f6150>
mocker = <pytest_mock.plugin.MockFixture object at 0x7f627ec69af0>
mock_configuration = <shared.config.ConfigHelper object at 0x7f6284800b30>
dbsession = <sqlalchemy.orm.session.Session object at 0x7f6284802120>
codecov_vcr = <vcr.cassette.Cassette object at 0x7f62841188c0>
mock_storage = <shared.storage.memory.MemoryStorageService object at 0x7f627ea732c0>
mock_redis = <MagicMock name='_get_redis_instance_from_url()' id='140061109374080'>
celery_app = <Celery celery.tests at 0x7f627d62dfd0>

    @pytest.mark.integration
    def test_upload_processor_task_call_daily_test_totals(
        self,
        mocker,
        mock_configuration,
        dbsession,
        codecov_vcr,
        mock_storage,
        mock_redis,
        celery_app,
    ):
        with travel("1970-1-1T00:00:00Z", tick=False):
            first_url = ".../C3C4715CA57C910D11D5EB899FC86A7E/4c4e4654ac25037ae869caeb3619d485970b6304/a84d445c-9c1e-434f-8275-f18f1f320f81.txt"
            with open(
                here.parent.parent / "samples" / "sample_multi_test_part_1.json"
            ) as f:
                content = f.read()
                mock_storage.write_file("archive", first_url, content)
    
            first_commit = CommitFactory.create(
                message="hello world",
                commitid="cd76b0821854a780b60012aed85af0a8263004ad",
                repository__owner__unencrypted_oauth_token="test7lk5ndmtqzxlx06rip65nac9c7epqopclnoy",
                repository__owner__username="joseph-sentry",
                repository__owner__service="github",
                repository__name="codecov-demo",
                branch="first_branch",
            )
            dbsession.add(first_commit)
            dbsession.flush()
    
            first_report_row = CommitReport(commit_id=first_commit.id_)
            dbsession.add(first_report_row)
            dbsession.flush()
    
            upload = UploadFactory.create(
                storage_path=first_url, report=first_report_row
            )
            dbsession.add(upload)
            dbsession.flush()
    
            repoid = upload.report.commit.repoid
            redis_queue = [{"url": first_url, "upload_id": upload.id_}]
            mocker.patch.object(TestResultsProcessorTask, "app", celery_app)
    
>           result = TestResultsProcessorTask().run_impl(
                dbsession,
                repoid=repoid,
                commitid=first_commit.commitid,
                commit_yaml={"codecov": {"max_report_age": False}},
                arguments_list=redis_queue,
            )

.../tests/unit/test_test_results_processor_task.py:522: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tasks/test_results_processor.py:138: in run_impl
    result = self.process_individual_upload(
tasks/test_results_processor.py:373: in process_individual_upload
    self.process_individual_arg(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <@task: app.tasks.test_results.TestResultsProcessor of tasks at 0x7f6291523b60>
upload = <database.models.reports.Upload object at 0x7f627d62cd40>
repository = Repo<124>

    def process_individual_arg(
        self, upload: Upload, repository
    ) -> TestResultsProcessingResult:
        archive_service = ArchiveService(repository)
    
        payload_bytes = archive_service.read_file(upload.storage_path)
        try:
            data = json.loads(payload_bytes)
        except json.JSONDecodeError as e:
            with sentry_sdk.new_scope() as scope:
                scope.set_extra("upload_state", upload.state)
                scope.set_extra("contents", payload_bytes[:10])
                sentry_sdk.capture_message("Upload payload is not valid JSON")
    
        parsing_results: list[ParsingInfo] = []
    
        network: list[str] | None = data.get("network_files")
    
        report_contents = []
    
        for file_dict in data["test_results_files"]:
            file = file_dict["data"]
            file_bytes = BytesIO(zlib.decompress(base64.b64decode(file)))
            report_contents.append(
                ReadableFile(path=file_dict["filename"], contents=file_bytes.getvalue())
            )
            try:
                parsing_results.append(self.parse_single_file(file_bytes))
            except ParserFailureError as exc:
                log.error(
                    exc.err_msg,
                    extra=dict(
                        repoid=upload.report.commit.repoid,
                        commitid=upload.report.commit_id,
                        uploadid=upload.id,
                        file_content=exc.file_content,
                        parser_err_msg=exc.parser_err_msg,
                    ),
                )
                with sentry_sdk.new_scope() as scope:
                    scope.set_extra("upload_state", upload.state)
                    scope.set_extra("parser_error", exc.parser_err_msg)
                    sentry_sdk.capture_message("Test results parser error")
                    upload.state = "has_failed"
    
        if upload.state != "has_failed":
            upload.state = "processed"
    
>       upload.save()
E       AttributeError: 'Upload' object has no attribute 'save'

tasks/test_results_processor.py:475: AttributeError
tasks.tests.unit.test_test_results_processor_task.TestUploadTestProcessorTask::test_upload_processor_task_call_existing_test
Stack Traces | 0.08s run time
self = <worker.tasks.tests.unit.test_test_results_processor_task.TestUploadTestProcessorTask object at 0x7f62854f5070>
mocker = <pytest_mock.plugin.MockFixture object at 0x7f62842c26c0>
mock_configuration = <shared.config.ConfigHelper object at 0x7f628423e240>
dbsession = <sqlalchemy.orm.session.Session object at 0x7f62848003e0>
codecov_vcr = <vcr.cassette.Cassette object at 0x7f627d7f7a70>
mock_storage = <shared.storage.memory.MemoryStorageService object at 0x7f6284a5deb0>
mock_redis = <MagicMock name='_get_redis_instance_from_url()' id='140061013030048'>
celery_app = <Celery celery.tests at 0x7f6284800230>

    @pytest.mark.integration
    def test_upload_processor_task_call_existing_test(
        self,
        mocker,
        mock_configuration,
        dbsession,
        codecov_vcr,
        mock_storage,
        mock_redis,
        celery_app,
    ):
        url = ".../C3C4715CA57C910D11D5EB899FC86A7E/4c4e4654ac25037ae869caeb3619d485970b6304/a84d445c-9c1e-434f-8275-f18f1f320f81.txt"
        with open(here.parent.parent / "samples" / "sample_test.json") as f:
            content = f.read()
            mock_storage.write_file("archive", url, content)
        upload = UploadFactory.create(
            storage_path=url,
        )
        dbsession.add(upload)
        dbsession.flush()
        repoid = upload.report.commit.repoid
        redis_queue = [{"url": url, "upload_id": upload.id_}]
        mocker.patch.object(TestResultsProcessorTask, "app", celery_app)
    
        commit = CommitFactory.create(
            message="hello world",
            commitid="cd76b0821854a780b60012aed85af0a8263004ad",
            repository__owner__unencrypted_oauth_token="test7lk5ndmtqzxlx06rip65nac9c7epqopclnoy",
            repository__owner__username="joseph-sentry",
            repository__owner__service="github",
            repository__name="codecov-demo",
        )
        dbsession.add(commit)
        dbsession.flush()
        current_report_row = CommitReport(commit_id=commit.id_)
        dbsession.add(current_report_row)
        dbsession.flush()
    
        test_id = generate_test_id(
            repoid,
            "pytest",
            "api.temp.calculator.test_calculator\x1ftest_divide",
            "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
        )
        existing_test = Test(
            repoid=repoid,
            flags_hash="e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
            name="api.temp.calculator.test_calculator\x1ftest_divide",
            testsuite="pytest",
            id_=test_id,
        )
        dbsession.add(existing_test)
        dbsession.flush()
    
>       result = TestResultsProcessorTask().run_impl(
            dbsession,
            repoid=repoid,
            commitid=commit.commitid,
            commit_yaml={"codecov": {"max_report_age": False}},
            arguments_list=redis_queue,
        )

.../tests/unit/test_test_results_processor_task.py:340: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tasks/test_results_processor.py:138: in run_impl
    result = self.process_individual_upload(
tasks/test_results_processor.py:373: in process_individual_upload
    self.process_individual_arg(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <@task: app.tasks.test_results.TestResultsProcessor of tasks at 0x7f6291523b60>
upload = <database.models.reports.Upload object at 0x7f6284800c20>
repository = Repo<120>

    def process_individual_arg(
        self, upload: Upload, repository
    ) -> TestResultsProcessingResult:
        archive_service = ArchiveService(repository)
    
        payload_bytes = archive_service.read_file(upload.storage_path)
        try:
            data = json.loads(payload_bytes)
        except json.JSONDecodeError as e:
            with sentry_sdk.new_scope() as scope:
                scope.set_extra("upload_state", upload.state)
                scope.set_extra("contents", payload_bytes[:10])
                sentry_sdk.capture_message("Upload payload is not valid JSON")
    
        parsing_results: list[ParsingInfo] = []
    
        network: list[str] | None = data.get("network_files")
    
        report_contents = []
    
        for file_dict in data["test_results_files"]:
            file = file_dict["data"]
            file_bytes = BytesIO(zlib.decompress(base64.b64decode(file)))
            report_contents.append(
                ReadableFile(path=file_dict["filename"], contents=file_bytes.getvalue())
            )
            try:
                parsing_results.append(self.parse_single_file(file_bytes))
            except ParserFailureError as exc:
                log.error(
                    exc.err_msg,
                    extra=dict(
                        repoid=upload.report.commit.repoid,
                        commitid=upload.report.commit_id,
                        uploadid=upload.id,
                        file_content=exc.file_content,
                        parser_err_msg=exc.parser_err_msg,
                    ),
                )
                with sentry_sdk.new_scope() as scope:
                    scope.set_extra("upload_state", upload.state)
                    scope.set_extra("parser_error", exc.parser_err_msg)
                    sentry_sdk.capture_message("Test results parser error")
                    upload.state = "has_failed"
    
        if upload.state != "has_failed":
            upload.state = "processed"
    
>       upload.save()
E       AttributeError: 'Upload' object has no attribute 'save'

tasks/test_results_processor.py:475: AttributeError
tasks.tests.unit.test_test_results_processor_task.TestUploadTestProcessorTask::test_upload_processor_task_call_network
Stack Traces | 0.08s run time
self = <worker.tasks.tests.unit.test_test_results_processor_task.TestUploadTestProcessorTask object at 0x7f62854f6930>
mocker = <pytest_mock.plugin.MockFixture object at 0x7f627eeacdd0>
mock_configuration = <shared.config.ConfigHelper object at 0x7f627eeae7b0>
dbsession = <sqlalchemy.orm.session.Session object at 0x7f627eeac3b0>
codecov_vcr = <vcr.cassette.Cassette object at 0x7f627eeac230>
mock_storage = <shared.storage.memory.MemoryStorageService object at 0x7f627d7f7e90>
mock_redis = <MagicMock name='_get_redis_instance_from_url()' id='140060989021200'>
celery_app = <Celery celery.tests at 0x7f62846465d0>

    @pytest.mark.integration
    def test_upload_processor_task_call_network(
        self,
        mocker,
        mock_configuration,
        dbsession,
        codecov_vcr,
        mock_storage,
        mock_redis,
        celery_app,
    ):
        tests = dbsession.query(Test).all()
        test_instances = dbsession.query(TestInstance).all()
        assert len(tests) == 0
        assert len(test_instances) == 0
    
        url = ".../C3C4715CA57C910D11D5EB899FC86A7E/4c4e4654ac25037ae869caeb3619d485970b6304/a84d445c-9c1e-434f-8275-f18f1f320f81.txt"
        with open(
            here.parent.parent / "samples" / "sample_test_missing_network.json"
        ) as f:
            content = f.read()
            mock_storage.write_file("archive", url, content)
        upload = UploadFactory.create(storage_path=url)
        dbsession.add(upload)
        dbsession.flush()
        redis_queue = [{"url": url, "upload_id": upload.id_}]
        mocker.patch.object(TestResultsProcessorTask, "app", celery_app)
    
        commit = CommitFactory.create(
            message="hello world",
            commitid="cd76b0821854a780b60012aed85af0a8263004ad",
            repository__owner__unencrypted_oauth_token="test7lk5ndmtqzxlx06rip65nac9c7epqopclnoy",
            repository__owner__username="joseph-sentry",
            repository__owner__service="github",
            repository__name="codecov-demo",
        )
        dbsession.add(commit)
        dbsession.flush()
        current_report_row = CommitReport(commit_id=commit.id_)
        dbsession.add(current_report_row)
        dbsession.flush()
>       result = TestResultsProcessorTask().run_impl(
            dbsession,
            repoid=upload.report.commit.repoid,
            commitid=commit.commitid,
            commit_yaml={"codecov": {"max_report_age": False}},
            arguments_list=redis_queue,
        )

.../tests/unit/test_test_results_processor_task.py:691: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tasks/test_results_processor.py:138: in run_impl
    result = self.process_individual_upload(
tasks/test_results_processor.py:373: in process_individual_upload
    self.process_individual_arg(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <@task: app.tasks.test_results.TestResultsProcessor of tasks at 0x7f6291523b60>
upload = <database.models.reports.Upload object at 0x7f62842c0740>
repository = Repo<125>

    def process_individual_arg(
        self, upload: Upload, repository
    ) -> TestResultsProcessingResult:
        archive_service = ArchiveService(repository)
    
        payload_bytes = archive_service.read_file(upload.storage_path)
        try:
            data = json.loads(payload_bytes)
        except json.JSONDecodeError as e:
            with sentry_sdk.new_scope() as scope:
                scope.set_extra("upload_state", upload.state)
                scope.set_extra("contents", payload_bytes[:10])
                sentry_sdk.capture_message("Upload payload is not valid JSON")
    
        parsing_results: list[ParsingInfo] = []
    
        network: list[str] | None = data.get("network_files")
    
        report_contents = []
    
        for file_dict in data["test_results_files"]:
            file = file_dict["data"]
            file_bytes = BytesIO(zlib.decompress(base64.b64decode(file)))
            report_contents.append(
                ReadableFile(path=file_dict["filename"], contents=file_bytes.getvalue())
            )
            try:
                parsing_results.append(self.parse_single_file(file_bytes))
            except ParserFailureError as exc:
                log.error(
                    exc.err_msg,
                    extra=dict(
                        repoid=upload.report.commit.repoid,
                        commitid=upload.report.commit_id,
                        uploadid=upload.id,
                        file_content=exc.file_content,
                        parser_err_msg=exc.parser_err_msg,
                    ),
                )
                with sentry_sdk.new_scope() as scope:
                    scope.set_extra("upload_state", upload.state)
                    scope.set_extra("parser_error", exc.parser_err_msg)
                    sentry_sdk.capture_message("Test results parser error")
                    upload.state = "has_failed"
    
        if upload.state != "has_failed":
            upload.state = "processed"
    
>       upload.save()
E       AttributeError: 'Upload' object has no attribute 'save'

tasks/test_results_processor.py:475: AttributeError




github-actions bot commented Nov 18, 2024

✅ All tests successful. No failed tests were found.


tasks/test_results_processor.py:
-        data = json.loads(payload_bytes)
+        try:
+            data = json.loads(payload_bytes)
+        except json.JSONDecodeError as e:
Contributor


Could this be the only exception json.loads gets?

Contributor Author


Maybe not, but it's the only one we care about in this case.
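For context on what json.loads can raise: JSONDecodeError covers malformed JSON text, but byte payloads that are not valid UTF-8 raise UnicodeDecodeError, and unsupported input types raise TypeError. A small standalone demonstration:

    import json

    for payload in ("{not json", b'{"a": "\x98"}', None):
        try:
            json.loads(payload)
        except json.JSONDecodeError as e:
            print("JSONDecodeError:", e)     # malformed JSON text
        except UnicodeDecodeError as e:
            print("UnicodeDecodeError:", e)  # bytes that are not valid UTF-8
        except TypeError as e:
            print("TypeError:", e)           # not str, bytes, or bytearray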

Contributor

@Swatinem left a comment


For both of these I would use capture_exception, as you do have an exception object present.
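Applied to the handler in this PR, the suggestion would look roughly like the sketch below (illustrative names, not the merged diff); sentry_sdk.capture_exception attaches the exception object and its stack trace to the event rather than a bare message:

    import json

    import sentry_sdk

    def load_payload(payload_bytes: bytes, upload_state: str) -> dict | None:
        # Same handler shape as in the PR, but reporting the exception object
        # itself via capture_exception, as suggested in the review.
        try:
            return json.loads(payload_bytes)
        except json.JSONDecodeError as e:
            with sentry_sdk.new_scope() as scope:
                scope.set_extra("upload_state", upload_state)
                scope.set_extra("contents", payload_bytes[:10])
                sentry_sdk.capture_exception(e)
            return None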

@joseph-sentry added this pull request to the merge queue on Nov 20, 2024
Merged via the queue into main with commit 0a2733e on Nov 20, 2024
18 of 27 checks passed
@joseph-sentry deleted the joseph/json-decode-error branch on November 20, 2024 at 13:57

sentry-io bot commented Nov 21, 2024

Suspect Issues

This pull request was deployed and Sentry observed the following issues:

  • ‼️ ChordError: Dependency 70237d57-3767-4b5b-8033-043363bb75f1 raised UnicodeDecodeError('utf-8', b'\x03\x00\x08... app.tasks.test_results.TestResultsProcessor
  • ‼️ UnicodeDecodeError: 'utf-8' codec can't decode byte 0x98 in position 4: invalid start byte app.tasks.test_results.TestResultsProcessor
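Both issues are UnicodeDecodeError rather than JSONDecodeError, so they are not caught by the new handler. If they originate from the same json.loads call, a hypothetical follow-up (not part of this PR) would be to widen the except clause so non-UTF-8 payloads are reported the same way instead of failing the task:

    import json

    import sentry_sdk

    def load_payload(payload_bytes: bytes) -> dict | None:
        # Hypothetical widened handler: json.loads raises UnicodeDecodeError for
        # byte payloads that are not valid UTF-8, which JSONDecodeError does not cover.
        try:
            return json.loads(payload_bytes)
        except (json.JSONDecodeError, UnicodeDecodeError) as e:
            sentry_sdk.capture_exception(e)
            return None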

