
Add skip tf crit unit #561

Merged · 7 commits · Aug 25, 2022
Conversation

jperez999 (Collaborator)

Move the TensorFlow import-or-skip up so that it runs before any attempt to import TensorFlow through Merlin.
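The ordering fix described above can be sketched as follows. In a pytest module the guard is normally `pytest.importorskip("tensorflow")`, placed before any Merlin import that pulls in TensorFlow at import time; the stdlib-only helper below illustrates the same availability check without assuming pytest is installed (the Merlin import shown in the comment is a hypothetical example, not the exact module this PR touches):

```python
import importlib.util


def tf_available() -> bool:
    """True if TensorFlow can be imported in this environment (sketch)."""
    return importlib.util.find_spec("tensorflow") is not None


# In a pytest test module, the equivalent guard must come FIRST:
#     tf = pytest.importorskip("tensorflow")   # skips the module if TF is absent
#     from merlin.systems.dag.ops import tensorflow_op   # hypothetical TF-backed import
print(tf_available())
```

If the guard runs after the Merlin import, the test module raises `ImportError` during collection instead of being skipped, which is exactly the failure mode this PR avoids.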

@karlhigley karlhigley added chore Infrastructure update ci labels Aug 25, 2022
@karlhigley karlhigley added this to the Merlin 22.08 milestone Aug 25, 2022
@github-actions

Documentation preview

https://nvidia-merlin.github.io/Merlin/review/pr-561

@nvidia-merlin-bot (Contributor)

CI Results
GitHub pull request #561 of commit 86aba158dbfa8ca6aba92aff92d86417142e068d, no merge conflicts.
Running as SYSTEM
Setting status of 86aba158dbfa8ca6aba92aff92d86417142e068d to PENDING with url https://10.20.13.93:8080/job/merlin_merlin/373/console and message: 'Pending'
Using context: Jenkins
Building on master in workspace /var/jenkins_home/workspace/merlin_merlin
using credential systems-login
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/Merlin # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/Merlin
 > git --version # timeout=10
using GIT_ASKPASS to set credentials login for merlin-systems
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/Merlin +refs/pull/561/*:refs/remotes/origin/pr/561/* # timeout=10
 > git rev-parse 86aba158dbfa8ca6aba92aff92d86417142e068d^{commit} # timeout=10
Checking out Revision 86aba158dbfa8ca6aba92aff92d86417142e068d (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 86aba158dbfa8ca6aba92aff92d86417142e068d # timeout=10
Commit message: "add import or skip BEFORE trying import of merlin tf"
 > git rev-list --no-walk 9de242652d08e80e3789ae54510fc57efe3bce00 # timeout=10
[merlin_merlin] $ /bin/bash /tmp/jenkins14404154438885135386.sh
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-7.1.2, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/merlin_merlin/merlin
plugins: anyio-3.6.1, xdist-2.5.0, forked-1.4.0, cov-3.0.0
collected 3 items

tests/unit/test_version.py . [ 33%]
tests/unit/examples/test_building_deploying_multi_stage_RecSys.py s [ 66%]
tests/unit/examples/test_scaling_criteo_merlin_models.py . [100%]

=============================== warnings summary ===============================
../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33
/usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
DASK_VERSION = LooseVersion(dask.__version__)

../../../.local/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: 34 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
other = LooseVersion(other)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
============ 2 passed, 1 skipped, 35 warnings in 112.67s (0:01:52) =============
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://api.github.com/repos/NVIDIA-Merlin/Merlin/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[merlin_merlin] $ /bin/bash /tmp/jenkins1443020115543743774.sh
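The `DeprecationWarning` in the log above recommends `packaging.version` over distutils' `LooseVersion`; `packaging.version.Version` is the drop-in replacement. For plain dotted numeric versions (no pre-release tags), a rough stdlib-only stand-in shows why string comparison is not enough:

```python
def vtuple(version: str) -> tuple:
    """Parse '2022.5.0' into (2022, 5, 0) for numeric, per-component comparison."""
    return tuple(int(part) for part in version.split("."))


# String comparison gets "1.10" < "1.9" wrong; tuple comparison does not:
print(vtuple("1.10") > vtuple("1.9"))          # True
print(vtuple("2022.5.0") > vtuple("2021.11.2"))  # True
```

This is a sketch only; `packaging.version.Version` additionally handles pre-releases, epochs, and local version segments per PEP 440.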

@nvidia-merlin-bot (Contributor)

CI Results
GitHub pull request #561 of commit f390f3aca3d0344033d7e7c0a8ff8cec991c8b68, no merge conflicts.
Running as SYSTEM
Setting status of f390f3aca3d0344033d7e7c0a8ff8cec991c8b68 to PENDING with url https://10.20.13.93:8080/job/merlin_merlin/374/console and message: 'Pending'
Using context: Jenkins
Building on master in workspace /var/jenkins_home/workspace/merlin_merlin
using credential systems-login
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/Merlin # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/Merlin
 > git --version # timeout=10
using GIT_ASKPASS to set credentials login for merlin-systems
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/Merlin +refs/pull/561/*:refs/remotes/origin/pr/561/* # timeout=10
 > git rev-parse f390f3aca3d0344033d7e7c0a8ff8cec991c8b68^{commit} # timeout=10
Checking out Revision f390f3aca3d0344033d7e7c0a8ff8cec991c8b68 (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f390f3aca3d0344033d7e7c0a8ff8cec991c8b68 # timeout=10
Commit message: "Merge branch 'main' into add-skip-tf-crit-unit"
 > git rev-list --no-walk 86aba158dbfa8ca6aba92aff92d86417142e068d # timeout=10
[merlin_merlin] $ /bin/bash /tmp/jenkins12546766281865589515.sh
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-7.1.2, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/merlin_merlin/merlin
plugins: anyio-3.6.1, xdist-2.5.0, forked-1.4.0, cov-3.0.0
collected 3 items

tests/unit/test_version.py . [ 33%]
tests/unit/examples/test_building_deploying_multi_stage_RecSys.py s [ 66%]
tests/unit/examples/test_scaling_criteo_merlin_models.py F [100%]

=================================== FAILURES ===================================
__________________________________ test_func ___________________________________

def test_func():
    with testbook(
        REPO_ROOT / "examples" / "scaling-criteo" / "02-ETL-with-NVTabular.ipynb",
        execute=False,
        timeout=180,
    ) as tb1:
        tb1.inject(
            """
            import os
            os.environ["BASE_DIR"] = "/tmp/input/criteo/"
            os.environ["INPUT_DATA_DIR"] = "/tmp/input/criteo/"
            os.environ["OUTPUT_DATA_DIR"] = "/tmp/output/criteo/"
            os.system("mkdir -p /tmp/input/criteo")
            os.system("mkdir -p /tmp/output/criteo")

            from merlin.datasets.synthetic import generate_data

            train, valid = generate_data("criteo", int(1000000), set_sizes=(0.7, 0.3))

            train.to_ddf().compute().to_parquet('/tmp/input/criteo/day_0.parquet')
            valid.to_ddf().compute().to_parquet('/tmp/input/criteo/day_1.parquet')
            """
        )
      tb1.execute()

tests/unit/examples/test_scaling_criteo_merlin_models.py:37:


/usr/local/lib/python3.8/dist-packages/testbook/client.py:147: in execute
super().execute_cell(cell, index)
/usr/local/lib/python3.8/dist-packages/nbclient/util.py:85: in wrapped
return just_run(coro(*args, **kwargs))
/usr/local/lib/python3.8/dist-packages/nbclient/util.py:60: in just_run
return loop.run_until_complete(coro)
/usr/lib/python3.8/asyncio/base_events.py:616: in run_until_complete
return future.result()
/usr/local/lib/python3.8/dist-packages/nbclient/client.py:1025: in async_execute_cell
await self._check_raise_for_error(cell, cell_index, exec_reply)


self = <testbook.client.TestbookNotebookClient object at 0x7ff132c16bb0>
cell = {'id': '51b61bc9', 'cell_type': 'code', 'metadata': {'execution': {'iopub.status.busy': '2022-08-25T17:32:19.875001Z',...ry: CUDA error at: /usr/include/rmm/mr/device/cuda_memory_resource.hpp:70: cudaErrorMemoryAllocation out of memory']}]}
cell_index = 27
exec_reply = {'buffers': [], 'content': {'ename': 'MemoryError', 'engine_info': {'engine_id': -1, 'engine_uuid': '0b3ce05b-6353-4be...e, 'engine': '0b3ce05b-6353-4bec-9980-4986e72e8001', 'started': '2022-08-25T17:32:19.875274Z', 'status': 'error'}, ...}

async def _check_raise_for_error(
    self, cell: NotebookNode, cell_index: int, exec_reply: t.Optional[t.Dict]
) -> None:

    if exec_reply is None:
        return None

    exec_reply_content = exec_reply['content']
    if exec_reply_content['status'] != 'error':
        return None

    cell_allows_errors = (not self.force_raise_errors) and (
        self.allow_errors
        or exec_reply_content.get('ename') in self.allow_error_names
        or "raises-exception" in cell.metadata.get("tags", [])
    )
    await run_hook(
        self.on_cell_error, cell=cell, cell_index=cell_index, execute_reply=exec_reply
    )
    if not cell_allows_errors:
      raise CellExecutionError.from_cell_and_msg(cell, exec_reply_content)

E nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
E ------------------
E
E import os
E os.environ["BASE_DIR"] = "/tmp/input/criteo/"
E os.environ["INPUT_DATA_DIR"] = "/tmp/input/criteo/"
E os.environ["OUTPUT_DATA_DIR"] = "/tmp/output/criteo/"
E os.system("mkdir -p /tmp/input/criteo")
E os.system("mkdir -p /tmp/output/criteo")
E
E from merlin.datasets.synthetic import generate_data
E
E train, valid = generate_data("criteo", int(1000000), set_sizes=(0.7, 0.3))
E
E train.to_ddf().compute().to_parquet('/tmp/input/criteo/day_0.parquet')
E valid.to_ddf().compute().to_parquet('/tmp/input/criteo/day_1.parquet')
E
E ------------------
E
E ---------------------------------------------------------------------------
E MemoryError                               Traceback (most recent call last)
E Input In [16], in <cell line: 12>()
E       8 from merlin.datasets.synthetic import generate_data
E      10 train, valid = generate_data("criteo", int(1000000), set_sizes=(0.7, 0.3))
E ---> 12 train.to_ddf().compute().to_parquet('/tmp/input/criteo/day_0.parquet')
E      13 valid.to_ddf().compute().to_parquet('/tmp/input/criteo/day_1.parquet')
E
E File /usr/local/lib/python3.8/dist-packages/cudf/core/dataframe.py:5535, in DataFrame.to_parquet(self, path, *args, **kwargs)
E    5532 """{docstring}"""
E    5533 from cudf.io import parquet as pq
E -> 5535 return pq.to_parquet(self, path, *args, **kwargs)
E
E File /usr/local/lib/python3.8/dist-packages/nvtx/nvtx.py:101, in annotate.__call__.<locals>.inner(*args, **kwargs)
E      98 @wraps(func)
E      99 def inner(*args, **kwargs):
E     100     libnvtx_push_range(self.attributes, self.domain.handle)
E --> 101     result = func(*args, **kwargs)
E     102     libnvtx_pop_range(self.domain.handle)
E     103     return result
E
E File /usr/local/lib/python3.8/dist-packages/cudf/io/parquet.py:609, in to_parquet(df, path, engine, compression, index, partition_cols, partition_file_name, partition_offsets, statistics, metadata_file_path, int96_timestamps, row_group_size_bytes, row_group_size_rows, *args, **kwargs)
E     601 if partition_offsets:
E     602     kwargs["partitions_info"] = list(
E     603         zip(
E     604             partition_offsets,
E     605             np.roll(partition_offsets, -1) - partition_offsets,
E     606         )
E     607     )[:-1]
E --> 609 return _write_parquet(
E     610     df,
E     611     paths=path if is_list_like(path) else [path],
E     612     compression=compression,
E     613     index=index,
E     614     statistics=statistics,
E     615     metadata_file_path=metadata_file_path,
E     616     int96_timestamps=int96_timestamps,
E     617     row_group_size_bytes=row_group_size_bytes,
E     618     row_group_size_rows=row_group_size_rows,
E     619     **kwargs,
E     620 )
E     622 else:
E     623     if partition_offsets is not None:
E
E File /usr/local/lib/python3.8/dist-packages/nvtx/nvtx.py:101, in annotate.__call__.<locals>.inner(*args, **kwargs)
E      98 @wraps(func)
E      99 def inner(*args, **kwargs):
E     100     libnvtx_push_range(self.attributes, self.domain.handle)
E --> 101     result = func(*args, **kwargs)
E     102     libnvtx_pop_range(self.domain.handle)
E     103     return result
E
E File /usr/local/lib/python3.8/dist-packages/cudf/io/parquet.py:69, in _write_parquet(df, paths, compression, index, statistics, metadata_file_path, int96_timestamps, row_group_size_bytes, row_group_size_rows, partitions_info, **kwargs)
E      65     write_parquet_res = libparquet.write_parquet(
E      66         df, filepaths_or_buffers=file_objs, **common_args
E      67     )
E      68 else:
E ---> 69     write_parquet_res = libparquet.write_parquet(
E      70         df, filepaths_or_buffers=paths_or_bufs, **common_args
E      71     )
E      73 return write_parquet_res
E
E File cudf/_lib/parquet.pyx:287, in cudf._lib.parquet.write_parquet()
E File cudf/_lib/parquet.pyx:397, in cudf._lib.parquet.write_parquet()
E
E MemoryError: std::bad_alloc: out_of_memory: CUDA error at: /usr/include/rmm/mr/device/cuda_memory_resource.hpp:70: cudaErrorMemoryAllocation out of memory

/usr/local/lib/python3.8/dist-packages/nbclient/client.py:919: CellExecutionError
----------------------------- Captured stderr call -----------------------------
2022-08-25 17:32:11,289 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
2022-08-25 17:32:11,319 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
/usr/lib/python3.8/multiprocessing/resource_tracker.py:216: UserWarning: resource_tracker: There appear to be 30 leaked semaphore objects to clean up at shutdown
warnings.warn('resource_tracker: There appear to be %d '
=============================== warnings summary ===============================
../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33
/usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
DASK_VERSION = LooseVersion(dask.__version__)

../../../.local/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: 34 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
other = LooseVersion(other)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================== short test summary info ============================
FAILED tests/unit/examples/test_scaling_criteo_merlin_models.py::test_func - ...
============= 1 failed, 1 passed, 1 skipped, 35 warnings in 25.26s =============
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://api.github.com/repos/NVIDIA-Merlin/Merlin/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[merlin_merlin] $ /bin/bash /tmp/jenkins3318234066103153164.sh
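The `_check_raise_for_error` source shown in the failing traceback above explains when nbclient turns a cell error into a `CellExecutionError`. A minimal, standalone restatement of that gate (names mirror the snippet; this is illustrative, not the library code):

```python
def cell_allows_errors(allow_errors: bool, allow_error_names: list,
                       tags: list, ename: str,
                       force_raise_errors: bool = False) -> bool:
    """Mirror of nbclient's error gate: a failing cell is tolerated only if
    errors are globally allowed, the exception name is whitelisted, or the
    cell carries the 'raises-exception' tag."""
    return (not force_raise_errors) and (
        allow_errors
        or ename in allow_error_names
        or "raises-exception" in tags
    )


# The criteo notebook cell met none of these conditions, so its
# MemoryError failed the build:
print(cell_allows_errors(False, [], [], "MemoryError"))                    # False
print(cell_allows_errors(False, [], ["raises-exception"], "MemoryError"))  # True
```

Note this only controls whether the run aborts; it would not have fixed the underlying CUDA out-of-memory allocation, which the successful rerun suggests was transient GPU contention.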

@jperez999 (Collaborator, Author)

rerun tests

1 similar comment

@nvidia-merlin-bot (Contributor)

CI Results
GitHub pull request #561 of commit f390f3aca3d0344033d7e7c0a8ff8cec991c8b68, no merge conflicts.
Running as SYSTEM
Setting status of f390f3aca3d0344033d7e7c0a8ff8cec991c8b68 to PENDING with url https://10.20.13.93:8080/job/merlin_merlin/375/console and message: 'Pending'
Using context: Jenkins
Building on master in workspace /var/jenkins_home/workspace/merlin_merlin
using credential systems-login
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/Merlin # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/Merlin
 > git --version # timeout=10
using GIT_ASKPASS to set credentials login for merlin-systems
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/Merlin +refs/pull/561/*:refs/remotes/origin/pr/561/* # timeout=10
 > git rev-parse f390f3aca3d0344033d7e7c0a8ff8cec991c8b68^{commit} # timeout=10
Checking out Revision f390f3aca3d0344033d7e7c0a8ff8cec991c8b68 (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f390f3aca3d0344033d7e7c0a8ff8cec991c8b68 # timeout=10
Commit message: "Merge branch 'main' into add-skip-tf-crit-unit"
 > git rev-list --no-walk f390f3aca3d0344033d7e7c0a8ff8cec991c8b68 # timeout=10
[merlin_merlin] $ /bin/bash /tmp/jenkins16411461142552130985.sh
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-7.1.2, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/merlin_merlin/merlin
plugins: anyio-3.6.1, xdist-2.5.0, forked-1.4.0, cov-3.0.0
collected 3 items

tests/unit/test_version.py . [ 33%]
tests/unit/examples/test_building_deploying_multi_stage_RecSys.py s [ 66%]
tests/unit/examples/test_scaling_criteo_merlin_models.py . [100%]

=============================== warnings summary ===============================
../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33
/usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
DASK_VERSION = LooseVersion(dask.__version__)

../../../.local/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: 34 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
other = LooseVersion(other)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
============ 2 passed, 1 skipped, 35 warnings in 110.73s (0:01:50) =============
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://api.github.com/repos/NVIDIA-Merlin/Merlin/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[merlin_merlin] $ /bin/bash /tmp/jenkins11969332289813395111.sh

jperez999 merged commit f2578d3 into NVIDIA-Merlin:main on Aug 25, 2022
Labels: chore (Infrastructure update), ci
Projects: None yet
3 participants