
Remove hugectr test #771

Merged
merged 2 commits on Dec 20, 2022

Conversation

jperez999
Collaborator

This PR removes the test that currently fails because of a data type mismatch between the data generated by Merlin and what HugeCTR expects. The test is unnecessary, so we are removing it: HugeCTR has its own data generation and has verified that it can consume the expected data types correctly.
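For context, the failure was this kind of dtype-consistency check: compare the dtypes of generated data against the dtypes the consumer expects, column by column. This is a minimal hypothetical sketch; the column names, dtypes, and the `mismatched_dtypes` helper are illustrative and not taken from the removed test or either code base.

```python
def mismatched_dtypes(generated: dict, expected: dict) -> list:
    """Return the columns whose generated dtype string differs from the expected one."""
    return [col for col in expected if generated.get(col) != expected[col]]

# A Merlin-style generator might emit int32 categorical IDs...
generated = {"user_id": "int32", "item_id": "int32", "click": "float32"}
# ...while a HugeCTR-style consumer expects int64 keys.
expected = {"user_id": "int64", "item_id": "int64", "click": "float32"}

print(mismatched_dtypes(generated, expected))  # ['user_id', 'item_id']
```

Any non-empty result here corresponds to the kind of mismatch that made the test fail; since HugeCTR validates this on its own side, the duplicate check can be dropped.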

@karlhigley karlhigley added chore Infrastructure update ci 22.12 labels Dec 20, 2022
@karlhigley karlhigley added this to the Merlin 22.12 milestone Dec 20, 2022
@github-actions

Documentation preview

https://nvidia-merlin.github.io/Merlin/review/pr-771

@nvidia-merlin-bot
Contributor

CI Results
GitHub pull request #771 of commit 4b000d14ad86113507e68970929a4bdcb7107abf, no merge conflicts.
Running as SYSTEM
Setting status of 4b000d14ad86113507e68970929a4bdcb7107abf to PENDING with url http://merlin-infra1.nvidia.com:8080/job/merlin_merlin/705/ and message: 'Pending'
Using context: Jenkins
Building on the built-in node in workspace /var/jenkins_home/jobs/merlin_merlin/workspace
using credential systems-login
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/Merlin # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/Merlin
 > git --version # timeout=10
using GIT_ASKPASS to set credentials login for merlin-systems
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/Merlin +refs/pull/771/*:refs/remotes/origin/pr/771/* # timeout=10
 > git rev-parse 4b000d14ad86113507e68970929a4bdcb7107abf^{commit} # timeout=10
Checking out Revision 4b000d14ad86113507e68970929a4bdcb7107abf (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4b000d14ad86113507e68970929a4bdcb7107abf # timeout=10
Commit message: "remove hugectr test"
 > git rev-list --no-walk 9d6ed0540beb47608af1d13c8dac8342d11293d4 # timeout=10
[workspace] $ /bin/bash /tmp/jenkins7008387208594834573.sh
GLOB sdist-make: /var/jenkins_home/workspace/merlin_merlin/merlin/setup.py
test-gpu recreate: /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu
test-gpu installdeps: pytest, pytest-cov
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu inst: /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/.tmp/package/1/merlin-0.0.1.zip
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu installed: absl-py==1.2.0,aiohttp==3.8.1,aiosignal==1.2.0,alabaster==0.7.12,alembic==1.9.0,anyio==3.6.1,argon2-cffi==21.3.0,argon2-cffi-bindings==21.2.0,astroid==2.5.6,asttokens==2.0.8,astunparse==1.6.3,asv==0.5.1,asvdb==0.4.2,async-timeout==4.0.2,attrs==22.1.0,autopage==0.5.1,awscli==1.27.33,Babel==2.10.3,backcall==0.2.0,beautifulsoup4==4.11.1,betterproto==1.2.5,black==22.6.0,bleach==5.0.1,boto3==1.24.75,botocore==1.29.33,Brotli==1.0.9,cachetools==5.2.0,certifi==2019.11.28,cffi==1.15.1,chardet==3.0.4,charset-normalizer==2.1.1,clang==5.0,click==8.1.3,cliff==4.1.0,cloudpickle==2.2.0,cmaes==0.9.0,cmake==3.24.1.1,cmd2==2.4.2,colorama==0.4.4,colorlog==6.7.0,contourpy==1.0.5,coverage==7.0.0,cuda-python==11.7.1,cupy-cuda117==10.6.0,cycler==0.11.0,Cython==0.29.32,dask==2022.1.1,dbus-python==1.2.16,debugpy==1.6.3,decorator==5.1.1,defusedxml==0.7.1,dill==0.3.5.1,distlib==0.3.6,distributed==2022.5.1,distro==1.7.0,dm-tree==0.1.6,docker-pycreds==0.4.0,docutils==0.16,emoji==1.7.0,entrypoints==0.4,execnet==1.9.0,executing==1.0.0,faiss==1.7.2,faiss-gpu==1.7.2,fastai==2.7.9,fastapi==0.85.0,fastavro==1.6.1,fastcore==1.5.27,fastdownload==0.0.7,fastjsonschema==2.16.1,fastprogress==1.0.3,fastrlock==0.8,feast==0.19.4,fiddle==0.2.2,filelock==3.8.0,flatbuffers==1.12,fonttools==4.37.3,frozenlist==1.3.1,fsspec==2022.5.0,gast==0.4.0,gevent==21.12.0,geventhttpclient==2.0.2,gitdb==4.0.9,GitPython==3.1.27,google==3.0.0,google-api-core==2.10.1,google-auth==2.11.1,google-auth-oauthlib==0.4.6,google-pasta==0.2.0,googleapis-common-protos==1.52.0,graphviz==0.20.1,greenlet==1.1.3,grpcio==1.41.0,grpcio-channelz==1.49.0,grpcio-reflection==1.48.1,grpclib==0.4.3,h11==0.13.0,h2==4.1.0,h5py==3.7.0,HeapDict==1.0.1,horovod==0.26.1,hpack==4.0.0,httptools==0.5.0,hugectr2onnx==0.0.0,huggingface-hub==0.9.1,hyperframe==6.0.1,idna==2.8,imagesize==1.4.1,implicit==0.6.1,importlib-metadata==4.12.0,importlib-resources==5.9.0,iniconfig==1.1.1,ipykernel==6.15.3,ipython==8.5.0,ipython-genutils==0.2.0,ipywidgets=
=7.7.0,jedi==0.18.1,Jinja2==3.1.2,jmespath==1.0.1,joblib==1.2.0,json5==0.9.10,jsonschema==4.16.0,jupyter-cache==0.4.3,jupyter-core==4.11.1,jupyter-server==1.18.1,jupyter-server-mathjax==0.2.5,jupyter-sphinx==0.3.2,jupyter_client==7.3.5,jupyterlab==3.4.7,jupyterlab-pygments==0.2.2,jupyterlab-widgets==1.1.0,jupyterlab_server==2.15.1,keras==2.9.0,Keras-Preprocessing==1.1.2,kiwisolver==1.4.4,lazy-object-proxy==1.8.0,libclang==14.0.6,libcst==0.4.7,lightfm==1.16,lightgbm==3.3.2,linkify-it-py==1.0.3,llvmlite==0.39.1,locket==1.0.0,lxml==4.9.1,Mako==1.2.4,Markdown==3.4.1,markdown-it-py==1.1.0,MarkupSafe==2.1.1,matplotlib==3.6.0,matplotlib-inline==0.1.6,mdit-py-plugins==0.2.8,merlin==0.0.1,merlin-core==0.6.0+1.g5926fcf,merlin-models==0.7.0+11.g280956aa4,merlin-systems==0.5.0+4.g15074ad,mistune==2.0.4,mmh3==3.0.0,mpi4py==3.1.3,msgpack==1.0.4,multidict==6.0.2,mypy-extensions==0.4.3,myst-nb==0.13.2,myst-parser==0.15.2,natsort==8.1.0,nbclassic==0.4.3,nbclient==0.6.8,nbconvert==7.0.0,nbdime==3.1.1,nbformat==5.5.0,nest-asyncio==1.5.5,ninja==1.10.2.3,notebook==6.4.12,notebook-shim==0.1.0,numba==0.56.2,numpy==1.22.4,nvidia-pyindex==1.0.9,# Editable install with no version control (nvtabular==1.4.0+8.g95e12d347),-e 
/usr/local/lib/python3.8/dist-packages,nvtx==0.2.5,oauthlib==3.2.1,oldest-supported-numpy==2022.8.16,onnx==1.12.0,onnxruntime==1.11.1,opt-einsum==3.3.0,optuna==3.0.5,packaging==21.3,pandas==1.3.5,pandavro==1.5.2,pandocfilters==1.5.0,parso==0.8.3,partd==1.3.0,pathtools==0.1.2,pbr==5.11.0,pexpect==4.8.0,pickleshare==0.7.5,Pillow==9.2.0,pkgutil_resolve_name==1.3.10,platformdirs==2.5.2,plotly==5.11.0,pluggy==1.0.0,prettytable==3.5.0,prometheus-client==0.14.1,promise==2.3,prompt-toolkit==3.0.31,proto-plus==1.19.6,protobuf==3.19.5,psutil==5.9.2,ptyprocess==0.7.0,pure-eval==0.2.2,py==1.11.0,pyarrow==7.0.0,pyasn1==0.4.8,pyasn1-modules==0.2.8,pybind11==2.10.0,pycparser==2.21,pydantic==1.10.2,pydot==1.4.2,Pygments==2.13.0,PyGObject==3.36.0,pynvml==11.4.1,pyparsing==3.0.9,pyperclip==1.8.2,pyrsistent==0.18.1,pytest==7.1.3,pytest-cov==4.0.0,pytest-xdist==3.1.0,python-apt==2.0.0+ubuntu0.20.4.8,python-dateutil==2.8.2,python-dotenv==0.21.0,python-rapidjson==1.8,pytz==2022.2.1,PyYAML==5.4.1,pyzmq==24.0.0,regex==2022.9.13,requests==2.22.0,requests-oauthlib==1.3.1,requests-unixsocket==0.2.0,rsa==4.7.2,s3fs==2022.2.0,s3transfer==0.6.0,sacremoses==0.0.53,scikit-build==0.15.0,scikit-learn==1.1.2,scipy==1.8.1,seedir==0.3.0,Send2Trash==1.8.0,sentry-sdk==1.9.8,setproctitle==1.3.2,setuptools-scm==7.0.5,shortuuid==1.0.9,six==1.15.0,sklearn==0.0,smmap==5.0.0,sniffio==1.3.0,snowballstemmer==2.2.0,sortedcontainers==2.4.0,soupsieve==2.3.2.post1,Sphinx==5.3.0,sphinx-multiversion==0.2.4,sphinx-togglebutton==0.3.1,sphinx_external_toc==0.3.0,sphinxcontrib-applehelp==1.0.2,sphinxcontrib-copydirs @ 
git+https://github.com/mikemckiernan/sphinxcontrib-copydirs.git@bd8c5d79b3f91cf5f1bb0d6995aeca3fe84b670e,sphinxcontrib-devhelp==1.0.2,sphinxcontrib-htmlhelp==2.0.0,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-qthelp==1.0.3,sphinxcontrib-serializinghtml==1.1.5,SQLAlchemy==1.4.45,stack-data==0.5.0,starlette==0.20.4,stevedore==4.1.1,stringcase==1.2.0,supervisor==4.1.0,tabulate==0.8.10,tblib==1.7.0,tdqm==0.0.1,tenacity==8.0.1,tensorboard==2.9.1,tensorboard-data-server==0.6.1,tensorboard-plugin-wit==1.8.1,tensorflow==2.9.2,tensorflow-estimator==2.9.0,tensorflow-gpu==2.9.2,tensorflow-io-gcs-filesystem==0.27.0,tensorflow-metadata==1.10.0,termcolor==2.0.1,terminado==0.15.0,testbook==0.4.2,threadpoolctl==3.1.0,tinycss2==1.1.1,tokenizers==0.10.3,toml==0.10.2,tomli==2.0.1,toolz==0.12.0,torch==1.12.1+cu113,torchmetrics==0.3.2,tornado==6.2,tox==3.26.0,tqdm==4.64.1,traitlets==5.4.0,transformers==4.12.0,transformers4rec==0.1.12+2.gbcc939255,treelite==2.3.0,treelite-runtime==2.3.0,tritonclient==2.25.0,typing-inspect==0.8.0,typing_extensions==4.3.0,uc-micro-py==1.0.1,urllib3==1.26.12,uvicorn==0.18.3,uvloop==0.17.0,versioneer==0.20,virtualenv==20.16.5,wandb==0.13.3,watchfiles==0.17.0,wcwidth==0.2.5,webencodings==0.5.1,websocket-client==1.4.1,websockets==10.3,Werkzeug==2.2.2,widgetsnbextension==3.6.0,wrapt==1.12.1,xgboost==1.6.2,yarl==1.8.1,zict==2.2.0,zipp==3.8.1,zope.event==4.5.0,zope.interface==5.4.0
test-gpu run-test-pre: PYTHONHASHSEED='4116269518'
test-gpu run-test: commands[0] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/systems.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/systems.git
  Cloning https://github.com/NVIDIA-Merlin/systems.git to /tmp/pip-req-build-ezf_067d
  Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/systems.git /tmp/pip-req-build-ezf_067d
  Resolved https://github.com/NVIDIA-Merlin/systems.git to commit df53c726ce5b90c00e1310cb80570a8622f8f785
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Collecting treelite-runtime==2.4.0
  Downloading treelite_runtime-2.4.0-py3-none-manylinux2014_x86_64.whl (191 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 191.5/191.5 kB 3.3 MB/s eta 0:00:00
Requirement already satisfied: merlin-core>=0.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-systems==0.7.0+39.gdf53c72) (0.3.0+12.g78ecddd)
Collecting treelite==2.4.0
  Downloading treelite-2.4.0-py3-none-manylinux2014_x86_64.whl (852 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 852.7/852.7 kB 16.5 MB/s eta 0:00:00
Requirement already satisfied: nvtabular>=1.0.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-systems==0.7.0+39.gdf53c72) (1.1.1)
Requirement already satisfied: requests<3,>=2.10 in /usr/lib/python3/dist-packages (from merlin-systems==0.7.0+39.gdf53c72) (2.22.0)
Requirement already satisfied: numpy in /var/jenkins_home/.local/lib/python3.8/site-packages (from treelite==2.4.0->merlin-systems==0.7.0+39.gdf53c72) (1.20.3)
Requirement already satisfied: scipy in /usr/local/lib/python3.8/dist-packages (from treelite==2.4.0->merlin-systems==0.7.0+39.gdf53c72) (1.8.1)
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (0.55.1)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (3.19.5)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.10.0)
Requirement already satisfied: distributed>=2021.11.2 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (2022.3.0)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.3.5)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.2.5)
Requirement already satisfied: dask>=2021.11.2 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (2022.3.0)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (7.0.0)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (4.64.1)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (21.3)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (0.4.3)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.2.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (2.2.0)
Requirement already satisfied: fsspec>=0.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (2022.5.0)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (0.12.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (5.4.1)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.2.0)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.7.0)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (2.4.0)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.0.4)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (6.1)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (3.1.2)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (2.0.0)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (5.8.0)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (8.1.3)
Collecting llvmlite<0.39,>=0.38.0rc1
  Downloading llvmlite-0.38.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (34.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 34.5/34.5 MB 58.6 MB/s eta 0:00:00
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (65.6.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (3.0.9)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (2.8.2)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (2022.2.1)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.2.0)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.52.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.0.1)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (4.1.0)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (6.0.2)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (2.1.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (4.0.0)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (6.0.1)
Building wheels for collected packages: merlin-systems
  Building wheel for merlin-systems (pyproject.toml): started
  Building wheel for merlin-systems (pyproject.toml): finished with status 'done'
  Created wheel for merlin-systems: filename=merlin_systems-0.7.0+39.gdf53c72-py3-none-any.whl size=100698 sha256=1a385fb1434e5f8078aed8048058529b417ffe03c2b564ca89b92a60017adcc1
  Stored in directory: /tmp/pip-ephem-wheel-cache-qu7528n3/wheels/d3/db/b8/99d510a979c278774eda4142f1c0643c93b7b2674aff321c16
Successfully built merlin-systems
Installing collected packages: llvmlite, treelite-runtime, treelite, merlin-systems
  Attempting uninstall: llvmlite
    Found existing installation: llvmlite 0.39.1
    Not uninstalling llvmlite at /usr/local/lib/python3.8/dist-packages, outside environment /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu
    Can't uninstall 'llvmlite'. No files were found to uninstall.
  Attempting uninstall: treelite-runtime
    Found existing installation: treelite-runtime 2.3.0
    Not uninstalling treelite-runtime at /usr/local/lib/python3.8/dist-packages, outside environment /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu
    Can't uninstall 'treelite-runtime'. No files were found to uninstall.
  Attempting uninstall: treelite
    Found existing installation: treelite 2.3.0
    Not uninstalling treelite at /usr/local/lib/python3.8/dist-packages, outside environment /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu
    Can't uninstall 'treelite'. No files were found to uninstall.
  Attempting uninstall: merlin-systems
    Found existing installation: merlin-systems 0.5.0+4.g15074ad
    Not uninstalling merlin-systems at /usr/local/lib/python3.8/dist-packages, outside environment /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu
    Can't uninstall 'merlin-systems'. No files were found to uninstall.
Successfully installed llvmlite-0.38.1 merlin-systems-0.7.0+39.gdf53c72 treelite-2.4.0 treelite-runtime-2.4.0
test-gpu run-test: commands[1] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/models.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/models.git
  Cloning https://github.com/NVIDIA-Merlin/models.git to /tmp/pip-req-build-fbzajarv
  Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/models.git /tmp/pip-req-build-fbzajarv
  Resolved https://github.com/NVIDIA-Merlin/models.git to commit d729f4a5df7b47574e702e539f0b9a3b8dce5e0d
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: merlin-core>=0.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-models==0.9.0+71.gd729f4a5) (0.3.0+12.g78ecddd)
Collecting merlin-dataloader>=0.0.2
  Downloading merlin-dataloader-0.0.3.tar.gz (48 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 48.3/48.3 kB 1.8 MB/s eta 0:00:00
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (0.55.1)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (3.19.5)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (1.10.0)
Requirement already satisfied: distributed>=2021.11.2 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (2022.3.0)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (1.3.5)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (1.2.5)
Requirement already satisfied: dask>=2021.11.2 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (2022.3.0)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (7.0.0)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (4.64.1)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (21.3)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (0.4.3)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (1.2.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (2.2.0)
Requirement already satisfied: fsspec>=0.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (2022.5.0)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (0.12.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (5.4.1)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (1.2.0)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (1.7.0)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (2.4.0)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (1.0.4)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (6.1)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (3.1.2)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (2.0.0)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (5.8.0)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (8.1.3)
Requirement already satisfied: llvmlite<0.39,>=0.38.0rc1 in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (0.38.1)
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (65.6.3)
Requirement already satisfied: numpy<1.22,>=1.18 in /var/jenkins_home/.local/lib/python3.8/site-packages (from numba>=0.54->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (1.20.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (3.0.9)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (2.8.2)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (2022.2.1)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (1.2.0)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (1.52.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (1.0.1)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (4.1.0)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (6.0.2)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (2.1.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (4.0.0)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core>=0.2.0->merlin-models==0.9.0+71.gd729f4a5) (6.0.1)
Building wheels for collected packages: merlin-models, merlin-dataloader
  Building wheel for merlin-models (pyproject.toml): started
  Building wheel for merlin-models (pyproject.toml): finished with status 'done'
  Created wheel for merlin-models: filename=merlin_models-0.9.0+71.gd729f4a5-py3-none-any.whl size=357235 sha256=25fe7655c18fd8f82b5382c7bfbf0bb890832c0b269af975a55f604a4e45d0c3
  Stored in directory: /tmp/pip-ephem-wheel-cache-g6tcpywq/wheels/5a/43/99/d50fe2c33b4f4686db73207ce3865e0d6be6609ffb03abade5
  Building wheel for merlin-dataloader (pyproject.toml): started
  Building wheel for merlin-dataloader (pyproject.toml): finished with status 'done'
  Created wheel for merlin-dataloader: filename=merlin_dataloader-0.0.3-py3-none-any.whl size=37647 sha256=d01ec595d780b69a24db9109f0ecb60dc3e47e1c7a1c042c4dba7570b2b048b5
  Stored in directory: /tmp/pip-ephem-wheel-cache-g6tcpywq/wheels/1c/a3/4a/0feebb30e0c8cb7ba7046544390b43c7017a2195232f5305a1
Successfully built merlin-models merlin-dataloader
Installing collected packages: merlin-dataloader, merlin-models
  Attempting uninstall: merlin-models
    Found existing installation: merlin-models 0.7.0+11.g280956aa4
    Not uninstalling merlin-models at /usr/local/lib/python3.8/dist-packages, outside environment /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu
    Can't uninstall 'merlin-models'. No files were found to uninstall.
Successfully installed merlin-dataloader-0.0.3 merlin-models-0.9.0+71.gd729f4a5
test-gpu run-test: commands[2] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/NVTabular.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/NVTabular.git
  Cloning https://github.com/NVIDIA-Merlin/NVTabular.git to /tmp/pip-req-build-nltatb6t
  Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/NVTabular.git /tmp/pip-req-build-nltatb6t
  Resolved https://github.com/NVIDIA-Merlin/NVTabular.git to commit 985510ef0f529aa54a8ac29414d3ed71542c4c62
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: scipy in /usr/local/lib/python3.8/dist-packages (from nvtabular==1.6.0+22.g985510ef) (1.8.1)
Requirement already satisfied: merlin-core>=0.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from nvtabular==1.6.0+22.g985510ef) (0.3.0+12.g78ecddd)
Requirement already satisfied: merlin-dataloader>=0.0.2 in ./.tox/test-gpu/lib/python3.8/site-packages (from nvtabular==1.6.0+22.g985510ef) (0.0.3)
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (0.55.1)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (3.19.5)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.10.0)
Requirement already satisfied: distributed>=2021.11.2 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (2022.3.0)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.3.5)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.2.5)
Requirement already satisfied: dask>=2021.11.2 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (2022.3.0)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (7.0.0)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (4.64.1)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (21.3)
Requirement already satisfied: numpy<1.25.0,>=1.17.3 in /var/jenkins_home/.local/lib/python3.8/site-packages (from scipy->nvtabular==1.6.0+22.g985510ef) (1.20.3)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (0.4.3)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.2.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (2.2.0)
Requirement already satisfied: fsspec>=0.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from dask>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (2022.5.0)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (0.12.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (5.4.1)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.2.0)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.7.0)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (2.4.0)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.0.4)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (6.1)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (3.1.2)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (2.0.0)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (5.8.0)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (8.1.3)
Requirement already satisfied: llvmlite<0.39,>=0.38.0rc1 in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (0.38.1)
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (65.6.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (3.0.9)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (2.8.2)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (2022.2.1)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.2.0)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.52.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.0.1)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (4.1.0)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (6.0.2)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (2.1.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (4.0.0)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (6.0.1)
Building wheels for collected packages: nvtabular
  Building wheel for nvtabular (pyproject.toml): started
  Building wheel for nvtabular (pyproject.toml): finished with status 'done'
  Created wheel for nvtabular: filename=nvtabular-1.6.0+22.g985510ef-cp38-cp38-linux_x86_64.whl size=257120 sha256=b7fbb94136bd18fb7b5d4ab187216fd319335647552c7d0244fffb406a519fb9
  Stored in directory: /tmp/pip-ephem-wheel-cache-4u7_0uz3/wheels/c2/16/76/39994bff39d812513de5b5572bff0903b9eb8f6c645b44cedc
Successfully built nvtabular
Installing collected packages: nvtabular
  Attempting uninstall: nvtabular
    Found existing installation: nvtabular 1.1.1
    Not uninstalling nvtabular at /var/jenkins_home/.local/lib/python3.8/site-packages, outside environment /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu
    Can't uninstall 'nvtabular'. No files were found to uninstall.
Successfully installed nvtabular-1.6.0+22.g985510ef
test-gpu run-test: commands[3] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/core.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/core.git
  Cloning https://github.com/NVIDIA-Merlin/core.git to /tmp/pip-req-build-8pr4yhe4
  Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/core.git /tmp/pip-req-build-8pr4yhe4
  Resolved https://github.com/NVIDIA-Merlin/core.git to commit 73c2afc277967015e2f334b8c57a9e8789fa19bd
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.9.0+19.g73c2afc) (0.55.1)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.9.0+19.g73c2afc) (3.19.5)
Requirement already satisfied: dask>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.9.0+19.g73c2afc) (2022.3.0)
Requirement already satisfied: fsspec==2022.5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.9.0+19.g73c2afc) (2022.5.0)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.9.0+19.g73c2afc) (1.10.0)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.9.0+19.g73c2afc) (1.3.5)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.9.0+19.g73c2afc) (1.2.5)
Requirement already satisfied: distributed>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.9.0+19.g73c2afc) (2022.3.0)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.9.0+19.g73c2afc) (7.0.0)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.9.0+19.g73c2afc) (4.64.1)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.9.0+19.g73c2afc) (21.3)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core==0.9.0+19.g73c2afc) (0.4.3)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core==0.9.0+19.g73c2afc) (1.2.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (2.2.0)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (0.12.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (5.4.1)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (1.2.0)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (1.7.0)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (2.4.0)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (1.0.4)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (6.1)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (3.1.2)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (2.0.0)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (5.8.0)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (8.1.3)
Requirement already satisfied: llvmlite<0.39,>=0.38.0rc1 in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.9.0+19.g73c2afc) (0.38.1)
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.9.0+19.g73c2afc) (65.6.3)
Requirement already satisfied: numpy<1.22,>=1.18 in /var/jenkins_home/.local/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.9.0+19.g73c2afc) (1.20.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core==0.9.0+19.g73c2afc) (3.0.9)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core==0.9.0+19.g73c2afc) (2.8.2)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core==0.9.0+19.g73c2afc) (2022.2.1)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core==0.9.0+19.g73c2afc) (1.2.0)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core==0.9.0+19.g73c2afc) (1.52.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core==0.9.0+19.g73c2afc) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (1.0.1)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core==0.9.0+19.g73c2afc) (4.1.0)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core==0.9.0+19.g73c2afc) (6.0.2)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (2.1.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core==0.9.0+19.g73c2afc) (4.0.0)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core==0.9.0+19.g73c2afc) (6.0.1)
Building wheels for collected packages: merlin-core
  Building wheel for merlin-core (pyproject.toml): started
  Building wheel for merlin-core (pyproject.toml): finished with status 'done'
  Created wheel for merlin-core: filename=merlin_core-0.9.0+19.g73c2afc-py3-none-any.whl size=119766 sha256=ba238bd700abca45e86d5486c38110c84d0258192d0126c30a3534e81b38b912
  Stored in directory: /tmp/pip-ephem-wheel-cache-0q4nuv4r/wheels/c8/38/16/a6968787eafcec5fa772148af8408b089562f71af0752e8e84
Successfully built merlin-core
Installing collected packages: merlin-core
  Attempting uninstall: merlin-core
    Found existing installation: merlin-core 0.3.0+12.g78ecddd
    Not uninstalling merlin-core at /var/jenkins_home/.local/lib/python3.8/site-packages, outside environment /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu
    Can't uninstall 'merlin-core'. No files were found to uninstall.
Successfully installed merlin-core-0.9.0+19.g73c2afc
test-gpu run-test: commands[4] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/dataloader.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/dataloader.git
  Cloning https://github.com/NVIDIA-Merlin/dataloader.git to /tmp/pip-req-build-ut40rebq
  Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/dataloader.git /tmp/pip-req-build-ut40rebq
  Resolved https://github.com/NVIDIA-Merlin/dataloader.git to commit d4e6c1bd9eaaaaca87b336a5da835cb5e0bc5df3
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: merlin-core>=0.8.0 in ./.tox/test-gpu/lib/python3.8/site-packages (from merlin-dataloader==0.0.2+28.gd4e6c1b) (0.9.0+19.g73c2afc)
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (0.55.1)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (3.19.5)
Requirement already satisfied: dask>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (2022.3.0)
Requirement already satisfied: fsspec==2022.5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (2022.5.0)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.10.0)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.3.5)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.2.5)
Requirement already satisfied: distributed>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (2022.3.0)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (7.0.0)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (4.64.1)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (21.3)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (0.4.3)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.2.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (2.2.0)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (0.12.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (5.4.1)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.2.0)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.7.0)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (2.4.0)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.0.4)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (6.1)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (3.1.2)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (2.0.0)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (5.8.0)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (8.1.3)
Requirement already satisfied: llvmlite<0.39,>=0.38.0rc1 in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (0.38.1)
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (65.6.3)
Requirement already satisfied: numpy<1.22,>=1.18 in /var/jenkins_home/.local/lib/python3.8/site-packages (from numba>=0.54->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.20.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (3.0.9)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (2.8.2)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (2022.2.1)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.2.0)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.52.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.0.1)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (4.1.0)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (6.0.2)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (2.1.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (4.0.0)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (6.0.1)
Building wheels for collected packages: merlin-dataloader
  Building wheel for merlin-dataloader (pyproject.toml): started
  Building wheel for merlin-dataloader (pyproject.toml): finished with status 'done'
  Created wheel for merlin-dataloader: filename=merlin_dataloader-0.0.2+28.gd4e6c1b-py3-none-any.whl size=41443 sha256=ea614c5b58e155d39d1de875ffac0f064c8ce93e57bfb99c9bf63e28c7a0c168
  Stored in directory: /tmp/pip-ephem-wheel-cache-ax5ihanb/wheels/de/f5/d9/251909f4627d2920fb15548f5ffd6daf1bf24c3c56bb4977b1
Successfully built merlin-dataloader
Installing collected packages: merlin-dataloader
  Attempting uninstall: merlin-dataloader
    Found existing installation: merlin-dataloader 0.0.3
    Uninstalling merlin-dataloader-0.0.3:
      Successfully uninstalled merlin-dataloader-0.0.3
Successfully installed merlin-dataloader-0.0.2+28.gd4e6c1b
test-gpu run-test: commands[5] | python -m pytest --cov-report term --cov merlin -rxs tests/unit
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-7.1.3, pluggy-1.0.0
cachedir: .tox/test-gpu/.pytest_cache
rootdir: /var/jenkins_home/workspace/merlin_merlin/merlin
plugins: anyio-3.5.0, cov-4.0.0, xdist-3.1.0
collected 6 items / 1 skipped

tests/unit/test_version.py . [ 16%]
tests/unit/examples/test_building_deploying_multi_stage_RecSys.py F [ 33%]
tests/unit/examples/test_scaling_criteo_merlin_models.py F [ 50%]
tests/unit/examples/test_scaling_criteo_optimize_notebook.py . [ 66%]
tests/unit/examples/test_z_legacy_notebooks.py .. [100%]
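Both notebook test failures below trace to the same root cause: the Triton server never reported ready before the notebook cells queried it (`RuntimeError: Tritonserver failed to start (ret=1)`). A minimal readiness-polling sketch makes that wait explicit; here `probe` is a hypothetical callable standing in for something like a Triton client's `is_server_ready()` check, injected so the helper stays generic:

```python
import time


def wait_until_ready(probe, timeout=30.0, interval=0.5):
    """Poll `probe` until it returns True or `timeout` seconds elapse.

    Returns True if the probe succeeded, False if the deadline passed.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if probe():
            return True
        time.sleep(interval)
    return False


# Usage sketch: a fake probe that only succeeds on its third call,
# simulating a server that takes a moment to come up.
calls = {"n": 0}


def fake_probe():
    calls["n"] += 1
    return calls["n"] >= 3


ready = wait_until_ready(fake_probe, timeout=5.0, interval=0.01)
print(ready)
```

A test that wraps server startup in such a helper fails with a clear timeout message instead of surfacing a raw `RuntimeError` from deep inside a notebook cell execution.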

=================================== FAILURES ===================================
__________________________________ test_func ___________________________________

self = <testbook.client.TestbookNotebookClient object at 0x7f62aff4b4f0>
cell = [55], kwargs = {}, cell_indexes = [55], executed_cells = [], idx = 55

def execute_cell(self, cell, **kwargs) -> Union[Dict, List[Dict]]:
    """
    Executes a cell or list of cells
    """
    if isinstance(cell, slice):
        start, stop = self._cell_index(cell.start), self._cell_index(cell.stop)
        if cell.step is not None:
            raise TestbookError('testbook does not support step argument')

        cell = range(start, stop + 1)
    elif isinstance(cell, str) or isinstance(cell, int):
        cell = [cell]

    cell_indexes = cell

    if all(isinstance(x, str) for x in cell):
        cell_indexes = [self._cell_index(tag) for tag in cell]

    executed_cells = []
    for idx in cell_indexes:
        try:
          cell = super().execute_cell(self.nb['cells'][idx], idx, **kwargs)

../../../.local/lib/python3.8/site-packages/testbook/client.py:133:


args = (<testbook.client.TestbookNotebookClient object at 0x7f62aff4b4f0>, {'id': 'cc39a6eb', 'cell_type': 'code', 'metadata'...[38;5;241m.\x1b[39mis_server_ready()\n', '\x1b[0;31mRuntimeError\x1b[0m: Tritonserver failed to start (ret=1)']}]}, 55)
kwargs = {}

def wrapped(*args, **kwargs):
  return just_run(coro(*args, **kwargs))

../../../.local/lib/python3.8/site-packages/nbclient/util.py:84:


coro = <coroutine object NotebookClient.async_execute_cell at 0x7f62af6b6dc0>

def just_run(coro: Awaitable) -> Any:
    """Make the coroutine run, even if there is an event loop running (using nest_asyncio)"""
    # original from vaex/asyncio.py
    loop = asyncio._get_running_loop()
    if loop is None:
        had_running_loop = False
        try:
            loop = asyncio.get_event_loop()
        except RuntimeError:
            # we can still get 'There is no current event loop in ...'
            loop = asyncio.new_event_loop()
            asyncio.set_event_loop(loop)
    else:
        had_running_loop = True
    if had_running_loop:
        # if there is a running loop, we patch using nest_asyncio
        # to have reentrant event loops
        check_ipython()
        import nest_asyncio

        nest_asyncio.apply()
        check_patch_tornado()
  return loop.run_until_complete(coro)

../../../.local/lib/python3.8/site-packages/nbclient/util.py:62:


self = <_UnixSelectorEventLoop running=False closed=False debug=False>
future = <Task finished name='Task-377' coro=<NotebookClient.async_execute_cell() done, defined at /var/jenkins_home/.local/lib...0;31mRuntimeError\x1b[0m: Tritonserver failed to start (ret=1)\nRuntimeError: Tritonserver failed to start (ret=1)\n')>

def run_until_complete(self, future):
    """Run until the Future is done.

    If the argument is a coroutine, it is wrapped in a Task.

    WARNING: It would be disastrous to call run_until_complete()
    with the same coroutine twice -- it would wrap it in two
    different Tasks and that can't be good.

    Return the Future's result, or raise its exception.
    """
    self._check_closed()
    self._check_running()

    new_task = not futures.isfuture(future)
    future = tasks.ensure_future(future, loop=self)
    if new_task:
        # An exception is raised if the future didn't complete, so there
        # is no need to log the "destroy pending task" message
        future._log_destroy_pending = False

    future.add_done_callback(_run_until_complete_cb)
    try:
        self.run_forever()
    except:
        if new_task and future.done() and not future.cancelled():
            # The coroutine raised a BaseException. Consume the exception
            # to not log a warning, the caller doesn't have access to the
            # local task.
            future.exception()
        raise
    finally:
        future.remove_done_callback(_run_until_complete_cb)
    if not future.done():
        raise RuntimeError('Event loop stopped before Future completed.')
  return future.result()

/usr/lib/python3.8/asyncio/base_events.py:616:
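As the docstring above states, `run_until_complete` returns the Future's result or re-raises its exception; that re-raise is exactly how the notebook cell's `RuntimeError` surfaces through the event loop here. A small illustration:

```python
import asyncio

async def fails():
    # Simulates the failing notebook cell.
    raise RuntimeError("Tritonserver failed to start (ret=1)")

loop = asyncio.new_event_loop()
try:
    loop.run_until_complete(fails())
    caught = None
except RuntimeError as exc:
    # The coroutine's exception propagates out of run_until_complete.
    caught = str(exc)
finally:
    loop.close()

print(caught)  # → Tritonserver failed to start (ret=1)
```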


self = <testbook.client.TestbookNotebookClient object at 0x7f62aff4b4f0>
cell = {'id': 'cc39a6eb', 'cell_type': 'code', 'metadata': {'execution': {'iopub.status.busy': '2022-12-20T18:20:50.448147Z',...t\x1b[38;5;241m.\x1b[39mis_server_ready()\n', '\x1b[0;31mRuntimeError\x1b[0m: Tritonserver failed to start (ret=1)']}]}
cell_index = 55, execution_count = None, store_history = True

async def async_execute_cell(
    self,
    cell: NotebookNode,
    cell_index: int,
    execution_count: t.Optional[int] = None,
    store_history: bool = True,
) -> NotebookNode:
    """
    Executes a single code cell.

    To execute all cells see :meth:`execute`.

    Parameters
    ----------
    cell : nbformat.NotebookNode
        The cell which is currently being processed.
    cell_index : int
        The position of the cell within the notebook object.
    execution_count : int
        The execution count to be assigned to the cell (default: Use kernel response)
    store_history : bool
        Determines if history should be stored in the kernel (default: False).
        Specific to ipython kernels, which can store command histories.

    Returns
    -------
    output : dict
        The execution output payload (or None for no output).

    Raises
    ------
    CellExecutionError
        If execution failed and should raise an exception, this will be raised
        with defaults about the failure.

    Returns
    -------
    cell : NotebookNode
        The cell which was just processed.
    """
    assert self.kc is not None

    await run_hook(self.on_cell_start, cell=cell, cell_index=cell_index)

    if cell.cell_type != 'code' or not cell.source.strip():
        self.log.debug("Skipping non-executing cell %s", cell_index)
        return cell

    if self.skip_cells_with_tag in cell.metadata.get("tags", []):
        self.log.debug("Skipping tagged cell %s", cell_index)
        return cell

    if self.record_timing:  # clear execution metadata prior to execution
        cell['metadata']['execution'] = {}

    self.log.debug("Executing cell:\n%s", cell.source)

    cell_allows_errors = (not self.force_raise_errors) and (
        self.allow_errors or "raises-exception" in cell.metadata.get("tags", [])
    )

    await run_hook(self.on_cell_execute, cell=cell, cell_index=cell_index)
    parent_msg_id = await ensure_async(
        self.kc.execute(
            cell.source, store_history=store_history, stop_on_error=not cell_allows_errors
        )
    )
    await run_hook(self.on_cell_complete, cell=cell, cell_index=cell_index)
    # We launched a code cell to execute
    self.code_cells_executed += 1
    exec_timeout = self._get_timeout(cell)

    cell.outputs = []
    self.clear_before_next_output = False

    task_poll_kernel_alive = asyncio.ensure_future(self._async_poll_kernel_alive())
    task_poll_output_msg = asyncio.ensure_future(
        self._async_poll_output_msg(parent_msg_id, cell, cell_index)
    )
    self.task_poll_for_reply = asyncio.ensure_future(
        self._async_poll_for_reply(
            parent_msg_id, cell, exec_timeout, task_poll_output_msg, task_poll_kernel_alive
        )
    )
    try:
        exec_reply = await self.task_poll_for_reply
    except asyncio.CancelledError:
        # can only be cancelled by task_poll_kernel_alive when the kernel is dead
        task_poll_output_msg.cancel()
        raise DeadKernelError("Kernel died")
    except Exception as e:
        # Best effort to cancel request if it hasn't been resolved
        try:
            # Check if the task_poll_output is doing the raising for us
            if not isinstance(e, CellControlSignal):
                task_poll_output_msg.cancel()
        finally:
            raise

    if execution_count:
        cell['execution_count'] = execution_count
  await self._check_raise_for_error(cell, cell_index, exec_reply)

../../../.local/lib/python3.8/site-packages/nbclient/client.py:965:


self = <testbook.client.TestbookNotebookClient object at 0x7f62aff4b4f0>
cell = {'id': 'cc39a6eb', 'cell_type': 'code', 'metadata': {'execution': {'iopub.status.busy': '2022-12-20T18:20:50.448147Z',...t\x1b[38;5;241m.\x1b[39mis_server_ready()\n', '\x1b[0;31mRuntimeError\x1b[0m: Tritonserver failed to start (ret=1)']}]}
cell_index = 55
exec_reply = {'buffers': [], 'content': {'ename': 'RuntimeError', 'engine_info': {'engine_id': -1, 'engine_uuid': '74e9b766-ae60-4b...e, 'engine': '74e9b766-ae60-4b67-b97c-ebda4005eca2', 'started': '2022-12-20T18:20:50.448512Z', 'status': 'error'}, ...}

async def _check_raise_for_error(
    self, cell: NotebookNode, cell_index: int, exec_reply: t.Optional[t.Dict]
) -> None:

    if exec_reply is None:
        return None

    exec_reply_content = exec_reply['content']
    if exec_reply_content['status'] != 'error':
        return None

    cell_allows_errors = (not self.force_raise_errors) and (
        self.allow_errors
        or exec_reply_content.get('ename') in self.allow_error_names
        or "raises-exception" in cell.metadata.get("tags", [])
    )
    await run_hook(self.on_cell_error, cell=cell, cell_index=cell_index)
    if not cell_allows_errors:
      raise CellExecutionError.from_cell_and_msg(cell, exec_reply_content)
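The gating in `_check_raise_for_error` can be restated as a pure predicate (a distilled sketch for clarity, not the nbclient API itself): an error is tolerated only when nothing forces a raise and at least one allowance applies.

```python
def cell_allows_errors(force_raise_errors, allow_errors, ename, allow_error_names, tags):
    # Tolerate the error only if raising is not forced AND some allowance
    # (global flag, allowed exception name, or "raises-exception" tag) holds.
    return (not force_raise_errors) and (
        allow_errors
        or ename in allow_error_names
        or "raises-exception" in tags
    )

# The failing cell in this log had no allowances, so the error is raised.
print(cell_allows_errors(False, False, "RuntimeError", [], []))  # → False
```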

E nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
E ------------------
E
E import shutil
E from merlin.core.dispatch import get_lib
E from merlin.dataloader.tf_utils import configure_tensorflow
E configure_tensorflow()
E df_lib = get_lib()
E batch = df_lib.read_parquet(
E     os.path.join("/tmp/data/processed_nvt/", "train", "part_0.parquet"),
E     num_rows=1,
E     columns=["user_id_raw"],
E )
E from merlin.systems.triton.utils import run_ensemble_on_tritonserver
E response = run_ensemble_on_tritonserver(
E     "/tmp/examples/poc_ensemble", ensemble.graph.input_schema, batch, outputs, "ensemble_model"
E )
E response = [x.tolist()[0] for x in response["ordered_ids"]]
E shutil.rmtree("/tmp/examples/", ignore_errors=True)
E
E ------------------
E
E ---------------------------------------------------------------------------
E RuntimeError                              Traceback (most recent call last)
E Cell In [32], line 12
E       6 batch = df_lib.read_parquet(
E       7     os.path.join("/tmp/data/processed_nvt/", "train", "part_0.parquet"),
E       8     num_rows=1,
E       9     columns=["user_id_raw"],
E      10 )
E      11 from merlin.systems.triton.utils import run_ensemble_on_tritonserver
E ---> 12 response = run_ensemble_on_tritonserver(
E      13     "/tmp/examples/poc_ensemble", ensemble.graph.input_schema, batch, outputs, "ensemble_model"
E      14 )
E      15 response = [x.tolist()[0] for x in response["ordered_ids"]]
E      16 shutil.rmtree("/tmp/examples/", ignore_errors=True)
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/utils.py:139, in run_ensemble_on_tritonserver(tmpdir, schema, df, output_columns, model_name)
E     116 """Starts up a Triton server instance, loads up the ensemble model,
E     117 prepares the inference request and returns the unparsed inference
E     118 response.
E    (...)
E     136     the results of the prediction, parsed by output column name.
E     137 """
E     138 response = None
E --> 139 with run_triton_server(tmpdir) as client:
E     140     response = send_triton_request(
E     141         schema, df, output_columns, client=client, triton_model=model_name
E     142     )
E     144 return response
E
E File /usr/lib/python3.8/contextlib.py:113, in _GeneratorContextManager.__enter__(self)
E     111 del self.args, self.kwds, self.func
E     112 try:
E --> 113     return next(self.gen)
E     114 except StopIteration:
E     115     raise RuntimeError("generator didn't yield") from None
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/utils.py:82, in run_triton_server(model_repository, grpc_host, grpc_port, backend_config)
E      80 if process.poll() is not None:
E      81     retcode = process.returncode
E ---> 82     raise RuntimeError(f"Tritonserver failed to start (ret={retcode})")
E      84 try:
E      85     ready = client.is_server_ready()
E
E RuntimeError: Tritonserver failed to start (ret=1)
E RuntimeError: Tritonserver failed to start (ret=1)

../../../.local/lib/python3.8/site-packages/nbclient/client.py:862: CellExecutionError
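The `ret=1` failure originates in `run_triton_server`'s startup loop, which polls the server subprocess and raises if it exits before becoming ready. A hedged, stdlib-only sketch of that fail-fast pattern (helper name and timeout are hypothetical, not the merlin-systems API):

```python
import subprocess
import sys
import time

def start_or_fail(cmd, timeout=5.0):
    # Launch the server process; if it dies during the startup window,
    # surface its return code instead of hanging on readiness checks.
    process = subprocess.Popen(cmd)
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if process.poll() is not None:
            raise RuntimeError(f"server failed to start (ret={process.returncode})")
        time.sleep(0.1)
    return process

try:
    # A child that exits immediately stands in for a crashing tritonserver.
    start_or_fail([sys.executable, "-c", "import sys; sys.exit(1)"])
except RuntimeError as exc:
    print(exc)  # → server failed to start (ret=1)
```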

During handling of the above exception, another exception occurred:

def test_func():
    with testbook(
        REPO_ROOT
        / "examples"
        / "Building-and-deploying-multi-stage-RecSys"
        / "01-Building-Recommender-Systems-with-Merlin.ipynb",
        execute=False,
    ) as tb1:
        tb1.inject(
            """
            import os
            os.environ["DATA_FOLDER"] = "/tmp/data/"
            os.environ["NUM_ROWS"] = "100000"
            os.system("mkdir -p /tmp/examples")
            os.environ["BASE_DIR"] = "/tmp/examples/"
            """
        )
        tb1.execute()
        assert os.path.isdir("/tmp/examples/dlrm")
        assert os.path.isdir("/tmp/examples/feature_repo")
        assert os.path.isdir("/tmp/examples/query_tower")
        assert os.path.isfile("/tmp/examples/item_embeddings.parquet")
        assert os.path.isfile("/tmp/examples/feature_repo/user_features.py")
        assert os.path.isfile("/tmp/examples/feature_repo/item_features.py")

    with testbook(
        REPO_ROOT
        / "examples"
        / "Building-and-deploying-multi-stage-RecSys"
        / "02-Deploying-multi-stage-RecSys-with-Merlin-Systems.ipynb",
        execute=False,
        timeout=180,
    ) as tb2:
        tb2.inject(
            """
            import os
            os.environ["DATA_FOLDER"] = "/tmp/data/"
            os.environ["BASE_DIR"] = "/tmp/examples/"
            os.environ["topk_retrieval"] = "20"
            """
        )
        NUM_OF_CELLS = len(tb2.cells)
        tb2.execute_cell(list(range(0, NUM_OF_CELLS - 3)))
        top_k = tb2.ref("top_k")
        outputs = tb2.ref("outputs")
        assert outputs[0] == "ordered_ids"
      tb2.inject(
            """
            import shutil
            from merlin.core.dispatch import get_lib
            from merlin.dataloader.tf_utils import configure_tensorflow
            configure_tensorflow()
            df_lib = get_lib()
            batch = df_lib.read_parquet(
                os.path.join("/tmp/data/processed_nvt/", "train", "part_0.parquet"),
                num_rows=1,
                columns=["user_id_raw"],
            )
            from merlin.systems.triton.utils import run_ensemble_on_tritonserver
            response = run_ensemble_on_tritonserver(
                "/tmp/examples/poc_ensemble", ensemble.graph.input_schema, batch, outputs,  "ensemble_model"
            )
            response = [x.tolist()[0] for x in response["ordered_ids"]]
            shutil.rmtree("/tmp/examples/", ignore_errors=True)
            """
        )

tests/unit/examples/test_building_deploying_multi_stage_RecSys.py:61:


../../../.local/lib/python3.8/site-packages/testbook/client.py:237: in inject
cell = TestbookNode(self.execute_cell(inject_idx)) if run else TestbookNode(code_cell)


self = <testbook.client.TestbookNotebookClient object at 0x7f62aff4b4f0>
cell = [55], kwargs = {}, cell_indexes = [55], executed_cells = [], idx = 55

def execute_cell(self, cell, **kwargs) -> Union[Dict, List[Dict]]:
    """
    Executes a cell or list of cells
    """
    if isinstance(cell, slice):
        start, stop = self._cell_index(cell.start), self._cell_index(cell.stop)
        if cell.step is not None:
            raise TestbookError('testbook does not support step argument')

        cell = range(start, stop + 1)
    elif isinstance(cell, str) or isinstance(cell, int):
        cell = [cell]

    cell_indexes = cell

    if all(isinstance(x, str) for x in cell):
        cell_indexes = [self._cell_index(tag) for tag in cell]

    executed_cells = []
    for idx in cell_indexes:
        try:
            cell = super().execute_cell(self.nb['cells'][idx], idx, **kwargs)
        except CellExecutionError as ce:
          raise TestbookRuntimeError(ce.evalue, ce, self._get_error_class(ce.ename))

E testbook.exceptions.TestbookRuntimeError: An error occurred while executing the following cell:
E ------------------
E
E import shutil
E from merlin.core.dispatch import get_lib
E from merlin.dataloader.tf_utils import configure_tensorflow
E configure_tensorflow()
E df_lib = get_lib()
E batch = df_lib.read_parquet(
E     os.path.join("/tmp/data/processed_nvt/", "train", "part_0.parquet"),
E     num_rows=1,
E     columns=["user_id_raw"],
E )
E from merlin.systems.triton.utils import run_ensemble_on_tritonserver
E response = run_ensemble_on_tritonserver(
E     "/tmp/examples/poc_ensemble", ensemble.graph.input_schema, batch, outputs, "ensemble_model"
E )
E response = [x.tolist()[0] for x in response["ordered_ids"]]
E shutil.rmtree("/tmp/examples/", ignore_errors=True)
E
E ------------------
E
E ---------------------------------------------------------------------------
E RuntimeError                              Traceback (most recent call last)
E Cell In [32], line 12
E       6 batch = df_lib.read_parquet(
E       7     os.path.join("/tmp/data/processed_nvt/", "train", "part_0.parquet"),
E       8     num_rows=1,
E       9     columns=["user_id_raw"],
E      10 )
E      11 from merlin.systems.triton.utils import run_ensemble_on_tritonserver
E ---> 12 response = run_ensemble_on_tritonserver(
E      13     "/tmp/examples/poc_ensemble", ensemble.graph.input_schema, batch, outputs, "ensemble_model"
E      14 )
E      15 response = [x.tolist()[0] for x in response["ordered_ids"]]
E      16 shutil.rmtree("/tmp/examples/", ignore_errors=True)
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/utils.py:139, in run_ensemble_on_tritonserver(tmpdir, schema, df, output_columns, model_name)
E     116 """Starts up a Triton server instance, loads up the ensemble model,
E     117 prepares the inference request and returns the unparsed inference
E     118 response.
E    (...)
E     136     the results of the prediction, parsed by output column name.
E     137 """
E     138 response = None
E --> 139 with run_triton_server(tmpdir) as client:
E     140     response = send_triton_request(
E     141         schema, df, output_columns, client=client, triton_model=model_name
E     142     )
E     144 return response
E
E File /usr/lib/python3.8/contextlib.py:113, in _GeneratorContextManager.__enter__(self)
E     111 del self.args, self.kwds, self.func
E     112 try:
E --> 113     return next(self.gen)
E     114 except StopIteration:
E     115     raise RuntimeError("generator didn't yield") from None
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/utils.py:82, in run_triton_server(model_repository, grpc_host, grpc_port, backend_config)
E      80 if process.poll() is not None:
E      81     retcode = process.returncode
E ---> 82     raise RuntimeError(f"Tritonserver failed to start (ret={retcode})")
E      84 try:
E      85     ready = client.is_server_ready()
E
E RuntimeError: Tritonserver failed to start (ret=1)
E RuntimeError: Tritonserver failed to start (ret=1)

../../../.local/lib/python3.8/site-packages/testbook/client.py:135: TestbookRuntimeError
----------------------------- Captured stderr call -----------------------------
2022-12-20 18:18:58.741104: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-12-20 18:19:02.788507: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 8139 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 18:19:02.789222: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:1 with 15149 MB memory: -> device: 1, name: Tesla P100-DGXS-16GB, pci bus id: 0000:08:00.0, compute capability: 6.0
2022-12-20 18:19:02.789892: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:2 with 15149 MB memory: -> device: 2, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0e:00.0, compute capability: 6.0
2022-12-20 18:19:02.790569: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:3 with 15149 MB memory: -> device: 3, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0f:00.0, compute capability: 6.0
Error in atexit._run_exitfuncs:
Traceback (most recent call last):
File "/usr/lib/python3.8/logging/__init__.py", line 2127, in shutdown
h.close()
File "/usr/local/lib/python3.8/dist-packages/absl/logging/__init__.py", line 934, in close
self.stream.close()
File "/usr/local/lib/python3.8/dist-packages/ipykernel/iostream.py", line 438, in close
self.watch_fd_thread.join()
AttributeError: 'OutStream' object has no attribute 'watch_fd_thread'
WARNING clustering 453 points to 32 centroids: please provide at least 1248 training points
2022-12-20 18:20:28.737870: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-12-20 18:20:32.757536: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 8139 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 18:20:32.758320: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:1 with 15149 MB memory: -> device: 1, name: Tesla P100-DGXS-16GB, pci bus id: 0000:08:00.0, compute capability: 6.0
2022-12-20 18:20:32.758981: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:2 with 15149 MB memory: -> device: 2, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0e:00.0, compute capability: 6.0
2022-12-20 18:20:32.759601: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:3 with 15149 MB memory: -> device: 3, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0f:00.0, compute capability: 6.0
I1220 18:20:50.808585 9129 pinned_memory_manager.cc:240] Pinned memory pool is created at '0x7f9926000000' with size 268435456
I1220 18:20:50.809326 9129 cuda_memory_manager.cc:105] CUDA memory pool is created on device 0 with size 536870912
I1220 18:20:50.814496 9129 model_lifecycle.cc:459] loading: executor_model:1
I1220 18:20:50.814540 9129 model_lifecycle.cc:459] loading: 0_predicttensorflowtriton:1
I1220 18:20:50.814573 9129 model_lifecycle.cc:459] loading: 2_predicttensorflowtriton:1
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.12) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
I1220 18:20:59.629576 9129 tensorflow.cc:2536] TRITONBACKEND_Initialize: tensorflow
I1220 18:20:59.629650 9129 tensorflow.cc:2546] Triton TRITONBACKEND API version: 1.10
I1220 18:20:59.629657 9129 tensorflow.cc:2552] 'tensorflow' TRITONBACKEND API version: 1.10
I1220 18:20:59.629663 9129 tensorflow.cc:2576] backend configuration:
{"cmdline":{"auto-complete-config":"true","backend-directory":"/opt/tritonserver/backends","min-compute-capability":"6.000000","version":"2","default-max-batch-size":"4"}}
I1220 18:20:59.629706 9129 python_be.cc:1767] TRITONBACKEND_ModelInstanceInitialize: executor_model (GPU device 0)
2022-12-20 18:21:07.537208: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-12-20 18:21:09.733508: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 8139 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 18:21:09.740144: I tensorflow/stream_executor/cuda/cuda_driver.cc:739] failed to allocate 7.95G (8534360064 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2022-12-20 18:21:09.743071: I tensorflow/stream_executor/cuda/cuda_driver.cc:739] failed to allocate 7.15G (7680923648 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2022-12-20 18:21:09.746151: I tensorflow/stream_executor/cuda/cuda_driver.cc:739] failed to allocate 6.44G (6912830976 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2022-12-20 18:21:09.748839: I tensorflow/stream_executor/cuda/cuda_driver.cc:739] failed to allocate 5.79G (6221547520 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.12) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
WARNING:tensorflow:No training configuration found in save file, so the model was not compiled. Compile it manually.
12/20/2022 06:21:11 PM WARNING:No training configuration found in save file, so the model was not compiled. Compile it manually.
/usr/local/lib/python3.8/dist-packages/faiss/loader.py:28: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
if LooseVersion(numpy.__version__) >= "1.19":
/var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
other = LooseVersion(other)
12/20/2022 06:21:11 PM INFO:Loading faiss with AVX2 support.
12/20/2022 06:21:11 PM INFO:Could not load library with AVX2 support due to:
ModuleNotFoundError("No module named 'faiss.swigfaiss_avx2'")
12/20/2022 06:21:11 PM INFO:Loading faiss.
12/20/2022 06:21:11 PM INFO:Successfully loaded faiss.
E1220 18:21:18.464240 9213 pb_stub.cc:309] Failed to initialize Python stub: RuntimeError: Error in virtual void faiss::gpu::StandardGpuResourcesImpl::initializeForDevice(int) at /project/faiss/faiss/gpu/StandardGpuResources.cpp:283: Error: 'err == cudaSuccess' failed: failed to cudaHostAlloc 268435456 bytes for CPU <-> GPU async copy buffer (error 2 out of memory)

At:
/usr/local/lib/python3.8/dist-packages/faiss/swigfaiss.py(10275): index_cpu_to_gpu
/var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/faiss.py(74): load_artifacts
/tmp/examples/poc_ensemble/executor_model/1/model.py(75): initialize

I1220 18:21:19.910133 9129 tensorflow.cc:2642] TRITONBACKEND_ModelInitialize: 0_predicttensorflowtriton (version 1)
2022-12-20 18:21:19.911759: I tensorflow/cc/saved_model/reader.cc:43] Reading SavedModel from: /tmp/examples/poc_ensemble/0_predicttensorflowtriton/1/model.savedmodel
E1220 18:21:19.918660 9129 model_lifecycle.cc:596] failed to load 'executor_model' version 1: Internal: RuntimeError: Error in virtual void faiss::gpu::StandardGpuResourcesImpl::initializeForDevice(int) at /project/faiss/faiss/gpu/StandardGpuResources.cpp:283: Error: 'err == cudaSuccess' failed: failed to cudaHostAlloc 268435456 bytes for CPU <-> GPU async copy buffer (error 2 out of memory)

At:
/usr/local/lib/python3.8/dist-packages/faiss/swigfaiss.py(10275): index_cpu_to_gpu
/var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/faiss.py(74): load_artifacts
/tmp/examples/poc_ensemble/executor_model/1/model.py(75): initialize

2022-12-20 18:21:19.923804: I tensorflow/cc/saved_model/reader.cc:81] Reading meta graph with tags { serve }
2022-12-20 18:21:19.923871: I tensorflow/cc/saved_model/reader.cc:122] Reading SavedModel debug info (if present) from: /tmp/examples/poc_ensemble/0_predicttensorflowtriton/1/model.savedmodel
2022-12-20 18:21:19.924111: I tensorflow/core/platform/cpu_feature_guard.cc:194] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: SSE3 SSE4.1 SSE4.2 AVX
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-12-20 18:21:19.967466: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 5680 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 18:21:20.005493: I tensorflow/compiler/mlir/mlir_graph_optimization_pass.cc:354] MLIR V1 optimization pass is not enabled
2022-12-20 18:21:20.007482: I tensorflow/cc/saved_model/loader.cc:230] Restoring SavedModel bundle.
2022-12-20 18:21:20.083105: I tensorflow/cc/saved_model/loader.cc:214] Running initialization op on SavedModel bundle at path: /tmp/examples/poc_ensemble/0_predicttensorflowtriton/1/model.savedmodel
2022-12-20 18:21:20.114419: I tensorflow/cc/saved_model/loader.cc:321] SavedModel load for tags { serve }; Status: success: OK. Took 202696 microseconds.
I1220 18:21:20.120779 9129 tensorflow.cc:2642] TRITONBACKEND_ModelInitialize: 2_predicttensorflowtriton (version 1)
2022-12-20 18:21:20.122021: I tensorflow/cc/saved_model/reader.cc:43] Reading SavedModel from: /tmp/examples/poc_ensemble/2_predicttensorflowtriton/1/model.savedmodel
2022-12-20 18:21:20.149685: I tensorflow/cc/saved_model/reader.cc:81] Reading meta graph with tags { serve }
2022-12-20 18:21:20.149736: I tensorflow/cc/saved_model/reader.cc:122] Reading SavedModel debug info (if present) from: /tmp/examples/poc_ensemble/2_predicttensorflowtriton/1/model.savedmodel
2022-12-20 18:21:20.151866: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 5680 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 18:21:20.181054: I tensorflow/cc/saved_model/loader.cc:230] Restoring SavedModel bundle.
2022-12-20 18:21:20.321559: I tensorflow/cc/saved_model/loader.cc:214] Running initialization op on SavedModel bundle at path: /tmp/examples/poc_ensemble/2_predicttensorflowtriton/1/model.savedmodel
2022-12-20 18:21:20.374273: I tensorflow/cc/saved_model/loader.cc:321] SavedModel load for tags { serve }; Status: success: OK. Took 252260 microseconds.
I1220 18:21:20.391201 9129 tensorflow.cc:2691] TRITONBACKEND_ModelInstanceInitialize: 0_predicttensorflowtriton (GPU device 0)
2022-12-20 18:21:20.392099: I tensorflow/cc/saved_model/reader.cc:43] Reading SavedModel from: /tmp/examples/poc_ensemble/0_predicttensorflowtriton/1/model.savedmodel
2022-12-20 18:21:20.397126: I tensorflow/cc/saved_model/reader.cc:81] Reading meta graph with tags { serve }
2022-12-20 18:21:20.397163: I tensorflow/cc/saved_model/reader.cc:122] Reading SavedModel debug info (if present) from: /tmp/examples/poc_ensemble/0_predicttensorflowtriton/1/model.savedmodel
2022-12-20 18:21:20.399988: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 5680 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 18:21:20.411178: I tensorflow/cc/saved_model/loader.cc:230] Restoring SavedModel bundle.
2022-12-20 18:21:20.488032: I tensorflow/cc/saved_model/loader.cc:214] Running initialization op on SavedModel bundle at path: /tmp/examples/poc_ensemble/0_predicttensorflowtriton/1/model.savedmodel
2022-12-20 18:21:20.514427: I tensorflow/cc/saved_model/loader.cc:321] SavedModel load for tags { serve }; Status: success: OK. Took 122334 microseconds.
I1220 18:21:20.514534 9129 tensorflow.cc:2691] TRITONBACKEND_ModelInstanceInitialize: 2_predicttensorflowtriton (GPU device 0)
I1220 18:21:20.514693 9129 model_lifecycle.cc:693] successfully loaded '0_predicttensorflowtriton' version 1
2022-12-20 18:21:20.514980: I tensorflow/cc/saved_model/reader.cc:43] Reading SavedModel from: /tmp/examples/poc_ensemble/2_predicttensorflowtriton/1/model.savedmodel
2022-12-20 18:21:20.526297: I tensorflow/cc/saved_model/reader.cc:81] Reading meta graph with tags { serve }
2022-12-20 18:21:20.526339: I tensorflow/cc/saved_model/reader.cc:122] Reading SavedModel debug info (if present) from: /tmp/examples/poc_ensemble/2_predicttensorflowtriton/1/model.savedmodel
2022-12-20 18:21:20.528662: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 5680 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 18:21:20.548711: I tensorflow/cc/saved_model/loader.cc:230] Restoring SavedModel bundle.
2022-12-20 18:21:20.693005: I tensorflow/cc/saved_model/loader.cc:214] Running initialization op on SavedModel bundle at path: /tmp/examples/poc_ensemble/2_predicttensorflowtriton/1/model.savedmodel
2022-12-20 18:21:20.744085: I tensorflow/cc/saved_model/loader.cc:321] SavedModel load for tags { serve }; Status: success: OK. Took 229107 microseconds.
I1220 18:21:20.744391 9129 model_lifecycle.cc:693] successfully loaded '2_predicttensorflowtriton' version 1
I1220 18:21:20.744510 9129 server.cc:561]
+------------------+------+
| Repository Agent | Path |
+------------------+------+
+------------------+------+

I1220 18:21:20.744633 9129 server.cc:588]
+------------+-----------------------------------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Backend | Path | Config |
+------------+-----------------------------------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| python | /opt/tritonserver/backends/python/libtriton_python.so | {"cmdline":{"auto-complete-config":"true","min-compute-capability":"6.000000","backend-directory":"/opt/tritonserver/backends","default-max-batch-size":"4"}} |
| tensorflow | /opt/tritonserver/backends/tensorflow2/libtriton_tensorflow2.so | {"cmdline":{"auto-complete-config":"true","backend-directory":"/opt/tritonserver/backends","min-compute-capability":"6.000000","version":"2","default-max-batch-size":"4"}} |
+------------+-----------------------------------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+

I1220 18:21:20.744800 9129 server.cc:631]
+---------------------------+---------+---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Model | Version | Status |
+---------------------------+---------+---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| 0_predicttensorflowtriton | 1 | READY |
| 2_predicttensorflowtriton | 1 | READY |
| executor_model | 1 | UNAVAILABLE: Internal: RuntimeError: Error in virtual void faiss::gpu::StandardGpuResourcesImpl::initializeForDevice(int) at /project/faiss/faiss/gpu/StandardGpuResources.cpp:283: Error: 'err == cudaSuccess' failed: failed to cudaHostAlloc 268435456 bytes for CPU <-> GPU async copy buffer (error 2 out of memory) |
| | | |
| | | At: |
| | | /usr/local/lib/python3.8/dist-packages/faiss/swigfaiss.py(10275): index_cpu_to_gpu |
| | | /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/faiss.py(74): load_artifacts |
| | | /tmp/examples/poc_ensemble/executor_model/1/model.py(75): initialize |
+---------------------------+---------+---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+

I1220 18:21:20.808945 9129 metrics.cc:650] Collecting metrics for GPU 0: Tesla P100-DGXS-16GB
I1220 18:21:20.809798 9129 tritonserver.cc:2214]
+----------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Option | Value |
+----------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| server_id | triton |
| server_version | 2.25.0 |
| server_extensions | classification sequence model_repository model_repository(unload_dependents) schedule_policy model_configuration system_shared_memory cuda_shared_memory binary_tensor_data statistics trace |
| model_repository_path[0] | /tmp/examples/poc_ensemble |
| model_control_mode | MODE_NONE |
| strict_model_config | 0 |
| rate_limit | OFF |
| pinned_memory_pool_byte_size | 268435456 |
| cuda_memory_pool_byte_size{0} | 536870912 |
| response_cache_byte_size | 0 |
| min_supported_compute_capability | 6.0 |
| strict_readiness | 1 |
| exit_timeout | 30 |
+----------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+

I1220 18:21:20.809812 9129 server.cc:262] Waiting for in-flight requests to complete.
I1220 18:21:20.809858 9129 server.cc:278] Timeout 30: Found 0 model versions that have in-flight inferences
I1220 18:21:20.809946 9129 server.cc:293] All models are stopped, unloading models
I1220 18:21:20.809956 9129 server.cc:300] Timeout 30: Found 2 live models and 0 in-flight non-inference requests
I1220 18:21:20.810045 9129 tensorflow.cc:2729] TRITONBACKEND_ModelInstanceFinalize: delete instance state
I1220 18:21:20.810076 9129 tensorflow.cc:2729] TRITONBACKEND_ModelInstanceFinalize: delete instance state
I1220 18:21:20.810179 9129 tensorflow.cc:2668] TRITONBACKEND_ModelFinalize: delete model state
I1220 18:21:20.810194 9129 tensorflow.cc:2668] TRITONBACKEND_ModelFinalize: delete model state
I1220 18:21:20.822124 9129 model_lifecycle.cc:578] successfully unloaded '0_predicttensorflowtriton' version 1
I1220 18:21:20.833204 9129 model_lifecycle.cc:578] successfully unloaded '2_predicttensorflowtriton' version 1
I1220 18:21:21.810044 9129 server.cc:300] Timeout 29: Found 0 live models and 0 in-flight non-inference requests
error: creating server: Internal - failed to load all models
W1220 18:21:21.831296 9129 metrics.cc:468] Unable to get energy consumption for GPU 0. Status:Success, value:0
Error in atexit._run_exitfuncs:
Traceback (most recent call last):
File "/usr/lib/python3.8/logging/__init__.py", line 2127, in shutdown
h.close()
File "/usr/local/lib/python3.8/dist-packages/absl/logging/__init__.py", line 934, in close
self.stream.close()
File "/usr/local/lib/python3.8/dist-packages/ipykernel/iostream.py", line 438, in close
self.watch_fd_thread.join()
AttributeError: 'OutStream' object has no attribute 'watch_fd_thread'
__________________________________ test_func ___________________________________

self = <testbook.client.TestbookNotebookClient object at 0x7f6420549730>
cell = {'cell_type': 'markdown', 'id': 'd1c899fd', 'metadata': {}, 'source': 'We generate the Triton Inference Server artifacts and export them in the export_path directory.'}
kwargs = {}, cell_indexes = [0, 1, 2, 3, 4, 5, ...]
executed_cells = [{'cell_type': 'code', 'execution_count': 2, 'id': '47c9026a', 'metadata': {'execution': {'iopub.status.busy': '2022-1...: 'markdown', 'id': '25a6ac04', 'metadata': {}, 'source': 'We define the path for the saved workflow and model.'}, ...]
idx = 19

def execute_cell(self, cell, **kwargs) -> Union[Dict, List[Dict]]:
    """
    Executes a cell or list of cells
    """
    if isinstance(cell, slice):
        start, stop = self._cell_index(cell.start), self._cell_index(cell.stop)
        if cell.step is not None:
            raise TestbookError('testbook does not support step argument')

        cell = range(start, stop + 1)
    elif isinstance(cell, str) or isinstance(cell, int):
        cell = [cell]

    cell_indexes = cell

    if all(isinstance(x, str) for x in cell):
        cell_indexes = [self._cell_index(tag) for tag in cell]

    executed_cells = []
    for idx in cell_indexes:
        try:
          cell = super().execute_cell(self.nb['cells'][idx], idx, **kwargs)

../../../.local/lib/python3.8/site-packages/testbook/client.py:133:


args = (<testbook.client.TestbookNotebookClient object at 0x7f6420549730>, {'cell_type': 'code', 'execution_count': 10, 'id':...ource': 'export_path = os.path.join(input_path, "ensemble")\nens_conf, node_confs = ensemble.export(export_path)'}, 19)
kwargs = {}

def wrapped(*args, **kwargs):
  return just_run(coro(*args, **kwargs))

../../../.local/lib/python3.8/site-packages/nbclient/util.py:84:


coro = <coroutine object NotebookClient.async_execute_cell at 0x7f641fbf4740>

def just_run(coro: Awaitable) -> Any:
    """Make the coroutine run, even if there is an event loop running (using nest_asyncio)"""
    # original from vaex/asyncio.py
    loop = asyncio._get_running_loop()
    if loop is None:
        had_running_loop = False
        try:
            loop = asyncio.get_event_loop()
        except RuntimeError:
            # we can still get 'There is no current event loop in ...'
            loop = asyncio.new_event_loop()
            asyncio.set_event_loop(loop)
    else:
        had_running_loop = True
    if had_running_loop:
        # if there is a running loop, we patch using nest_asyncio
        # to have reentrant event loops
        check_ipython()
        import nest_asyncio

        nest_asyncio.apply()
        check_patch_tornado()
  return loop.run_until_complete(coro)

../../../.local/lib/python3.8/site-packages/nbclient/util.py:62:


self = <_UnixSelectorEventLoop running=False closed=False debug=False>
future = <Task finished name='Task-596' coro=<NotebookClient.async_execute_cell() done, defined at /var/jenkins_home/.local/lib...n'\nFileNotFoundError: [Errno 2] No such file or directory: 'ram:/1a1f2b6f-32e0-4a5c-8947-7fbeeb3ee8c7/.merlin'\n')>

def run_until_complete(self, future):
    """Run until the Future is done.

    If the argument is a coroutine, it is wrapped in a Task.

    WARNING: It would be disastrous to call run_until_complete()
    with the same coroutine twice -- it would wrap it in two
    different Tasks and that can't be good.

    Return the Future's result, or raise its exception.
    """
    self._check_closed()
    self._check_running()

    new_task = not futures.isfuture(future)
    future = tasks.ensure_future(future, loop=self)
    if new_task:
        # An exception is raised if the future didn't complete, so there
        # is no need to log the "destroy pending task" message
        future._log_destroy_pending = False

    future.add_done_callback(_run_until_complete_cb)
    try:
        self.run_forever()
    except:
        if new_task and future.done() and not future.cancelled():
            # The coroutine raised a BaseException. Consume the exception
            # to not log a warning, the caller doesn't have access to the
            # local task.
            future.exception()
        raise
    finally:
        future.remove_done_callback(_run_until_complete_cb)
    if not future.done():
        raise RuntimeError('Event loop stopped before Future completed.')
  return future.result()

/usr/lib/python3.8/asyncio/base_events.py:616:


self = <testbook.client.TestbookNotebookClient object at 0x7f6420549730>
cell = {'cell_type': 'code', 'execution_count': 10, 'id': 'e27246ff', 'metadata': {'execution': {'iopub.status.busy': '2022-1...], 'source': 'export_path = os.path.join(input_path, "ensemble")\nens_conf, node_confs = ensemble.export(export_path)'}
cell_index = 19, execution_count = None, store_history = True

async def async_execute_cell(
    self,
    cell: NotebookNode,
    cell_index: int,
    execution_count: t.Optional[int] = None,
    store_history: bool = True,
) -> NotebookNode:
    """
    Executes a single code cell.

    To execute all cells see :meth:`execute`.

    Parameters
    ----------
    cell : nbformat.NotebookNode
        The cell which is currently being processed.
    cell_index : int
        The position of the cell within the notebook object.
    execution_count : int
        The execution count to be assigned to the cell (default: Use kernel response)
    store_history : bool
        Determines if history should be stored in the kernel (default: False).
        Specific to ipython kernels, which can store command histories.

    Returns
    -------
    output : dict
        The execution output payload (or None for no output).

    Raises
    ------
    CellExecutionError
        If execution failed and should raise an exception, this will be raised
        with defaults about the failure.

    Returns
    -------
    cell : NotebookNode
        The cell which was just processed.
    """
    assert self.kc is not None

    await run_hook(self.on_cell_start, cell=cell, cell_index=cell_index)

    if cell.cell_type != 'code' or not cell.source.strip():
        self.log.debug("Skipping non-executing cell %s", cell_index)
        return cell

    if self.skip_cells_with_tag in cell.metadata.get("tags", []):
        self.log.debug("Skipping tagged cell %s", cell_index)
        return cell

    if self.record_timing:  # clear execution metadata prior to execution
        cell['metadata']['execution'] = {}

    self.log.debug("Executing cell:\n%s", cell.source)

    cell_allows_errors = (not self.force_raise_errors) and (
        self.allow_errors or "raises-exception" in cell.metadata.get("tags", [])
    )

    await run_hook(self.on_cell_execute, cell=cell, cell_index=cell_index)
    parent_msg_id = await ensure_async(
        self.kc.execute(
            cell.source, store_history=store_history, stop_on_error=not cell_allows_errors
        )
    )
    await run_hook(self.on_cell_complete, cell=cell, cell_index=cell_index)
    # We launched a code cell to execute
    self.code_cells_executed += 1
    exec_timeout = self._get_timeout(cell)

    cell.outputs = []
    self.clear_before_next_output = False

    task_poll_kernel_alive = asyncio.ensure_future(self._async_poll_kernel_alive())
    task_poll_output_msg = asyncio.ensure_future(
        self._async_poll_output_msg(parent_msg_id, cell, cell_index)
    )
    self.task_poll_for_reply = asyncio.ensure_future(
        self._async_poll_for_reply(
            parent_msg_id, cell, exec_timeout, task_poll_output_msg, task_poll_kernel_alive
        )
    )
    try:
        exec_reply = await self.task_poll_for_reply
    except asyncio.CancelledError:
        # can only be cancelled by task_poll_kernel_alive when the kernel is dead
        task_poll_output_msg.cancel()
        raise DeadKernelError("Kernel died")
    except Exception as e:
        # Best effort to cancel request if it hasn't been resolved
        try:
            # Check if the task_poll_output is doing the raising for us
            if not isinstance(e, CellControlSignal):
                task_poll_output_msg.cancel()
        finally:
            raise

    if execution_count:
        cell['execution_count'] = execution_count
  await self._check_raise_for_error(cell, cell_index, exec_reply)

../../../.local/lib/python3.8/site-packages/nbclient/client.py:965:


self = <testbook.client.TestbookNotebookClient object at 0x7f6420549730>
cell = {'cell_type': 'code', 'execution_count': 10, 'id': 'e27246ff', 'metadata': {'execution': {'iopub.status.busy': '2022-1...], 'source': 'export_path = os.path.join(input_path, "ensemble")\nens_conf, node_confs = ensemble.export(export_path)'}
cell_index = 19
exec_reply = {'buffers': [], 'content': {'ename': 'FileNotFoundError', 'engine_info': {'engine_id': -1, 'engine_uuid': 'e8f74e51-37...e, 'engine': 'e8f74e51-374c-4188-a0c5-c2bef5a8651e', 'started': '2022-12-20T18:23:15.307906Z', 'status': 'error'}, ...}

async def _check_raise_for_error(
    self, cell: NotebookNode, cell_index: int, exec_reply: t.Optional[t.Dict]
) -> None:

    if exec_reply is None:
        return None

    exec_reply_content = exec_reply['content']
    if exec_reply_content['status'] != 'error':
        return None

    cell_allows_errors = (not self.force_raise_errors) and (
        self.allow_errors
        or exec_reply_content.get('ename') in self.allow_error_names
        or "raises-exception" in cell.metadata.get("tags", [])
    )
    await run_hook(self.on_cell_error, cell=cell, cell_index=cell_index)
    if not cell_allows_errors:
      raise CellExecutionError.from_cell_and_msg(cell, exec_reply_content)

E nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
E ------------------
E export_path = os.path.join(input_path, "ensemble")
E ens_conf, node_confs = ensemble.export(export_path)
E ------------------
E
E ---------------------------------------------------------------------------
E FileNotFoundError                         Traceback (most recent call last)
E Cell In [10], line 2
E       1 export_path = os.path.join(input_path, "ensemble")
E ----> 2 ens_conf, node_confs = ensemble.export(export_path)
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ensemble.py:153, in Ensemble.export(self, export_path, runtime, **kwargs)
E     148 """
E     149 Write out an ensemble model configuration directory. The exported
E     150 ensemble is designed for use with Triton Inference Server.
E     151 """
E     152 runtime = runtime or TritonExecutorRuntime()
E --> 153 return runtime.export(self, export_path, **kwargs)
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/runtime.py:135, in TritonExecutorRuntime.export(self, ensemble, path, version, name)
E     132 if node_config is not None:
E     133     node_configs.append(node_config)
E --> 135 executor_config = self._executor_model_export(path, name, ensemble)
E     137 return (executor_config, node_configs)
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/runtime.py:206, in TritonExecutorRuntime._executor_model_export(self, path, export_name, ensemble, params, node_id, version)
E     198 with importlib.resources.path(
E     199     "merlin.systems.triton.models", "executor_model.py"
E     200 ) as executor_model:
E     201     copyfile(
E     202         executor_model,
E     203         os.path.join(node_export_path, str(version), "model.py"),
E     204     )
E --> 206 ensemble.save(os.path.join(node_export_path, str(version), "ensemble"))
E     208 return config
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ensemble.py:107, in Ensemble.save(self, path)
E     105 # dump out the full workflow (graph/stats/operators etc) using cloudpickle
E     106 with fs.open(fs.sep.join([path, "ensemble.pkl"]), "wb") as o:
E --> 107     cloudpickle.dump(self, o)
E
E File /usr/local/lib/python3.8/dist-packages/cloudpickle/cloudpickle_fast.py:55, in dump(obj, file, protocol, buffer_callback)
E      45 def dump(obj, file, protocol=None, buffer_callback=None):
E      46     """Serialize obj as bytes streamed into file
E      47
E      48     protocol defaults to cloudpickle.DEFAULT_PROTOCOL which is an alias to
E     (...)
E      53     compatibility with older versions of Python.
E      54     """
E ---> 55     CloudPickler(
E      56         file, protocol=protocol, buffer_callback=buffer_callback
E      57     ).dump(obj)
E
E File /usr/local/lib/python3.8/dist-packages/cloudpickle/cloudpickle_fast.py:632, in CloudPickler.dump(self, obj)
E     630 def dump(self, obj):
E     631     try:
E --> 632         return Pickler.dump(self, obj)
E     633     except RuntimeError as e:
E     634         if "recursion" in e.args[0]:
E
E File ~/.local/lib/python3.8/site-packages/keras/engine/training.py:324, in Model.__reduce__(self)
E     321 def __reduce__(self):
E     322     if self.built:
E     323         return (pickle_utils.deserialize_model_from_bytecode,
E --> 324                 pickle_utils.serialize_model_as_bytecode(self))
E     325     else:
E     326         # SavedModel (and hence serialize_model_as_bytecode) only support
E     327         # built models, but if the model is not built,
E     (...)
E     331         # Thus we call up the superclass hierarchy to get an implementation of
E     332         # __reduce__ that can pickle this Model as a plain Python object.
E     333         return super(Model, self).__reduce__()
E
E File ~/.local/lib/python3.8/site-packages/keras/saving/pickle_utils.py:64, in serialize_model_as_bytecode(model)
E      54 """Convert a Keras Model into a bytecode representation for pickling.
E      55
E      56 Args:
E     (...)
E      61     deserialize_from_bytecode.
E      62 """
E      63 temp_dir = f"ram://{uuid.uuid4()}"
E ---> 64 model.save(temp_dir)
E      65 b = io.BytesIO()
E      66 with tarfile.open(fileobj=b, mode="w") as archive:
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/models/base.py:1189, in Model.save(self, export_path, include_optimizer, save_traces)
E    1187 input_schema = self.schema
E    1188 output_schema = get_output_schema(export_path)
E -> 1189 save_merlin_metadata(export_path, self, input_schema, output_schema)
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/models/io.py:36, in save_merlin_metadata(export_path, model, input_schema, output_schema)
E      34 export_path = pathlib.Path(export_path)
E      35 merlin_metadata_dir = export_path / _MERLIN_METADATA_DIR_NAME
E ---> 36 merlin_metadata_dir.mkdir(exist_ok=True)
E      38 if input_schema is not None:
E      39     schema_to_tensorflow_metadata_json(
E      40         input_schema,
E      41         merlin_metadata_dir / "input_schema.json",
E      42     )
E
E File /usr/lib/python3.8/pathlib.py:1288, in Path.mkdir(self, mode, parents, exist_ok)
E    1286     self._raise_closed()
E    1287 try:
E -> 1288     self._accessor.mkdir(self, mode)
E    1289 except FileNotFoundError:
E    1290     if not parents or self.parent == self:
E
E FileNotFoundError: [Errno 2] No such file or directory: 'ram:/1a1f2b6f-32e0-4a5c-8947-7fbeeb3ee8c7/.merlin'
E FileNotFoundError: [Errno 2] No such file or directory: 'ram:/1a1f2b6f-32e0-4a5c-8947-7fbeeb3ee8c7/.merlin'

../../../.local/lib/python3.8/site-packages/nbclient/client.py:862: CellExecutionError

During handling of the above exception, another exception occurred:

def test_func():
    with testbook(
        REPO_ROOT / "examples" / "scaling-criteo" / "02-ETL-with-NVTabular.ipynb",
        execute=False,
        timeout=180,
    ) as tb1:
        tb1.inject(
            """
            import os
            os.environ["BASE_DIR"] = "/tmp/input/criteo/"
            os.environ["INPUT_DATA_DIR"] = "/tmp/input/criteo/"
            os.environ["OUTPUT_DATA_DIR"] = "/tmp/output/criteo/"
            os.system("mkdir -p /tmp/input/criteo")
            os.system("mkdir -p /tmp/output/criteo")

            from merlin.datasets.synthetic import generate_data

            train, valid = generate_data("criteo", int(100000), set_sizes=(0.7, 0.3))

            train.to_ddf().compute().to_parquet('/tmp/input/criteo/day_0.parquet')
            valid.to_ddf().compute().to_parquet('/tmp/input/criteo/day_1.parquet')
            """
        )
        tb1.execute()
        assert os.path.isfile("/tmp/output/criteo/train/part_0.parquet")
        assert os.path.isfile("/tmp/output/criteo/valid/part_0.parquet")
        assert os.path.isfile("/tmp/output/criteo/workflow/metadata.json")

    with testbook(
        REPO_ROOT
        / "examples"
        / "scaling-criteo"
        / "03-Training-with-Merlin-Models-TensorFlow.ipynb",
        execute=False,
        timeout=180,
    ) as tb2:
        tb2.inject(
            """
            import os
            os.environ["INPUT_DATA_DIR"] = "/tmp/output/criteo/"
            """
        )
        tb2.execute()
        metrics = tb2.ref("eval_metrics")
        assert set(metrics.keys()) == set(
            [
                "auc",
                "binary_accuracy",
                "loss",
                "precision",
                "recall",
                "regularization_loss",
                "loss_batch",
            ]
        )
        assert os.path.isfile("/tmp/output/criteo/dlrm/saved_model.pb")

    with testbook(
        REPO_ROOT
        / "examples"
        / "scaling-criteo"
        / "04-Triton-Inference-with-Merlin-Models-TensorFlow.ipynb",
        execute=False,
        timeout=180,
    ) as tb3:
        tb3.inject(
            """
            import os
            os.environ["BASE_DIR"] = "/tmp/output/criteo/"
            os.environ["INPUT_FOLDER"] = "/tmp/input/criteo/"
            """
        )
        NUM_OF_CELLS = len(tb3.cells)
      tb3.execute_cell(list(range(0, NUM_OF_CELLS - 5)))

tests/unit/examples/test_scaling_criteo_merlin_models.py:83:


self = <testbook.client.TestbookNotebookClient object at 0x7f6420549730>
cell = {'cell_type': 'markdown', 'id': 'd1c899fd', 'metadata': {}, 'source': 'We generate the Triton Inference Server artifacts and export them in the export_path directory.'}
kwargs = {}, cell_indexes = [0, 1, 2, 3, 4, 5, ...]
executed_cells = [{'cell_type': 'code', 'execution_count': 2, 'id': '47c9026a', 'metadata': {'execution': {'iopub.status.busy': '2022-1...: 'markdown', 'id': '25a6ac04', 'metadata': {}, 'source': 'We define the path for the saved workflow and model.'}, ...]
idx = 19

def execute_cell(self, cell, **kwargs) -> Union[Dict, List[Dict]]:
    """
    Executes a cell or list of cells
    """
    if isinstance(cell, slice):
        start, stop = self._cell_index(cell.start), self._cell_index(cell.stop)
        if cell.step is not None:
            raise TestbookError('testbook does not support step argument')

        cell = range(start, stop + 1)
    elif isinstance(cell, str) or isinstance(cell, int):
        cell = [cell]

    cell_indexes = cell

    if all(isinstance(x, str) for x in cell):
        cell_indexes = [self._cell_index(tag) for tag in cell]

    executed_cells = []
    for idx in cell_indexes:
        try:
            cell = super().execute_cell(self.nb['cells'][idx], idx, **kwargs)
        except CellExecutionError as ce:
          raise TestbookRuntimeError(ce.evalue, ce, self._get_error_class(ce.ename))
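Note: the `execute_cell` normalization shown above (a slice becomes an inclusive range of cell indexes; a single index or tag becomes a one-element list) can be sketched in isolation. `cells_to_indexes` below is a hypothetical stand-alone mirror of that logic for illustration, not part of testbook's API:

```python
def cells_to_indexes(cell):
    # Mirror of testbook's execute_cell() argument normalization (sketch):
    # a slice is converted to an *inclusive* range of cell indexes,
    # while a single index or tag becomes a one-element list.
    if isinstance(cell, slice):
        if cell.step is not None:
            raise ValueError("testbook does not support step argument")
        return list(range(cell.start, cell.stop + 1))
    if isinstance(cell, (str, int)):
        return [cell]
    return list(cell)

print(cells_to_indexes(slice(0, 3)))  # [0, 1, 2, 3] -- the stop index is included
print(cells_to_indexes(19))           # [19]
```

This is why the test above passes `list(range(0, NUM_OF_CELLS - 5))` directly: an explicit list of indexes is iterated as-is, with no inclusive-stop adjustment.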

E testbook.exceptions.TestbookRuntimeError: An error occurred while executing the following cell:
E ------------------
E export_path = os.path.join(input_path, "ensemble")
E ens_conf, node_confs = ensemble.export(export_path)
E ------------------
E
E ---------------------------------------------------------------------------
E FileNotFoundError                         Traceback (most recent call last)
E Cell In [10], line 2
E       1 export_path = os.path.join(input_path, "ensemble")
E ----> 2 ens_conf, node_confs = ensemble.export(export_path)
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ensemble.py:153, in Ensemble.export(self, export_path, runtime, **kwargs)
E     148     """
E     149     Write out an ensemble model configuration directory. The exported
E     150     ensemble is designed for use with Triton Inference Server.
E     151     """
E     152     runtime = runtime or TritonExecutorRuntime()
E --> 153     return runtime.export(self, export_path, **kwargs)
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/runtime.py:135, in TritonExecutorRuntime.export(self, ensemble, path, version, name)
E     132     if node_config is not None:
E     133         node_configs.append(node_config)
E --> 135     executor_config = self._executor_model_export(path, name, ensemble)
E     137     return (executor_config, node_configs)
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/runtime.py:206, in TritonExecutorRuntime._executor_model_export(self, path, export_name, ensemble, params, node_id, version)
E     198     with importlib.resources.path(
E     199         "merlin.systems.triton.models", "executor_model.py"
E     200     ) as executor_model:
E     201         copyfile(
E     202             executor_model,
E     203             os.path.join(node_export_path, str(version), "model.py"),
E     204         )
E --> 206     ensemble.save(os.path.join(node_export_path, str(version), "ensemble"))
E     208     return config
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ensemble.py:107, in Ensemble.save(self, path)
E     105     # dump out the full workflow (graph/stats/operators etc) using cloudpickle
E     106     with fs.open(fs.sep.join([path, "ensemble.pkl"]), "wb") as o:
E --> 107         cloudpickle.dump(self, o)
E
E File /usr/local/lib/python3.8/dist-packages/cloudpickle/cloudpickle_fast.py:55, in dump(obj, file, protocol, buffer_callback)
E      45 def dump(obj, file, protocol=None, buffer_callback=None):
E      46     """Serialize obj as bytes streamed into file
E      47
E      48     protocol defaults to cloudpickle.DEFAULT_PROTOCOL which is an alias to
E     (...)
E      53     compatibility with older versions of Python.
E      54     """
E ---> 55     CloudPickler(
E      56         file, protocol=protocol, buffer_callback=buffer_callback
E      57     ).dump(obj)
E
E File /usr/local/lib/python3.8/dist-packages/cloudpickle/cloudpickle_fast.py:632, in CloudPickler.dump(self, obj)
E     630 def dump(self, obj):
E     631     try:
E --> 632         return Pickler.dump(self, obj)
E     633     except RuntimeError as e:
E     634         if "recursion" in e.args[0]:
E
E File ~/.local/lib/python3.8/site-packages/keras/engine/training.py:324, in Model.__reduce__(self)
E     321 def __reduce__(self):
E     322     if self.built:
E     323         return (pickle_utils.deserialize_model_from_bytecode,
E --> 324                 pickle_utils.serialize_model_as_bytecode(self))
E     325     else:
E     326         # SavedModel (and hence serialize_model_as_bytecode) only support
E     327         # built models, but if the model is not built,
E     (...)
E     331         # Thus we call up the superclass hierarchy to get an implementation of
E     332         # __reduce__ that can pickle this Model as a plain Python object.
E     333         return super(Model, self).__reduce__()
E
E File ~/.local/lib/python3.8/site-packages/keras/saving/pickle_utils.py:64, in serialize_model_as_bytecode(model)
E      54     """Convert a Keras Model into a bytecode representation for pickling.
E      55
E      56     Args:
E     (...)
E      61         deserialize_from_bytecode.
E      62     """
E      63     temp_dir = f"ram://{uuid.uuid4()}"
E ---> 64     model.save(temp_dir)
E      65     b = io.BytesIO()
E      66     with tarfile.open(fileobj=b, mode="w") as archive:
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/models/base.py:1189, in Model.save(self, export_path, include_optimizer, save_traces)
E    1187     input_schema = self.schema
E    1188     output_schema = get_output_schema(export_path)
E -> 1189     save_merlin_metadata(export_path, self, input_schema, output_schema)
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/models/io.py:36, in save_merlin_metadata(export_path, model, input_schema, output_schema)
E      34     export_path = pathlib.Path(export_path)
E      35     merlin_metadata_dir = export_path / _MERLIN_METADATA_DIR_NAME
E ---> 36     merlin_metadata_dir.mkdir(exist_ok=True)
E      38     if input_schema is not None:
E      39         schema_to_tensorflow_metadata_json(
E      40             input_schema,
E      41             merlin_metadata_dir / "input_schema.json",
E      42         )
E
E File /usr/lib/python3.8/pathlib.py:1288, in Path.mkdir(self, mode, parents, exist_ok)
E    1286     self._raise_closed()
E    1287     try:
E -> 1288         self._accessor.mkdir(self, mode)
E    1289     except FileNotFoundError:
E    1290         if not parents or self.parent == self:
E
E FileNotFoundError: [Errno 2] No such file or directory: 'ram:/1a1f2b6f-32e0-4a5c-8947-7fbeeb3ee8c7/.merlin'
E FileNotFoundError: [Errno 2] No such file or directory: 'ram:/1a1f2b6f-32e0-4a5c-8947-7fbeeb3ee8c7/.merlin'

../../../.local/lib/python3.8/site-packages/testbook/client.py:135: TestbookRuntimeError
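Note: the failure above is reproducible without TensorFlow or Keras. This minimal sketch (path value taken from the traceback) shows how `pathlib` collapses the `ram://` URL that Keras' in-memory save produces, so `mkdir()` ends up targeting the local disk:

```python
import pathlib

# Keras' pickle_utils saves the model to an in-memory TensorFlow filesystem
# location such as "ram://<uuid>". save_merlin_metadata() then wraps that URL
# in pathlib.Path, which parses it as an ordinary POSIX path and collapses
# the double slash:
metadata_dir = pathlib.Path("ram://1a1f2b6f-32e0-4a5c-8947-7fbeeb3ee8c7") / ".merlin"
print(metadata_dir)  # ram:/1a1f2b6f-32e0-4a5c-8947-7fbeeb3ee8c7/.merlin

# mkdir() therefore runs against the local filesystem, where neither "ram:"
# nor the uuid directory exists, producing the FileNotFoundError seen above:
try:
    metadata_dir.mkdir(exist_ok=True)
except FileNotFoundError as exc:
    print(type(exc).__name__)  # FileNotFoundError
```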
----------------------------- Captured stderr call -----------------------------
2022-12-20 18:21:35,995 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
2022-12-20 18:21:36,004 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
2022-12-20 18:21:36,014 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
2022-12-20 18:21:36,030 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
/usr/lib/python3.8/multiprocessing/resource_tracker.py:216: UserWarning: resource_tracker: There appear to be 45 leaked semaphore objects to clean up at shutdown
warnings.warn('resource_tracker: There appear to be %d '
2022-12-20 18:21:55.973799: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-12-20 18:22:00.054717: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 0
2022-12-20 18:22:00.054821: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 8139 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 18:22:00.055601: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 1
2022-12-20 18:22:00.055659: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:1 with 15149 MB memory: -> device: 1, name: Tesla P100-DGXS-16GB, pci bus id: 0000:08:00.0, compute capability: 6.0
2022-12-20 18:22:00.056258: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 2
2022-12-20 18:22:00.056310: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:2 with 15149 MB memory: -> device: 2, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0e:00.0, compute capability: 6.0
2022-12-20 18:22:00.056908: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 3
2022-12-20 18:22:00.056957: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:3 with 15149 MB memory: -> device: 3, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0f:00.0, compute capability: 6.0
Error in atexit._run_exitfuncs:
Traceback (most recent call last):
File "/usr/lib/python3.8/logging/__init__.py", line 2127, in shutdown
h.close()
File "/usr/local/lib/python3.8/dist-packages/absl/logging/__init__.py", line 934, in close
self.stream.close()
File "/usr/local/lib/python3.8/dist-packages/ipykernel/iostream.py", line 438, in close
self.watch_fd_thread.join()
AttributeError: 'OutStream' object has no attribute 'watch_fd_thread'
2022-12-20 18:22:58.198738: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-12-20 18:23:02.249505: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 0
2022-12-20 18:23:02.249611: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 8139 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 18:23:02.250373: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 1
2022-12-20 18:23:02.250433: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:1 with 15149 MB memory: -> device: 1, name: Tesla P100-DGXS-16GB, pci bus id: 0000:08:00.0, compute capability: 6.0
2022-12-20 18:23:02.251096: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 2
2022-12-20 18:23:02.251146: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:2 with 15149 MB memory: -> device: 2, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0e:00.0, compute capability: 6.0
2022-12-20 18:23:02.251772: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 3
2022-12-20 18:23:02.251820: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:3 with 15149 MB memory: -> device: 3, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0f:00.0, compute capability: 6.0
Error in atexit._run_exitfuncs:
Traceback (most recent call last):
File "/usr/lib/python3.8/logging/__init__.py", line 2127, in shutdown
h.close()
File "/usr/local/lib/python3.8/dist-packages/absl/logging/__init__.py", line 934, in close
self.stream.close()
File "/usr/local/lib/python3.8/dist-packages/ipykernel/iostream.py", line 438, in close
self.watch_fd_thread.join()
AttributeError: 'OutStream' object has no attribute 'watch_fd_thread'
=============================== warnings summary ===============================
../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33
/usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
DASK_VERSION = LooseVersion(dask.__version__)

.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: 34 warnings
/var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
other = LooseVersion(other)

tests/unit/examples/test_z_legacy_notebooks.py::test_movielens_example
tests/unit/examples/test_z_legacy_notebooks.py::test_movielens_example
/var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/nvtabular/tools/data_gen.py:126: DeprecationWarning: np.long is a deprecated alias for np.compat.long. To silence this warning, use np.compat.long by itself. In the likely event your code does not need to work on Python 2 you can use the builtin int for which np.compat.long is itself an alias. Doing this will not modify any behaviour and is safe. When replacing np.long, you may wish to use e.g. np.int64 or np.int32 to specify the precision. If you wish to review your current use, check the release note link for additional information.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
col_size + 1, dtype=np.long, min_val=col.multi_min, max_val=col.multi_max

tests/unit/examples/test_z_legacy_notebooks.py: 12 warnings
/var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/nvtabular/tools/data_gen.py:140: DeprecationWarning: np.long is a deprecated alias for np.compat.long. To silence this warning, use np.compat.long by itself. In the likely event your code does not need to work on Python 2 you can use the builtin int for which np.compat.long is itself an alias. Doing this will not modify any behaviour and is safe. When replacing np.long, you may wish to use e.g. np.int64 or np.int32 to specify the precision. If you wish to review your current use, check the release note link for additional information.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
col_size, dtype=np.long, min_val=col.min_val, max_val=col.cardinality
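The deprecation above suggests its own fix; a minimal sketch of the substitution (using `np.int64` as the warning recommends; the array shape is illustrative, not taken from data_gen.py):

```python
import numpy as np

# np.long was an alias for Python's built-in int. The NumPy 1.20 deprecation
# recommends an explicit precision such as np.int64 (or the builtin int) in
# its place, which is a drop-in dtype substitution here.
arr = np.zeros(4, dtype=np.int64)
print(arr.dtype)  # int64
```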

tests/unit/examples/test_z_legacy_notebooks.py::test_movielens_example
/var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/nvtabular/loader/__init__.py:19: DeprecationWarning: The nvtabular.loader module has moved to a new repository, at https://github.com/NVIDIA-Merlin/dataloader . Support for importing from nvtabular.loader is deprecated, and will be removed in a future version. Please update your imports to refer to merlinloader.
warnings.warn(

tests/unit/examples/test_z_legacy_notebooks.py::test_movielens_example
/var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/__init__.py:23: DeprecationWarning: The merlin.loader package has been moved to merlin.dataloader. Please update your imports, importing from merlin.loader is deprecated and will be removed in a future version
warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Cover

.tox/test-gpu/lib/python3.8/site-packages/merlin/core/__init__.py 2 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/core/_version.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/core/compat.py 10 4 60%
.tox/test-gpu/lib/python3.8/site-packages/merlin/core/dispatch.py 366 166 55%
.tox/test-gpu/lib/python3.8/site-packages/merlin/core/protocols.py 99 45 55%
.tox/test-gpu/lib/python3.8/site-packages/merlin/core/utils.py 197 98 50%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/__init__.py 5 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/base_operator.py 121 16 87%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/dictarray.py 55 31 44%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/executors.py 141 26 82%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/graph.py 99 26 74%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/node.py 344 110 68%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/ops/__init__.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/ops/concat_columns.py 17 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/ops/selection.py 22 1 95%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/ops/subset_columns.py 12 2 83%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/ops/subtraction.py 21 11 48%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/selector.py 101 26 74%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dataloader/__init__.py 2 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dataloader/_version.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dataloader/loader_base.py 471 104 78%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dataloader/tensorflow.py 114 38 67%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dataloader/tf_utils.py 57 27 53%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dataloader/torch.py 66 33 50%
.tox/test-gpu/lib/python3.8/site-packages/merlin/datasets/__init__.py 2 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/datasets/ecommerce/__init__.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/datasets/ecommerce/aliccp/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/datasets/ecommerce/aliccp/dataset.py 141 102 28%
.tox/test-gpu/lib/python3.8/site-packages/merlin/datasets/ecommerce/booking/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/datasets/ecommerce/booking/dataset.py 127 100 21%
.tox/test-gpu/lib/python3.8/site-packages/merlin/datasets/ecommerce/dressipi/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/datasets/ecommerce/dressipi/dataset.py 45 37 18%
.tox/test-gpu/lib/python3.8/site-packages/merlin/datasets/synthetic.py 155 61 61%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/__init__.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/csv.py 57 11 81%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dask.py 181 96 47%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataframe_engine.py 61 36 41%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataframe_iter.py 21 2 90%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py 347 123 65%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset_engine.py 37 8 78%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/fsspec_utils.py 127 108 15%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/hugectr.py 45 35 22%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/parquet.py 624 335 46%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/shuffle.py 38 12 68%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/worker.py 80 11 86%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/writer.py 190 64 66%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/writer_factory.py 18 5 72%
.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/__init__.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/tensorflow.py 1 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/torch.py 1 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/__init__.py 2 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/_version.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/api.py 14 5 64%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/config/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/config/schema.py 62 16 74%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/io.py 15 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/loader/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/loader/backend.py 16 5 69%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/__init__.py 69 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/cross.py 44 28 36%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/dlrm.py 49 8 84%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/experts.py 158 122 23%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/interaction.py 108 55 49%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/mlp.py 117 58 50%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/optimizer.py 173 127 27%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/retrieval/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/retrieval/base.py 175 95 46%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/retrieval/matrix_factorization.py 35 19 46%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/retrieval/two_tower.py 30 4 87%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/sampling/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/sampling/base.py 29 7 76%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/sampling/cross_batch.py 46 31 33%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/sampling/in_batch.py 35 12 66%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/sampling/queue.py 115 99 14%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/core/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/core/aggregation.py 241 107 56%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/core/base.py 242 108 55%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/core/combinators.py 426 150 65%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/core/encoder.py 182 125 31%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/core/index.py 106 71 33%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/core/prediction.py 50 19 62%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/core/tabular.py 280 71 75%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/distributed/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/distributed/backend.py 9 2 78%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/inputs/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/inputs/base.py 64 39 39%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/inputs/continuous.py 39 4 90%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/inputs/embedding.py 458 154 66%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/loader.py 128 70 45%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/losses/__init__.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/losses/base.py 9 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/losses/listwise.py 13 1 92%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/losses/pairwise.py 115 57 50%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/metrics/__init__.py 2 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/metrics/topk.py 198 82 59%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/models/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/models/base.py 782 350 55%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/models/benchmark.py 16 6 62%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/models/ranking.py 67 43 36%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/models/retrieval.py 78 44 44%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/models/utils.py 10 2 80%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/base.py 123 90 27%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/classification.py 91 51 44%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/contrastive.py 147 107 27%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/regression.py 9 2 78%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/sampling/init.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/sampling/base.py 78 41 47%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/sampling/in_batch.py 37 22 41%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/sampling/popularity.py 27 17 37%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/topk.py 98 63 36%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/prediction_tasks/init.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/prediction_tasks/base.py 207 108 48%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/prediction_tasks/classification.py 68 22 68%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/prediction_tasks/multi.py 7 1 86%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/prediction_tasks/next_item.py 59 33 44%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/prediction_tasks/regression.py 35 19 46%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/prediction_tasks/retrieval.py 73 31 58%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transformers/init.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transformers/block.py 102 55 46%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transformers/transforms.py 87 29 67%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transforms/init.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transforms/bias.py 111 77 31%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transforms/features.py 435 346 20%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transforms/noise.py 43 28 35%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transforms/regularization.py 17 6 65%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transforms/sequence.py 302 227 25%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transforms/tensor.py 165 79 52%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/typing.py 7 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/utils/init.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/utils/batch_utils.py 85 12 86%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/utils/repr_utils.py 69 48 30%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/utils/search_utils.py 34 22 35%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/utils/tf_utils.py 209 141 33%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/init.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/constants.py 3 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/dataset.py 38 18 53%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/dependencies.py 26 19 27%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/doc_utils.py 10 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/example_utils.py 31 10 68%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/misc_utils.py 118 90 24%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/nvt_utils.py 27 24 11%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/registry.py 101 31 69%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/schema_utils.py 90 39 57%
.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/init.py 2 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/io/init.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/io/proto_utils.py 20 4 80%
.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/io/schema_bp.py 306 7 98%
.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/io/tensorflow_metadata.py 190 33 83%
.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/schema.py 229 56 76%
.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/tags.py 82 5 94%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/init.py 6 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/_version.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/init.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/dictarray.py 172 116 33%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ensemble.py 46 3 93%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/node.py 23 2 91%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/op_runner.py 26 19 27%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/init.py 11 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/compat.py 32 8 75%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/faiss.py 77 15 81%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/feast.py 118 56 53%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/fil.py 221 125 43%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/operator.py 79 32 59%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/pytorch.py 49 32 35%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/session_filter.py 45 28 38%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/softmax_sampling.py 51 21 59%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/tensorflow.py 73 23 68%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/unroll_features.py 50 21 58%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/workflow.py 39 11 72%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/init.py 1 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/base_runtime.py 11 2 82%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/init.py 1 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/ops/init.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/ops/fil.py 95 66 31%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/ops/operator.py 12 1 92%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/ops/pytorch.py 62 37 40%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/ops/tensorflow.py 53 4 92%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/ops/workflow.py 47 15 68%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/runtime.py 90 4 96%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/model_registry.py 16 8 50%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/init.py 49 34 31%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/conversions.py 143 120 16%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/export.py 268 210 22%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/models/init.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/utils.py 72 26 64%

TOTAL 15503 7158 54%

=========================== short test summary info ============================
SKIPPED [1] tests/unit/examples/test_scaling_criteo_merlin_models_hugectr.py:7: could not import 'hugectr': No module named 'hugectr'
======= 2 failed, 4 passed, 1 skipped, 51 warnings in 420.62s (0:07:00) ========
ERROR: InvocationError for command /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu/bin/python -m pytest --cov-report term --cov merlin -rxs tests/unit (exited with code 1)
___________________________________ summary ____________________________________
ERROR: test-gpu: commands failed
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://api.GitHub.com/repos/NVIDIA-Merlin/Merlin/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[workspace] $ /bin/bash /tmp/jenkins12205923634640007256.sh

@nvidia-merlin-bot
Contributor

Click to view CI Results
GitHub pull request #771 of commit 4a7b3d05fb145f47d45d303af17c62619be3cac2, no merge conflicts.
Running as SYSTEM
Setting status of 4a7b3d05fb145f47d45d303af17c62619be3cac2 to PENDING with url http://merlin-infra1.nvidia.com:8080/job/merlin_merlin/707/ and message: 'Pending'
Using context: Jenkins
Building on the built-in node in workspace /var/jenkins_home/jobs/merlin_merlin/workspace
using credential systems-login
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/Merlin # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/Merlin
 > git --version # timeout=10
using GIT_ASKPASS to set credentials login for merlin-systems
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/Merlin +refs/pull/771/*:refs/remotes/origin/pr/771/* # timeout=10
 > git rev-parse 4a7b3d05fb145f47d45d303af17c62619be3cac2^{commit} # timeout=10
Checking out Revision 4a7b3d05fb145f47d45d303af17c62619be3cac2 (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4a7b3d05fb145f47d45d303af17c62619be3cac2 # timeout=10
Commit message: "adding tf gpu allocator env var"
 > git rev-list --no-walk 9d6ed0540beb47608af1d13c8dac8342d11293d4 # timeout=10
[workspace] $ /bin/bash /tmp/jenkins4252991186866989862.sh
GLOB sdist-make: /var/jenkins_home/workspace/merlin_merlin/merlin/setup.py
test-gpu recreate: /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu
test-gpu installdeps: pytest, pytest-cov
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu inst: /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/.tmp/package/1/merlin-0.0.1.zip
WARNING: Discarding $PYTHONPATH from environment, to override specify PYTHONPATH in 'passenv' in your configuration.
test-gpu installed: absl-py==1.2.0,aiohttp==3.8.1,aiosignal==1.2.0,alabaster==0.7.12,alembic==1.9.0,anyio==3.6.1,argon2-cffi==21.3.0,argon2-cffi-bindings==21.2.0,astroid==2.5.6,asttokens==2.0.8,astunparse==1.6.3,asv==0.5.1,asvdb==0.4.2,async-timeout==4.0.2,attrs==22.1.0,autopage==0.5.1,awscli==1.27.33,Babel==2.10.3,backcall==0.2.0,beautifulsoup4==4.11.1,betterproto==1.2.5,black==22.6.0,bleach==5.0.1,boto3==1.24.75,botocore==1.29.33,Brotli==1.0.9,cachetools==5.2.0,certifi==2019.11.28,cffi==1.15.1,chardet==3.0.4,charset-normalizer==2.1.1,clang==5.0,click==8.1.3,cliff==4.1.0,cloudpickle==2.2.0,cmaes==0.9.0,cmake==3.24.1.1,cmd2==2.4.2,colorama==0.4.4,colorlog==6.7.0,contourpy==1.0.5,coverage==7.0.0,cuda-python==11.7.1,cupy-cuda117==10.6.0,cycler==0.11.0,Cython==0.29.32,dask==2022.1.1,dbus-python==1.2.16,debugpy==1.6.3,decorator==5.1.1,defusedxml==0.7.1,dill==0.3.5.1,distlib==0.3.6,distributed==2022.5.1,distro==1.7.0,dm-tree==0.1.6,docker-pycreds==0.4.0,docutils==0.16,emoji==1.7.0,entrypoints==0.4,execnet==1.9.0,executing==1.0.0,faiss==1.7.2,faiss-gpu==1.7.2,fastai==2.7.9,fastapi==0.85.0,fastavro==1.6.1,fastcore==1.5.27,fastdownload==0.0.7,fastjsonschema==2.16.1,fastprogress==1.0.3,fastrlock==0.8,feast==0.19.4,fiddle==0.2.2,filelock==3.8.0,flatbuffers==1.12,fonttools==4.37.3,frozenlist==1.3.1,fsspec==2022.5.0,gast==0.4.0,gevent==21.12.0,geventhttpclient==2.0.2,gitdb==4.0.9,GitPython==3.1.27,google==3.0.0,google-api-core==2.10.1,google-auth==2.11.1,google-auth-oauthlib==0.4.6,google-pasta==0.2.0,googleapis-common-protos==1.52.0,graphviz==0.20.1,greenlet==1.1.3,grpcio==1.41.0,grpcio-channelz==1.49.0,grpcio-reflection==1.48.1,grpclib==0.4.3,h11==0.13.0,h2==4.1.0,h5py==3.7.0,HeapDict==1.0.1,horovod==0.26.1,hpack==4.0.0,httptools==0.5.0,hugectr2onnx==0.0.0,huggingface-hub==0.9.1,hyperframe==6.0.1,idna==2.8,imagesize==1.4.1,implicit==0.6.1,importlib-metadata==4.12.0,importlib-resources==5.9.0,iniconfig==1.1.1,ipykernel==6.15.3,ipython==8.5.0,ipython-genutils==0.2.0,ipywidgets==7.7.0,jedi==0.18.1,Jinja2==3.1.2,jmespath==1.0.1,joblib==1.2.0,json5==0.9.10,jsonschema==4.16.0,jupyter-cache==0.4.3,jupyter-core==4.11.1,jupyter-server==1.18.1,jupyter-server-mathjax==0.2.5,jupyter-sphinx==0.3.2,jupyter_client==7.3.5,jupyterlab==3.4.7,jupyterlab-pygments==0.2.2,jupyterlab-widgets==1.1.0,jupyterlab_server==2.15.1,keras==2.9.0,Keras-Preprocessing==1.1.2,kiwisolver==1.4.4,lazy-object-proxy==1.8.0,libclang==14.0.6,libcst==0.4.7,lightfm==1.16,lightgbm==3.3.2,linkify-it-py==1.0.3,llvmlite==0.39.1,locket==1.0.0,lxml==4.9.1,Mako==1.2.4,Markdown==3.4.1,markdown-it-py==1.1.0,MarkupSafe==2.1.1,matplotlib==3.6.0,matplotlib-inline==0.1.6,mdit-py-plugins==0.2.8,merlin==0.0.1,merlin-core==0.6.0+1.g5926fcf,merlin-models==0.7.0+11.g280956aa4,merlin-systems==0.5.0+4.g15074ad,mistune==2.0.4,mmh3==3.0.0,mpi4py==3.1.3,msgpack==1.0.4,multidict==6.0.2,mypy-extensions==0.4.3,myst-nb==0.13.2,myst-parser==0.15.2,natsort==8.1.0,nbclassic==0.4.3,nbclient==0.6.8,nbconvert==7.0.0,nbdime==3.1.1,nbformat==5.5.0,nest-asyncio==1.5.5,ninja==1.10.2.3,notebook==6.4.12,notebook-shim==0.1.0,numba==0.56.2,numpy==1.22.4,nvidia-pyindex==1.0.9,# Editable install with no version control (nvtabular==1.4.0+8.g95e12d347),-e /usr/local/lib/python3.8/dist-packages,nvtx==0.2.5,oauthlib==3.2.1,oldest-supported-numpy==2022.8.16,onnx==1.12.0,onnxruntime==1.11.1,opt-einsum==3.3.0,optuna==3.0.5,packaging==21.3,pandas==1.3.5,pandavro==1.5.2,pandocfilters==1.5.0,parso==0.8.3,partd==1.3.0,pathtools==0.1.2,pbr==5.11.0,pexpect==4.8.0,pickleshare==0.7.5,Pillow==9.2.0,pkgutil_resolve_name==1.3.10,platformdirs==2.5.2,plotly==5.11.0,pluggy==1.0.0,prettytable==3.5.0,prometheus-client==0.14.1,promise==2.3,prompt-toolkit==3.0.31,proto-plus==1.19.6,protobuf==3.19.5,psutil==5.9.2,ptyprocess==0.7.0,pure-eval==0.2.2,py==1.11.0,pyarrow==7.0.0,pyasn1==0.4.8,pyasn1-modules==0.2.8,pybind11==2.10.0,pycparser==2.21,pydantic==1.10.2,pydot==1.4.2,Pygments==2.13.0,PyGObject==3.36.0,pynvml==11.4.1,pyparsing==3.0.9,pyperclip==1.8.2,pyrsistent==0.18.1,pytest==7.1.3,pytest-cov==4.0.0,pytest-xdist==3.1.0,python-apt==2.0.0+ubuntu0.20.4.8,python-dateutil==2.8.2,python-dotenv==0.21.0,python-rapidjson==1.8,pytz==2022.2.1,PyYAML==5.4.1,pyzmq==24.0.0,regex==2022.9.13,requests==2.22.0,requests-oauthlib==1.3.1,requests-unixsocket==0.2.0,rsa==4.7.2,s3fs==2022.2.0,s3transfer==0.6.0,sacremoses==0.0.53,scikit-build==0.15.0,scikit-learn==1.1.2,scipy==1.8.1,seedir==0.3.0,Send2Trash==1.8.0,sentry-sdk==1.9.8,setproctitle==1.3.2,setuptools-scm==7.0.5,shortuuid==1.0.9,six==1.15.0,sklearn==0.0,smmap==5.0.0,sniffio==1.3.0,snowballstemmer==2.2.0,sortedcontainers==2.4.0,soupsieve==2.3.2.post1,Sphinx==5.3.0,sphinx-multiversion==0.2.4,sphinx-togglebutton==0.3.1,sphinx_external_toc==0.3.0,sphinxcontrib-applehelp==1.0.2,sphinxcontrib-copydirs @ git+https://github.com/mikemckiernan/sphinxcontrib-copydirs.git@bd8c5d79b3f91cf5f1bb0d6995aeca3fe84b670e,sphinxcontrib-devhelp==1.0.2,sphinxcontrib-htmlhelp==2.0.0,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-qthelp==1.0.3,sphinxcontrib-serializinghtml==1.1.5,SQLAlchemy==1.4.45,stack-data==0.5.0,starlette==0.20.4,stevedore==4.1.1,stringcase==1.2.0,supervisor==4.1.0,tabulate==0.8.10,tblib==1.7.0,tdqm==0.0.1,tenacity==8.0.1,tensorboard==2.9.1,tensorboard-data-server==0.6.1,tensorboard-plugin-wit==1.8.1,tensorflow==2.9.2,tensorflow-estimator==2.9.0,tensorflow-gpu==2.9.2,tensorflow-io-gcs-filesystem==0.27.0,tensorflow-metadata==1.10.0,termcolor==2.0.1,terminado==0.15.0,testbook==0.4.2,threadpoolctl==3.1.0,tinycss2==1.1.1,tokenizers==0.10.3,toml==0.10.2,tomli==2.0.1,toolz==0.12.0,torch==1.12.1+cu113,torchmetrics==0.3.2,tornado==6.2,tox==3.26.0,tqdm==4.64.1,traitlets==5.4.0,transformers==4.12.0,transformers4rec==0.1.12+2.gbcc939255,treelite==2.3.0,treelite-runtime==2.3.0,tritonclient==2.25.0,typing-inspect==0.8.0,typing_extensions==4.3.0,uc-micro-py==1.0.1,urllib3==1.26.12,uvicorn==0.18.3,uvloop==0.17.0,versioneer==0.20,virtualenv==20.16.5,wandb==0.13.3,watchfiles==0.17.0,wcwidth==0.2.5,webencodings==0.5.1,websocket-client==1.4.1,websockets==10.3,Werkzeug==2.2.2,widgetsnbextension==3.6.0,wrapt==1.12.1,xgboost==1.6.2,yarl==1.8.1,zict==2.2.0,zipp==3.8.1,zope.event==4.5.0,zope.interface==5.4.0
test-gpu run-test-pre: PYTHONHASHSEED='4181021585'
test-gpu run-test: commands[0] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/systems.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/systems.git
  Cloning https://github.com/NVIDIA-Merlin/systems.git to /tmp/pip-req-build-w6rulrhw
  Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/systems.git /tmp/pip-req-build-w6rulrhw
  Resolved https://github.com/NVIDIA-Merlin/systems.git to commit df53c726ce5b90c00e1310cb80570a8622f8f785
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Collecting treelite==2.4.0
  Downloading treelite-2.4.0-py3-none-manylinux2014_x86_64.whl (852 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 852.7/852.7 kB 1.9 MB/s eta 0:00:00
Requirement already satisfied: requests<3,>=2.10 in /usr/lib/python3/dist-packages (from merlin-systems==0.7.0+39.gdf53c72) (2.22.0)
Collecting treelite-runtime==2.4.0
  Downloading treelite_runtime-2.4.0-py3-none-manylinux2014_x86_64.whl (191 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 191.5/191.5 kB 3.6 MB/s eta 0:00:00
Requirement already satisfied: nvtabular>=1.0.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-systems==0.7.0+39.gdf53c72) (1.1.1)
Requirement already satisfied: merlin-core>=0.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-systems==0.7.0+39.gdf53c72) (0.3.0+12.g78ecddd)
Requirement already satisfied: numpy in /var/jenkins_home/.local/lib/python3.8/site-packages (from treelite==2.4.0->merlin-systems==0.7.0+39.gdf53c72) (1.20.3)
Requirement already satisfied: scipy in /usr/local/lib/python3.8/dist-packages (from treelite==2.4.0->merlin-systems==0.7.0+39.gdf53c72) (1.8.1)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.10.0)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (21.3)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (3.19.5)
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (0.55.1)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (7.0.0)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.3.5)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (4.64.1)
Requirement already satisfied: dask>=2021.11.2 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (2022.3.0)
Requirement already satisfied: distributed>=2021.11.2 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (2022.3.0)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.2.5)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (0.4.3)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.2.0)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (0.12.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (2.2.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (5.4.1)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.2.0)
Requirement already satisfied: fsspec>=0.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (2022.5.0)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (6.1)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (3.1.2)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (2.0.0)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.0.4)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.7.0)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (5.8.0)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (2.4.0)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (8.1.3)
Collecting llvmlite<0.39,>=0.38.0rc1
  Downloading llvmlite-0.38.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (34.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 34.5/34.5 MB 52.7 MB/s eta 0:00:00
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (65.6.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (3.0.9)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (2022.2.1)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (2.8.2)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.2.0)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.52.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (1.0.1)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (6.0.2)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (4.1.0)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2021.11.2->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (2.1.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (4.0.0)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core>=0.2.0->merlin-systems==0.7.0+39.gdf53c72) (6.0.1)
Building wheels for collected packages: merlin-systems
  Building wheel for merlin-systems (pyproject.toml): started
  Building wheel for merlin-systems (pyproject.toml): finished with status 'done'
  Created wheel for merlin-systems: filename=merlin_systems-0.7.0+39.gdf53c72-py3-none-any.whl size=100698 sha256=5e947a79d2a8136baab785a002fac548a16d86fb78423c6a6c7cddda3e637a9f
  Stored in directory: /tmp/pip-ephem-wheel-cache-w53kt8_s/wheels/d3/db/b8/99d510a979c278774eda4142f1c0643c93b7b2674aff321c16
Successfully built merlin-systems
Installing collected packages: llvmlite, treelite-runtime, treelite, merlin-systems
  Attempting uninstall: llvmlite
    Found existing installation: llvmlite 0.39.1
    Not uninstalling llvmlite at /usr/local/lib/python3.8/dist-packages, outside environment /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu
    Can't uninstall 'llvmlite'. No files were found to uninstall.
  Attempting uninstall: treelite-runtime
    Found existing installation: treelite-runtime 2.3.0
    Not uninstalling treelite-runtime at /usr/local/lib/python3.8/dist-packages, outside environment /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu
    Can't uninstall 'treelite-runtime'. No files were found to uninstall.
  Attempting uninstall: treelite
    Found existing installation: treelite 2.3.0
    Not uninstalling treelite at /usr/local/lib/python3.8/dist-packages, outside environment /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu
    Can't uninstall 'treelite'. No files were found to uninstall.
  Attempting uninstall: merlin-systems
    Found existing installation: merlin-systems 0.5.0+4.g15074ad
    Not uninstalling merlin-systems at /usr/local/lib/python3.8/dist-packages, outside environment /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu
    Can't uninstall 'merlin-systems'. No files were found to uninstall.
Successfully installed llvmlite-0.38.1 merlin-systems-0.7.0+39.gdf53c72 treelite-2.4.0 treelite-runtime-2.4.0
test-gpu run-test: commands[1] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/models.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/models.git
  Cloning https://github.com/NVIDIA-Merlin/models.git to /tmp/pip-req-build-eecg4xwb
  Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/models.git /tmp/pip-req-build-eecg4xwb
  Resolved https://github.com/NVIDIA-Merlin/models.git to commit 130c103be37a1817dd044110a5629baf28c841f1
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Collecting merlin-dataloader>=0.0.2
  Downloading merlin-dataloader-0.0.3.tar.gz (48 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 48.3/48.3 kB 1.5 MB/s eta 0:00:00
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: merlin-core>=0.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-models==0.9.0+72.g130c103b) (0.3.0+12.g78ecddd)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (1.10.0)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (21.3)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (3.19.5)
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (0.55.1)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (7.0.0)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (1.3.5)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (4.64.1)
Requirement already satisfied: dask>=2021.11.2 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (2022.3.0)
Requirement already satisfied: distributed>=2021.11.2 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (2022.3.0)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (1.2.5)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (0.4.3)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (1.2.0)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (0.12.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (2.2.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (5.4.1)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (1.2.0)
Requirement already satisfied: fsspec>=0.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from dask>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (2022.5.0)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (6.1)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (3.1.2)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (2.0.0)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (1.0.4)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (1.7.0)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (5.8.0)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (2.4.0)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (8.1.3)
Requirement already satisfied: llvmlite<0.39,>=0.38.0rc1 in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (0.38.1)
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (65.6.3)
Requirement already satisfied: numpy<1.22,>=1.18 in /var/jenkins_home/.local/lib/python3.8/site-packages (from numba>=0.54->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (1.20.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (3.0.9)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (2022.2.1)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (2.8.2)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (1.2.0)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (1.52.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (1.0.1)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (6.0.2)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (4.1.0)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2021.11.2->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (2.1.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (4.0.0)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core>=0.2.0->merlin-models==0.9.0+72.g130c103b) (6.0.1)
Building wheels for collected packages: merlin-models, merlin-dataloader
  Building wheel for merlin-models (pyproject.toml): started
  Building wheel for merlin-models (pyproject.toml): finished with status 'done'
  Created wheel for merlin-models: filename=merlin_models-0.9.0+72.g130c103b-py3-none-any.whl size=357462 sha256=2fd8831fa625ae27d86cd99f9edd806244787e70e4825a6523ee451dd1b8b9b1
  Stored in directory: /tmp/pip-ephem-wheel-cache-4b19hjju/wheels/5a/43/99/d50fe2c33b4f4686db73207ce3865e0d6be6609ffb03abade5
  Building wheel for merlin-dataloader (pyproject.toml): started
  Building wheel for merlin-dataloader (pyproject.toml): finished with status 'done'
  Created wheel for merlin-dataloader: filename=merlin_dataloader-0.0.3-py3-none-any.whl size=37647 sha256=dc3ea3c88c4fab3135b9e028a71c1b0cc6cbd1354c215a711232e72670003f3b
  Stored in directory: /tmp/pip-ephem-wheel-cache-4b19hjju/wheels/1c/a3/4a/0feebb30e0c8cb7ba7046544390b43c7017a2195232f5305a1
Successfully built merlin-models merlin-dataloader
Installing collected packages: merlin-dataloader, merlin-models
  Attempting uninstall: merlin-models
    Found existing installation: merlin-models 0.7.0+11.g280956aa4
    Not uninstalling merlin-models at /usr/local/lib/python3.8/dist-packages, outside environment /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu
    Can't uninstall 'merlin-models'. No files were found to uninstall.
Successfully installed merlin-dataloader-0.0.3 merlin-models-0.9.0+72.g130c103b
test-gpu run-test: commands[2] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/NVTabular.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/NVTabular.git
  Cloning https://github.com/NVIDIA-Merlin/NVTabular.git to /tmp/pip-req-build-ba1ph5fc
  Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/NVTabular.git /tmp/pip-req-build-ba1ph5fc
  Resolved https://github.com/NVIDIA-Merlin/NVTabular.git to commit 985510ef0f529aa54a8ac29414d3ed71542c4c62
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: merlin-dataloader>=0.0.2 in ./.tox/test-gpu/lib/python3.8/site-packages (from nvtabular==1.6.0+22.g985510ef) (0.0.3)
Requirement already satisfied: scipy in /usr/local/lib/python3.8/dist-packages (from nvtabular==1.6.0+22.g985510ef) (1.8.1)
Requirement already satisfied: merlin-core>=0.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from nvtabular==1.6.0+22.g985510ef) (0.3.0+12.g78ecddd)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.10.0)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (21.3)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (3.19.5)
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (0.55.1)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (7.0.0)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.3.5)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (4.64.1)
Requirement already satisfied: dask>=2021.11.2 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (2022.3.0)
Requirement already satisfied: distributed>=2021.11.2 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (2022.3.0)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.2.5)
Requirement already satisfied: numpy<1.25.0,>=1.17.3 in /var/jenkins_home/.local/lib/python3.8/site-packages (from scipy->nvtabular==1.6.0+22.g985510ef) (1.20.3)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (0.4.3)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.2.0)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (0.12.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (2.2.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (5.4.1)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.2.0)
Requirement already satisfied: fsspec>=0.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from dask>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (2022.5.0)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (6.1)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (3.1.2)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (2.0.0)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.0.4)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.7.0)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (5.8.0)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (2.4.0)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (8.1.3)
Requirement already satisfied: llvmlite<0.39,>=0.38.0rc1 in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (0.38.1)
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (65.6.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (3.0.9)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (2022.2.1)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (2.8.2)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.2.0)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.52.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (1.0.1)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (6.0.2)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (4.1.0)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2021.11.2->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (2.1.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (4.0.0)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core>=0.2.0->nvtabular==1.6.0+22.g985510ef) (6.0.1)
Building wheels for collected packages: nvtabular
  Building wheel for nvtabular (pyproject.toml): started
  Building wheel for nvtabular (pyproject.toml): finished with status 'done'
  Created wheel for nvtabular: filename=nvtabular-1.6.0+22.g985510ef-cp38-cp38-linux_x86_64.whl size=257120 sha256=8661d8dc46fd3cc197ca57beb69f293f96f352a301dba2ac6ad02c3d903c3c25
  Stored in directory: /tmp/pip-ephem-wheel-cache-dzq8m4rv/wheels/c2/16/76/39994bff39d812513de5b5572bff0903b9eb8f6c645b44cedc
Successfully built nvtabular
Installing collected packages: nvtabular
  Attempting uninstall: nvtabular
    Found existing installation: nvtabular 1.1.1
    Not uninstalling nvtabular at /var/jenkins_home/.local/lib/python3.8/site-packages, outside environment /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu
    Can't uninstall 'nvtabular'. No files were found to uninstall.
Successfully installed nvtabular-1.6.0+22.g985510ef
test-gpu run-test: commands[3] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/core.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/core.git
  Cloning https://github.com/NVIDIA-Merlin/core.git to /tmp/pip-req-build-v8e575ds
  Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/core.git /tmp/pip-req-build-v8e575ds
  Resolved https://github.com/NVIDIA-Merlin/core.git to commit 73c2afc277967015e2f334b8c57a9e8789fa19bd
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: fsspec==2022.5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.9.0+19.g73c2afc) (2022.5.0)
Requirement already satisfied: distributed>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.9.0+19.g73c2afc) (2022.3.0)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.9.0+19.g73c2afc) (1.10.0)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.9.0+19.g73c2afc) (21.3)
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.9.0+19.g73c2afc) (0.55.1)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.9.0+19.g73c2afc) (7.0.0)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.9.0+19.g73c2afc) (1.3.5)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.9.0+19.g73c2afc) (4.64.1)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.9.0+19.g73c2afc) (3.19.5)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core==0.9.0+19.g73c2afc) (1.2.5)
Requirement already satisfied: dask>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core==0.9.0+19.g73c2afc) (2022.3.0)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core==0.9.0+19.g73c2afc) (0.4.3)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core==0.9.0+19.g73c2afc) (1.2.0)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (0.12.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (2.2.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (5.4.1)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (1.2.0)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (6.1)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (3.1.2)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (2.0.0)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (1.0.4)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (1.7.0)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (5.8.0)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (2.4.0)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (8.1.3)
Requirement already satisfied: llvmlite<0.39,>=0.38.0rc1 in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.9.0+19.g73c2afc) (0.38.1)
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.9.0+19.g73c2afc) (65.6.3)
Requirement already satisfied: numpy<1.22,>=1.18 in /var/jenkins_home/.local/lib/python3.8/site-packages (from numba>=0.54->merlin-core==0.9.0+19.g73c2afc) (1.20.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core==0.9.0+19.g73c2afc) (3.0.9)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core==0.9.0+19.g73c2afc) (2022.2.1)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core==0.9.0+19.g73c2afc) (2.8.2)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core==0.9.0+19.g73c2afc) (1.2.0)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core==0.9.0+19.g73c2afc) (1.52.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core==0.9.0+19.g73c2afc) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (1.0.1)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core==0.9.0+19.g73c2afc) (6.0.2)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core==0.9.0+19.g73c2afc) (4.1.0)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2022.3.0->merlin-core==0.9.0+19.g73c2afc) (2.1.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core==0.9.0+19.g73c2afc) (4.0.0)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core==0.9.0+19.g73c2afc) (6.0.1)
Building wheels for collected packages: merlin-core
  Building wheel for merlin-core (pyproject.toml): started
  Building wheel for merlin-core (pyproject.toml): finished with status 'done'
  Created wheel for merlin-core: filename=merlin_core-0.9.0+19.g73c2afc-py3-none-any.whl size=119766 sha256=4ca44c94059442133b58d6795bc377f86058f4818ece853b8c7714c3e8f5d4ec
  Stored in directory: /tmp/pip-ephem-wheel-cache-v04kljj4/wheels/c8/38/16/a6968787eafcec5fa772148af8408b089562f71af0752e8e84
Successfully built merlin-core
Installing collected packages: merlin-core
  Attempting uninstall: merlin-core
    Found existing installation: merlin-core 0.3.0+12.g78ecddd
    Not uninstalling merlin-core at /var/jenkins_home/.local/lib/python3.8/site-packages, outside environment /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu
    Can't uninstall 'merlin-core'. No files were found to uninstall.
Successfully installed merlin-core-0.9.0+19.g73c2afc
test-gpu run-test: commands[4] | python -m pip install --upgrade git+https://github.com/NVIDIA-Merlin/dataloader.git
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Collecting git+https://github.com/NVIDIA-Merlin/dataloader.git
  Cloning https://github.com/NVIDIA-Merlin/dataloader.git to /tmp/pip-req-build-j22scepy
  Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA-Merlin/dataloader.git /tmp/pip-req-build-j22scepy
  Resolved https://github.com/NVIDIA-Merlin/dataloader.git to commit d4e6c1bd9eaaaaca87b336a5da835cb5e0bc5df3
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: merlin-core>=0.8.0 in ./.tox/test-gpu/lib/python3.8/site-packages (from merlin-dataloader==0.0.2+28.gd4e6c1b) (0.9.0+19.g73c2afc)
Requirement already satisfied: fsspec==2022.5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (2022.5.0)
Requirement already satisfied: distributed>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (2022.3.0)
Requirement already satisfied: tensorflow-metadata>=1.2.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.10.0)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (21.3)
Requirement already satisfied: numba>=0.54 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (0.55.1)
Requirement already satisfied: pyarrow>=5.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (7.0.0)
Requirement already satisfied: pandas<1.4.0dev0,>=1.2.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.3.5)
Requirement already satisfied: tqdm>=4.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (4.64.1)
Requirement already satisfied: protobuf>=3.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (3.19.5)
Requirement already satisfied: betterproto<2.0.0 in /usr/local/lib/python3.8/dist-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.2.5)
Requirement already satisfied: dask>=2022.3.0 in /var/jenkins_home/.local/lib/python3.8/site-packages (from merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (2022.3.0)
Requirement already satisfied: grpclib in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (0.4.3)
Requirement already satisfied: stringcase in /usr/local/lib/python3.8/dist-packages (from betterproto<2.0.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.2.0)
Requirement already satisfied: toolz>=0.8.2 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (0.12.0)
Requirement already satisfied: cloudpickle>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from dask>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (2.2.0)
Requirement already satisfied: pyyaml>=5.3.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg (from dask>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (5.4.1)
Requirement already satisfied: partd>=0.3.10 in /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg (from dask>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.2.0)
Requirement already satisfied: tornado>=6.0.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (6.1)
Requirement already satisfied: jinja2 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (3.1.2)
Requirement already satisfied: zict>=0.1.3 in /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg (from distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (2.0.0)
Requirement already satisfied: msgpack>=0.6.0 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.0.4)
Requirement already satisfied: tblib>=1.6.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg (from distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.7.0)
Requirement already satisfied: psutil>=5.0 in /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg (from distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (5.8.0)
Requirement already satisfied: sortedcontainers!=2.0.0,!=2.0.1 in /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg (from distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (2.4.0)
Requirement already satisfied: click>=6.6 in /usr/local/lib/python3.8/dist-packages (from distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (8.1.3)
Requirement already satisfied: llvmlite<0.39,>=0.38.0rc1 in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (0.38.1)
Requirement already satisfied: setuptools in ./.tox/test-gpu/lib/python3.8/site-packages (from numba>=0.54->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (65.6.3)
Requirement already satisfied: numpy<1.22,>=1.18 in /var/jenkins_home/.local/lib/python3.8/site-packages (from numba>=0.54->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.20.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.8/dist-packages (from packaging->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (3.0.9)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (2022.2.1)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (2.8.2)
Requirement already satisfied: absl-py<2.0.0,>=0.9 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.2.0)
Requirement already satisfied: googleapis-common-protos<2,>=1.52.0 in /usr/local/lib/python3.8/dist-packages (from tensorflow-metadata>=1.2.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.52.0)
Requirement already satisfied: locket in /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg (from partd>=0.3.10->dask>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (0.2.1)
Requirement already satisfied: six>=1.5 in /var/jenkins_home/.local/lib/python3.8/site-packages (from python-dateutil>=2.7.3->pandas<1.4.0dev0,>=1.2.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.15.0)
Requirement already satisfied: heapdict in /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg (from zict>=0.1.3->distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (1.0.1)
Requirement already satisfied: multidict in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (6.0.2)
Requirement already satisfied: h2<5,>=3.1.0 in /usr/local/lib/python3.8/dist-packages (from grpclib->betterproto<2.0.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (4.1.0)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.8/dist-packages (from jinja2->distributed>=2022.3.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (2.1.1)
Requirement already satisfied: hpack<5,>=4.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (4.0.0)
Requirement already satisfied: hyperframe<7,>=6.0 in /usr/local/lib/python3.8/dist-packages (from h2<5,>=3.1.0->grpclib->betterproto<2.0.0->merlin-core>=0.8.0->merlin-dataloader==0.0.2+28.gd4e6c1b) (6.0.1)
Building wheels for collected packages: merlin-dataloader
  Building wheel for merlin-dataloader (pyproject.toml): started
  Building wheel for merlin-dataloader (pyproject.toml): finished with status 'done'
  Created wheel for merlin-dataloader: filename=merlin_dataloader-0.0.2+28.gd4e6c1b-py3-none-any.whl size=41443 sha256=bfa3652cf9d6c539854ed1940f88c6f2dc97a1bf8d57185e91c4d302244bb348
  Stored in directory: /tmp/pip-ephem-wheel-cache-zq328tqs/wheels/de/f5/d9/251909f4627d2920fb15548f5ffd6daf1bf24c3c56bb4977b1
Successfully built merlin-dataloader
Installing collected packages: merlin-dataloader
  Attempting uninstall: merlin-dataloader
    Found existing installation: merlin-dataloader 0.0.3
    Uninstalling merlin-dataloader-0.0.3:
      Successfully uninstalled merlin-dataloader-0.0.3
Successfully installed merlin-dataloader-0.0.2+28.gd4e6c1b
test-gpu run-test: commands[5] | python -m pytest --cov-report term --cov merlin -rxs tests/unit
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-7.1.3, pluggy-1.0.0
cachedir: .tox/test-gpu/.pytest_cache
rootdir: /var/jenkins_home/workspace/merlin_merlin/merlin
plugins: anyio-3.5.0, cov-4.0.0, xdist-3.1.0
collected 6 items / 1 skipped

tests/unit/test_version.py . [ 16%]
tests/unit/examples/test_building_deploying_multi_stage_RecSys.py F [ 33%]
tests/unit/examples/test_scaling_criteo_merlin_models.py F [ 50%]
tests/unit/examples/test_scaling_criteo_optimize_notebook.py . [ 66%]
tests/unit/examples/test_z_legacy_notebooks.py .. [100%]

=================================== FAILURES ===================================
__________________________________ test_func ___________________________________

self = <testbook.client.TestbookNotebookClient object at 0x7f3ce0197520>
cell = [55], kwargs = {}, cell_indexes = [55], executed_cells = [], idx = 55

def execute_cell(self, cell, **kwargs) -> Union[Dict, List[Dict]]:
    """
    Executes a cell or list of cells
    """
    if isinstance(cell, slice):
        start, stop = self._cell_index(cell.start), self._cell_index(cell.stop)
        if cell.step is not None:
            raise TestbookError('testbook does not support step argument')

        cell = range(start, stop + 1)
    elif isinstance(cell, str) or isinstance(cell, int):
        cell = [cell]

    cell_indexes = cell

    if all(isinstance(x, str) for x in cell):
        cell_indexes = [self._cell_index(tag) for tag in cell]

    executed_cells = []
    for idx in cell_indexes:
        try:
          cell = super().execute_cell(self.nb['cells'][idx], idx, **kwargs)

../../../.local/lib/python3.8/site-packages/testbook/client.py:133:


args = (<testbook.client.TestbookNotebookClient object at 0x7f3ce0197520>, {'id': 'f81a37f4', 'cell_type': 'code', 'metadata'...erenceServerException\x1b[0m: [StatusCode.NOT_FOUND] Request for unknown model: 'ensemble_model' is not found"]}]}, 55)
kwargs = {}

def wrapped(*args, **kwargs):
  return just_run(coro(*args, **kwargs))

../../../.local/lib/python3.8/site-packages/nbclient/util.py:84:


coro = <coroutine object NotebookClient.async_execute_cell at 0x7f3b6f24c2c0>

def just_run(coro: Awaitable) -> Any:
    """Make the coroutine run, even if there is an event loop running (using nest_asyncio)"""
    # original from vaex/asyncio.py
    loop = asyncio._get_running_loop()
    if loop is None:
        had_running_loop = False
        try:
            loop = asyncio.get_event_loop()
        except RuntimeError:
            # we can still get 'There is no current event loop in ...'
            loop = asyncio.new_event_loop()
            asyncio.set_event_loop(loop)
    else:
        had_running_loop = True
    if had_running_loop:
        # if there is a running loop, we patch using nest_asyncio
        # to have reentrant event loops
        check_ipython()
        import nest_asyncio

        nest_asyncio.apply()
        check_patch_tornado()
  return loop.run_until_complete(coro)

../../../.local/lib/python3.8/site-packages/nbclient/util.py:62:


self = <_UnixSelectorEventLoop running=False closed=False debug=False>
future = <Task finished name='Task-377' coro=<NotebookClient.async_execute_cell() done, defined at /var/jenkins_home/.local/lib...found\nInferenceServerException: [StatusCode.NOT_FOUND] Request for unknown model: 'ensemble_model' is not found\n')>

def run_until_complete(self, future):
    """Run until the Future is done.

    If the argument is a coroutine, it is wrapped in a Task.

    WARNING: It would be disastrous to call run_until_complete()
    with the same coroutine twice -- it would wrap it in two
    different Tasks and that can't be good.

    Return the Future's result, or raise its exception.
    """
    self._check_closed()
    self._check_running()

    new_task = not futures.isfuture(future)
    future = tasks.ensure_future(future, loop=self)
    if new_task:
        # An exception is raised if the future didn't complete, so there
        # is no need to log the "destroy pending task" message
        future._log_destroy_pending = False

    future.add_done_callback(_run_until_complete_cb)
    try:
        self.run_forever()
    except:
        if new_task and future.done() and not future.cancelled():
            # The coroutine raised a BaseException. Consume the exception
            # to not log a warning, the caller doesn't have access to the
            # local task.
            future.exception()
        raise
    finally:
        future.remove_done_callback(_run_until_complete_cb)
    if not future.done():
        raise RuntimeError('Event loop stopped before Future completed.')
  return future.result()

/usr/lib/python3.8/asyncio/base_events.py:616:


self = <testbook.client.TestbookNotebookClient object at 0x7f3ce0197520>
cell = {'id': 'f81a37f4', 'cell_type': 'code', 'metadata': {'execution': {'iopub.status.busy': '2022-12-20T19:05:54.631364Z',...1mInferenceServerException\x1b[0m: [StatusCode.NOT_FOUND] Request for unknown model: 'ensemble_model' is not found"]}]}
cell_index = 55, execution_count = None, store_history = True

async def async_execute_cell(
    self,
    cell: NotebookNode,
    cell_index: int,
    execution_count: t.Optional[int] = None,
    store_history: bool = True,
) -> NotebookNode:
    """
    Executes a single code cell.

    To execute all cells see :meth:`execute`.

    Parameters
    ----------
    cell : nbformat.NotebookNode
        The cell which is currently being processed.
    cell_index : int
        The position of the cell within the notebook object.
    execution_count : int
        The execution count to be assigned to the cell (default: Use kernel response)
    store_history : bool
        Determines if history should be stored in the kernel (default: False).
        Specific to ipython kernels, which can store command histories.

    Returns
    -------
    output : dict
        The execution output payload (or None for no output).

    Raises
    ------
    CellExecutionError
        If execution failed and should raise an exception, this will be raised
        with defaults about the failure.

    Returns
    -------
    cell : NotebookNode
        The cell which was just processed.
    """
    assert self.kc is not None

    await run_hook(self.on_cell_start, cell=cell, cell_index=cell_index)

    if cell.cell_type != 'code' or not cell.source.strip():
        self.log.debug("Skipping non-executing cell %s", cell_index)
        return cell

    if self.skip_cells_with_tag in cell.metadata.get("tags", []):
        self.log.debug("Skipping tagged cell %s", cell_index)
        return cell

    if self.record_timing:  # clear execution metadata prior to execution
        cell['metadata']['execution'] = {}

    self.log.debug("Executing cell:\n%s", cell.source)

    cell_allows_errors = (not self.force_raise_errors) and (
        self.allow_errors or "raises-exception" in cell.metadata.get("tags", [])
    )

    await run_hook(self.on_cell_execute, cell=cell, cell_index=cell_index)
    parent_msg_id = await ensure_async(
        self.kc.execute(
            cell.source, store_history=store_history, stop_on_error=not cell_allows_errors
        )
    )
    await run_hook(self.on_cell_complete, cell=cell, cell_index=cell_index)
    # We launched a code cell to execute
    self.code_cells_executed += 1
    exec_timeout = self._get_timeout(cell)

    cell.outputs = []
    self.clear_before_next_output = False

    task_poll_kernel_alive = asyncio.ensure_future(self._async_poll_kernel_alive())
    task_poll_output_msg = asyncio.ensure_future(
        self._async_poll_output_msg(parent_msg_id, cell, cell_index)
    )
    self.task_poll_for_reply = asyncio.ensure_future(
        self._async_poll_for_reply(
            parent_msg_id, cell, exec_timeout, task_poll_output_msg, task_poll_kernel_alive
        )
    )
    try:
        exec_reply = await self.task_poll_for_reply
    except asyncio.CancelledError:
        # can only be cancelled by task_poll_kernel_alive when the kernel is dead
        task_poll_output_msg.cancel()
        raise DeadKernelError("Kernel died")
    except Exception as e:
        # Best effort to cancel request if it hasn't been resolved
        try:
            # Check if the task_poll_output is doing the raising for us
            if not isinstance(e, CellControlSignal):
                task_poll_output_msg.cancel()
        finally:
            raise

    if execution_count:
        cell['execution_count'] = execution_count
  await self._check_raise_for_error(cell, cell_index, exec_reply)

../../../.local/lib/python3.8/site-packages/nbclient/client.py:965:


self = <testbook.client.TestbookNotebookClient object at 0x7f3ce0197520>
cell = {'id': 'f81a37f4', 'cell_type': 'code', 'metadata': {'execution': {'iopub.status.busy': '2022-12-20T19:05:54.631364Z',...1mInferenceServerException\x1b[0m: [StatusCode.NOT_FOUND] Request for unknown model: 'ensemble_model' is not found"]}]}
cell_index = 55
exec_reply = {'buffers': [], 'content': {'ename': 'InferenceServerException', 'engine_info': {'engine_id': -1, 'engine_uuid': '947a...e, 'engine': '947aec7c-e687-43ed-8c4a-7539d5fb0c4f', 'started': '2022-12-20T19:05:54.631728Z', 'status': 'error'}, ...}

async def _check_raise_for_error(
    self, cell: NotebookNode, cell_index: int, exec_reply: t.Optional[t.Dict]
) -> None:

    if exec_reply is None:
        return None

    exec_reply_content = exec_reply['content']
    if exec_reply_content['status'] != 'error':
        return None

    cell_allows_errors = (not self.force_raise_errors) and (
        self.allow_errors
        or exec_reply_content.get('ename') in self.allow_error_names
        or "raises-exception" in cell.metadata.get("tags", [])
    )
    await run_hook(self.on_cell_error, cell=cell, cell_index=cell_index)
    if not cell_allows_errors:
      raise CellExecutionError.from_cell_and_msg(cell, exec_reply_content)

E nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
E ------------------
E
E import shutil
E from merlin.core.dispatch import get_lib
E from merlin.dataloader.tf_utils import configure_tensorflow
E configure_tensorflow()
E df_lib = get_lib()
E batch = df_lib.read_parquet(
E     os.path.join("/tmp/data/processed_nvt/", "train", "part_0.parquet"),
E     num_rows=1,
E     columns=["user_id_raw"],
E )
E from merlin.systems.triton.utils import run_ensemble_on_tritonserver
E response = run_ensemble_on_tritonserver(
E     "/tmp/examples/poc_ensemble", ensemble.graph.input_schema, batch, outputs, "ensemble_model"
E )
E response = [x.tolist()[0] for x in response["ordered_ids"]]
E shutil.rmtree("/tmp/examples/", ignore_errors=True)
E
E ------------------
E
E ---------------------------------------------------------------------------
E InferenceServerException                  Traceback (most recent call last)
E Cell In [32], line 12
E       6 batch = df_lib.read_parquet(
E       7     os.path.join("/tmp/data/processed_nvt/", "train", "part_0.parquet"),
E       8     num_rows=1,
E       9     columns=["user_id_raw"],
E      10 )
E      11 from merlin.systems.triton.utils import run_ensemble_on_tritonserver
E ---> 12 response = run_ensemble_on_tritonserver(
E      13     "/tmp/examples/poc_ensemble", ensemble.graph.input_schema, batch, outputs, "ensemble_model"
E      14 )
E      15 response = [x.tolist()[0] for x in response["ordered_ids"]]
E      16 shutil.rmtree("/tmp/examples/", ignore_errors=True)
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/utils.py:140, in run_ensemble_on_tritonserver(tmpdir, schema, df, output_columns, model_name)
E     138 response = None
E     139 with run_triton_server(tmpdir) as client:
E --> 140     response = send_triton_request(
E     141         schema, df, output_columns, client=client, triton_model=model_name
E     142     )
E     144 return response
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/utils.py:193, in send_triton_request(schema, df, outputs_list, client, endpoint, request_id, triton_model)
E     191 outputs = [grpcclient.InferRequestedOutput(col) for col in outputs_list]
E     192 with client:
E --> 193     response = client.infer(triton_model, inputs, request_id=request_id, outputs=outputs)
E     195 results = {}
E     196 for col in outputs_list:
E
E File /usr/local/lib/python3.8/dist-packages/tritonclient/grpc/__init__.py:1322, in InferenceServerClient.infer(self, model_name, inputs, model_version, outputs, request_id, sequence_id, sequence_start, sequence_end, priority, timeout, client_timeout, headers, compression_algorithm)
E    1320     return result
E    1321 except grpc.RpcError as rpc_error:
E -> 1322     raise_error_grpc(rpc_error)
E
E File /usr/local/lib/python3.8/dist-packages/tritonclient/grpc/__init__.py:62, in raise_error_grpc(rpc_error)
E      61 def raise_error_grpc(rpc_error):
E ---> 62     raise get_error_grpc(rpc_error) from None
E
E InferenceServerException: [StatusCode.NOT_FOUND] Request for unknown model: 'ensemble_model' is not found
E InferenceServerException: [StatusCode.NOT_FOUND] Request for unknown model: 'ensemble_model' is not found

../../../.local/lib/python3.8/site-packages/nbclient/client.py:862: CellExecutionError

During handling of the above exception, another exception occurred:

def test_func():
    with testbook(
        REPO_ROOT
        / "examples"
        / "Building-and-deploying-multi-stage-RecSys"
        / "01-Building-Recommender-Systems-with-Merlin.ipynb",
        execute=False,
    ) as tb1:
        tb1.inject(
            """
            import os
            os.environ["DATA_FOLDER"] = "/tmp/data/"
            os.environ["NUM_ROWS"] = "100000"
            os.system("mkdir -p /tmp/examples")
            os.environ["BASE_DIR"] = "/tmp/examples/"
            """
        )
        tb1.execute()
        assert os.path.isdir("/tmp/examples/dlrm")
        assert os.path.isdir("/tmp/examples/feature_repo")
        assert os.path.isdir("/tmp/examples/query_tower")
        assert os.path.isfile("/tmp/examples/item_embeddings.parquet")
        assert os.path.isfile("/tmp/examples/feature_repo/user_features.py")
        assert os.path.isfile("/tmp/examples/feature_repo/item_features.py")

    with testbook(
        REPO_ROOT
        / "examples"
        / "Building-and-deploying-multi-stage-RecSys"
        / "02-Deploying-multi-stage-RecSys-with-Merlin-Systems.ipynb",
        execute=False,
        timeout=180,
    ) as tb2:
        tb2.inject(
            """
            import os
            os.environ["DATA_FOLDER"] = "/tmp/data/"
            os.environ["BASE_DIR"] = "/tmp/examples/"
            os.environ["topk_retrieval"] = "20"
            """
        )
        NUM_OF_CELLS = len(tb2.cells)
        tb2.execute_cell(list(range(0, NUM_OF_CELLS - 3)))
        top_k = tb2.ref("top_k")
        outputs = tb2.ref("outputs")
        assert outputs[0] == "ordered_ids"
      tb2.inject(
            """
            import shutil
            from merlin.core.dispatch import get_lib
            from merlin.dataloader.tf_utils import configure_tensorflow
            configure_tensorflow()
            df_lib = get_lib()
            batch = df_lib.read_parquet(
                os.path.join("/tmp/data/processed_nvt/", "train", "part_0.parquet"),
                num_rows=1,
                columns=["user_id_raw"],
            )
            from merlin.systems.triton.utils import run_ensemble_on_tritonserver
            response = run_ensemble_on_tritonserver(
                "/tmp/examples/poc_ensemble", ensemble.graph.input_schema, batch, outputs,  "ensemble_model"
            )
            response = [x.tolist()[0] for x in response["ordered_ids"]]
            shutil.rmtree("/tmp/examples/", ignore_errors=True)
            """
        )

tests/unit/examples/test_building_deploying_multi_stage_RecSys.py:61:


../../../.local/lib/python3.8/site-packages/testbook/client.py:237: in inject
cell = TestbookNode(self.execute_cell(inject_idx)) if run else TestbookNode(code_cell)


self = <testbook.client.TestbookNotebookClient object at 0x7f3ce0197520>
cell = [55], kwargs = {}, cell_indexes = [55], executed_cells = [], idx = 55

def execute_cell(self, cell, **kwargs) -> Union[Dict, List[Dict]]:
    """
    Executes a cell or list of cells
    """
    if isinstance(cell, slice):
        start, stop = self._cell_index(cell.start), self._cell_index(cell.stop)
        if cell.step is not None:
            raise TestbookError('testbook does not support step argument')

        cell = range(start, stop + 1)
    elif isinstance(cell, str) or isinstance(cell, int):
        cell = [cell]

    cell_indexes = cell

    if all(isinstance(x, str) for x in cell):
        cell_indexes = [self._cell_index(tag) for tag in cell]

    executed_cells = []
    for idx in cell_indexes:
        try:
            cell = super().execute_cell(self.nb['cells'][idx], idx, **kwargs)
        except CellExecutionError as ce:
          raise TestbookRuntimeError(ce.evalue, ce, self._get_error_class(ce.ename))

E testbook.exceptions.TestbookRuntimeError: An error occurred while executing the following cell:
E ------------------
E
E import shutil
E from merlin.core.dispatch import get_lib
E from merlin.dataloader.tf_utils import configure_tensorflow
E configure_tensorflow()
E df_lib = get_lib()
E batch = df_lib.read_parquet(
E     os.path.join("/tmp/data/processed_nvt/", "train", "part_0.parquet"),
E     num_rows=1,
E     columns=["user_id_raw"],
E )
E from merlin.systems.triton.utils import run_ensemble_on_tritonserver
E response = run_ensemble_on_tritonserver(
E     "/tmp/examples/poc_ensemble", ensemble.graph.input_schema, batch, outputs, "ensemble_model"
E )
E response = [x.tolist()[0] for x in response["ordered_ids"]]
E shutil.rmtree("/tmp/examples/", ignore_errors=True)
E
E ------------------
E
E ---------------------------------------------------------------------------
E InferenceServerException                  Traceback (most recent call last)
E Cell In [32], line 12
E       6 batch = df_lib.read_parquet(
E       7     os.path.join("/tmp/data/processed_nvt/", "train", "part_0.parquet"),
E       8     num_rows=1,
E       9     columns=["user_id_raw"],
E      10 )
E      11 from merlin.systems.triton.utils import run_ensemble_on_tritonserver
E ---> 12 response = run_ensemble_on_tritonserver(
E      13     "/tmp/examples/poc_ensemble", ensemble.graph.input_schema, batch, outputs, "ensemble_model"
E      14 )
E      15 response = [x.tolist()[0] for x in response["ordered_ids"]]
E      16 shutil.rmtree("/tmp/examples/", ignore_errors=True)
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/utils.py:140, in run_ensemble_on_tritonserver(tmpdir, schema, df, output_columns, model_name)
E     138 response = None
E     139 with run_triton_server(tmpdir) as client:
E --> 140     response = send_triton_request(
E     141         schema, df, output_columns, client=client, triton_model=model_name
E     142     )
E     144 return response
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/utils.py:193, in send_triton_request(schema, df, outputs_list, client, endpoint, request_id, triton_model)
E     191 outputs = [grpcclient.InferRequestedOutput(col) for col in outputs_list]
E     192 with client:
E --> 193     response = client.infer(triton_model, inputs, request_id=request_id, outputs=outputs)
E     195 results = {}
E     196 for col in outputs_list:
E
E File /usr/local/lib/python3.8/dist-packages/tritonclient/grpc/__init__.py:1322, in InferenceServerClient.infer(self, model_name, inputs, model_version, outputs, request_id, sequence_id, sequence_start, sequence_end, priority, timeout, client_timeout, headers, compression_algorithm)
E    1320     return result
E    1321 except grpc.RpcError as rpc_error:
E -> 1322     raise_error_grpc(rpc_error)
E
E File /usr/local/lib/python3.8/dist-packages/tritonclient/grpc/__init__.py:62, in raise_error_grpc(rpc_error)
E      61 def raise_error_grpc(rpc_error):
E ---> 62     raise get_error_grpc(rpc_error) from None
E
E InferenceServerException: [StatusCode.NOT_FOUND] Request for unknown model: 'ensemble_model' is not found
E InferenceServerException: [StatusCode.NOT_FOUND] Request for unknown model: 'ensemble_model' is not found

../../../.local/lib/python3.8/site-packages/testbook/client.py:135: TestbookRuntimeError
----------------------------- Captured stdout call -----------------------------
Signal (2) received.
----------------------------- Captured stderr call -----------------------------
2022-12-20 19:04:02.731749: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-12-20 19:04:06.760514: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 0
2022-12-20 19:04:06.760615: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 8139 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 19:04:06.761390: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 1
2022-12-20 19:04:06.761447: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:1 with 15149 MB memory: -> device: 1, name: Tesla P100-DGXS-16GB, pci bus id: 0000:08:00.0, compute capability: 6.0
2022-12-20 19:04:06.762011: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 2
2022-12-20 19:04:06.762058: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:2 with 15149 MB memory: -> device: 2, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0e:00.0, compute capability: 6.0
2022-12-20 19:04:06.762634: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 3
2022-12-20 19:04:06.762680: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:3 with 15149 MB memory: -> device: 3, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0f:00.0, compute capability: 6.0
Error in atexit._run_exitfuncs:
Traceback (most recent call last):
  File "/usr/lib/python3.8/logging/__init__.py", line 2127, in shutdown
    h.close()
  File "/usr/local/lib/python3.8/dist-packages/absl/logging/__init__.py", line 934, in close
    self.stream.close()
  File "/usr/local/lib/python3.8/dist-packages/ipykernel/iostream.py", line 438, in close
    self.watch_fd_thread.join()
AttributeError: 'OutStream' object has no attribute 'watch_fd_thread'
WARNING clustering 457 points to 32 centroids: please provide at least 1248 training points
2022-12-20 19:05:32.969689: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-12-20 19:05:37.003399: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 0
2022-12-20 19:05:37.003590: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 8139 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 19:05:37.004332: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 1
2022-12-20 19:05:37.004390: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:1 with 15149 MB memory: -> device: 1, name: Tesla P100-DGXS-16GB, pci bus id: 0000:08:00.0, compute capability: 6.0
2022-12-20 19:05:37.004988: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 2
2022-12-20 19:05:37.005038: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:2 with 15149 MB memory: -> device: 2, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0e:00.0, compute capability: 6.0
2022-12-20 19:05:37.005626: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 3
2022-12-20 19:05:37.005701: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:3 with 15149 MB memory: -> device: 3, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0f:00.0, compute capability: 6.0
I1220 19:05:54.962629 14997 pinned_memory_manager.cc:240] Pinned memory pool is created at '0x7fa1c6000000' with size 268435456
I1220 19:05:54.963366 14997 cuda_memory_manager.cc:105] CUDA memory pool is created on device 0 with size 536870912
I1220 19:05:54.968450 14997 model_lifecycle.cc:459] loading: 0_predicttensorflowtriton:1
I1220 19:05:54.968494 14997 model_lifecycle.cc:459] loading: 2_predicttensorflowtriton:1
I1220 19:05:54.968520 14997 model_lifecycle.cc:459] loading: executor_model:1
I1220 19:05:55.249121 14997 tensorflow.cc:2536] TRITONBACKEND_Initialize: tensorflow
I1220 19:05:55.249163 14997 tensorflow.cc:2546] Triton TRITONBACKEND API version: 1.10
I1220 19:05:55.249171 14997 tensorflow.cc:2552] 'tensorflow' TRITONBACKEND API version: 1.10
I1220 19:05:55.249177 14997 tensorflow.cc:2576] backend configuration:
{"cmdline":{"auto-complete-config":"true","backend-directory":"/opt/tritonserver/backends","min-compute-capability":"6.000000","version":"2","default-max-batch-size":"4"}}
I1220 19:05:55.249220 14997 tensorflow.cc:2642] TRITONBACKEND_ModelInitialize: 0_predicttensorflowtriton (version 1)
2022-12-20 19:05:55.249860: I tensorflow/cc/saved_model/reader.cc:43] Reading SavedModel from: /tmp/examples/poc_ensemble/0_predicttensorflowtriton/1/model.savedmodel
2022-12-20 19:05:55.254305: I tensorflow/cc/saved_model/reader.cc:81] Reading meta graph with tags { serve }
2022-12-20 19:05:55.254334: I tensorflow/cc/saved_model/reader.cc:122] Reading SavedModel debug info (if present) from: /tmp/examples/poc_ensemble/0_predicttensorflowtriton/1/model.savedmodel
2022-12-20 19:05:55.254419: I tensorflow/core/platform/cpu_feature_guard.cc:194] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: SSE3 SSE4.1 SSE4.2 AVX
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-12-20 19:05:55.668106: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 0
2022-12-20 19:05:55.668226: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13788 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 19:05:55.708287: I tensorflow/compiler/mlir/mlir_graph_optimization_pass.cc:354] MLIR V1 optimization pass is not enabled
2022-12-20 19:05:55.710374: I tensorflow/cc/saved_model/loader.cc:230] Restoring SavedModel bundle.
2022-12-20 19:05:55.764293: I tensorflow/cc/saved_model/loader.cc:214] Running initialization op on SavedModel bundle at path: /tmp/examples/poc_ensemble/0_predicttensorflowtriton/1/model.savedmodel
2022-12-20 19:05:55.788532: I tensorflow/cc/saved_model/loader.cc:321] SavedModel load for tags { serve }; Status: success: OK. Took 538683 microseconds.
I1220 19:05:55.794928 14997 tensorflow.cc:2642] TRITONBACKEND_ModelInitialize: 2_predicttensorflowtriton (version 1)
2022-12-20 19:05:55.795599: I tensorflow/cc/saved_model/reader.cc:43] Reading SavedModel from: /tmp/examples/poc_ensemble/2_predicttensorflowtriton/1/model.savedmodel
2022-12-20 19:05:55.807420: I tensorflow/cc/saved_model/reader.cc:81] Reading meta graph with tags { serve }
2022-12-20 19:05:55.807461: I tensorflow/cc/saved_model/reader.cc:122] Reading SavedModel debug info (if present) from: /tmp/examples/poc_ensemble/2_predicttensorflowtriton/1/model.savedmodel
2022-12-20 19:05:55.809724: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13788 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 19:05:55.840573: I tensorflow/cc/saved_model/loader.cc:230] Restoring SavedModel bundle.
2022-12-20 19:05:55.981996: I tensorflow/cc/saved_model/loader.cc:214] Running initialization op on SavedModel bundle at path: /tmp/examples/poc_ensemble/2_predicttensorflowtriton/1/model.savedmodel
2022-12-20 19:05:56.034238: I tensorflow/cc/saved_model/loader.cc:321] SavedModel load for tags { serve }; Status: success: OK. Took 238642 microseconds.
I1220 19:05:56.054847 14997 tensorflow.cc:2691] TRITONBACKEND_ModelInstanceInitialize: 2_predicttensorflowtriton (GPU device 0)
2022-12-20 19:05:56.055355: I tensorflow/cc/saved_model/reader.cc:43] Reading SavedModel from: /tmp/examples/poc_ensemble/2_predicttensorflowtriton/1/model.savedmodel
2022-12-20 19:05:56.063232: I tensorflow/cc/saved_model/reader.cc:81] Reading meta graph with tags { serve }
2022-12-20 19:05:56.063261: I tensorflow/cc/saved_model/reader.cc:122] Reading SavedModel debug info (if present) from: /tmp/examples/poc_ensemble/2_predicttensorflowtriton/1/model.savedmodel
2022-12-20 19:05:56.065241: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13788 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 19:05:56.084625: I tensorflow/cc/saved_model/loader.cc:230] Restoring SavedModel bundle.
2022-12-20 19:05:56.247498: I tensorflow/cc/saved_model/loader.cc:214] Running initialization op on SavedModel bundle at path: /tmp/examples/poc_ensemble/2_predicttensorflowtriton/1/model.savedmodel
2022-12-20 19:05:56.305524: I tensorflow/cc/saved_model/loader.cc:321] SavedModel load for tags { serve }; Status: success: OK. Took 250174 microseconds.
I1220 19:05:56.305648 14997 tensorflow.cc:2691] TRITONBACKEND_ModelInstanceInitialize: 0_predicttensorflowtriton (GPU device 0)
I1220 19:05:56.305866 14997 model_lifecycle.cc:693] successfully loaded '2_predicttensorflowtriton' version 1
2022-12-20 19:05:56.306212: I tensorflow/cc/saved_model/reader.cc:43] Reading SavedModel from: /tmp/examples/poc_ensemble/0_predicttensorflowtriton/1/model.savedmodel
2022-12-20 19:05:56.311793: I tensorflow/cc/saved_model/reader.cc:81] Reading meta graph with tags { serve }
2022-12-20 19:05:56.311833: I tensorflow/cc/saved_model/reader.cc:122] Reading SavedModel debug info (if present) from: /tmp/examples/poc_ensemble/0_predicttensorflowtriton/1/model.savedmodel
2022-12-20 19:05:56.314653: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13788 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 19:05:56.324391: I tensorflow/cc/saved_model/loader.cc:230] Restoring SavedModel bundle.
2022-12-20 19:05:56.373963: I tensorflow/cc/saved_model/loader.cc:214] Running initialization op on SavedModel bundle at path: /tmp/examples/poc_ensemble/0_predicttensorflowtriton/1/model.savedmodel
2022-12-20 19:05:56.397115: I tensorflow/cc/saved_model/loader.cc:321] SavedModel load for tags { serve }; Status: success: OK. Took 90906 microseconds.
I1220 19:05:56.397420 14997 model_lifecycle.cc:693] successfully loaded '0_predicttensorflowtriton' version 1
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.12) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
I1220 19:06:05.229298 14997 python_be.cc:1767] TRITONBACKEND_ModelInstanceInitialize: executor_model (GPU device 0)
2022-12-20 19:06:12.846064: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-12-20 19:06:14.979406: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 0
2022-12-20 19:06:14.979604: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 8139 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.12) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
WARNING:tensorflow:No training configuration found in save file, so the model was not compiled. Compile it manually.
12/20/2022 07:06:17 PM WARNING:No training configuration found in save file, so the model was not compiled. Compile it manually.
/usr/local/lib/python3.8/dist-packages/faiss/loader.py:28: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
if LooseVersion(numpy.__version__) >= "1.19":
/var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
other = LooseVersion(other)
12/20/2022 07:06:17 PM INFO:Loading faiss with AVX2 support.
12/20/2022 07:06:17 PM INFO:Could not load library with AVX2 support due to:
ModuleNotFoundError("No module named 'faiss.swigfaiss_avx2'")
12/20/2022 07:06:17 PM INFO:Loading faiss.
12/20/2022 07:06:17 PM INFO:Successfully loaded faiss.
I1220 19:06:24.066961 14997 model_lifecycle.cc:693] successfully loaded 'executor_model' version 1
I1220 19:06:24.067118 14997 server.cc:561]
+------------------+------+
| Repository Agent | Path |
+------------------+------+
+------------------+------+

I1220 19:06:24.067219 14997 server.cc:588]
+------------+-----------------------------------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Backend | Path | Config |
+------------+-----------------------------------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| tensorflow | /opt/tritonserver/backends/tensorflow2/libtriton_tensorflow2.so | {"cmdline":{"auto-complete-config":"true","backend-directory":"/opt/tritonserver/backends","min-compute-capability":"6.000000","version":"2","default-max-batch-size":"4"}} |
| python | /opt/tritonserver/backends/python/libtriton_python.so | {"cmdline":{"auto-complete-config":"true","min-compute-capability":"6.000000","backend-directory":"/opt/tritonserver/backends","default-max-batch-size":"4"}} |
+------------+-----------------------------------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+

I1220 19:06:24.067277 14997 server.cc:631]
+---------------------------+---------+--------+
| Model | Version | Status |
+---------------------------+---------+--------+
| 0_predicttensorflowtriton | 1 | READY |
| 2_predicttensorflowtriton | 1 | READY |
| executor_model | 1 | READY |
+---------------------------+---------+--------+

I1220 19:06:24.131296 14997 metrics.cc:650] Collecting metrics for GPU 0: Tesla P100-DGXS-16GB
I1220 19:06:24.132174 14997 tritonserver.cc:2214]
+----------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Option | Value |
+----------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| server_id | triton |
| server_version | 2.25.0 |
| server_extensions | classification sequence model_repository model_repository(unload_dependents) schedule_policy model_configuration system_shared_memory cuda_shared_memory binary_tensor_data statistics trace |
| model_repository_path[0] | /tmp/examples/poc_ensemble |
| model_control_mode | MODE_NONE |
| strict_model_config | 0 |
| rate_limit | OFF |
| pinned_memory_pool_byte_size | 268435456 |
| cuda_memory_pool_byte_size{0} | 536870912 |
| response_cache_byte_size | 0 |
| min_supported_compute_capability | 6.0 |
| strict_readiness | 1 |
| exit_timeout | 30 |
+----------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+

I1220 19:06:24.134718 14997 grpc_server.cc:4610] Started GRPCInferenceService at localhost:8001
I1220 19:06:24.134920 14997 http_server.cc:3316] Started HTTPService at 0.0.0.0:8000
I1220 19:06:24.175822 14997 http_server.cc:178] Started Metrics Service at 0.0.0.0:8002
W1220 19:06:25.156201 14997 metrics.cc:468] Unable to get energy consumption for GPU 0. Status:Success, value:0
W1220 19:06:26.156413 14997 metrics.cc:468] Unable to get energy consumption for GPU 0. Status:Success, value:0
W1220 19:06:27.179593 14997 metrics.cc:468] Unable to get energy consumption for GPU 0. Status:Success, value:0
I1220 19:06:40.843254 14997 server.cc:262] Waiting for in-flight requests to complete.
I1220 19:06:40.843298 14997 server.cc:278] Timeout 30: Found 0 model versions that have in-flight inferences
I1220 19:06:40.843635 14997 server.cc:293] All models are stopped, unloading models
I1220 19:06:40.843667 14997 server.cc:300] Timeout 30: Found 3 live models and 0 in-flight non-inference requests
I1220 19:06:40.843693 14997 tensorflow.cc:2729] TRITONBACKEND_ModelInstanceFinalize: delete instance state
I1220 19:06:40.843784 14997 tensorflow.cc:2729] TRITONBACKEND_ModelInstanceFinalize: delete instance state
I1220 19:06:40.844195 14997 tensorflow.cc:2668] TRITONBACKEND_ModelFinalize: delete model state
I1220 19:06:40.844225 14997 tensorflow.cc:2668] TRITONBACKEND_ModelFinalize: delete model state
I1220 19:06:40.861292 14997 model_lifecycle.cc:578] successfully unloaded '0_predicttensorflowtriton' version 1
I1220 19:06:40.870682 14997 model_lifecycle.cc:578] successfully unloaded '2_predicttensorflowtriton' version 1
I1220 19:06:41.843775 14997 server.cc:300] Timeout 29: Found 1 live models and 0 in-flight non-inference requests
I1220 19:06:42.843908 14997 server.cc:300] Timeout 28: Found 1 live models and 0 in-flight non-inference requests
I1220 19:06:43.305837 14997 model_lifecycle.cc:578] successfully unloaded 'executor_model' version 1
I1220 19:06:43.844059 14997 server.cc:300] Timeout 27: Found 0 live models and 0 in-flight non-inference requests
Error in atexit._run_exitfuncs:
Traceback (most recent call last):
File "/usr/lib/python3.8/logging/__init__.py", line 2127, in shutdown
h.close()
File "/usr/local/lib/python3.8/dist-packages/absl/logging/__init__.py", line 934, in close
self.stream.close()
File "/usr/local/lib/python3.8/dist-packages/ipykernel/iostream.py", line 438, in close
self.watch_fd_thread.join()
AttributeError: 'OutStream' object has no attribute 'watch_fd_thread'
__________________________________ test_func ___________________________________

self = <testbook.client.TestbookNotebookClient object at 0x7f3b6f30a610>
cell = [31], kwargs = {}, cell_indexes = [31], executed_cells = [], idx = 31

def execute_cell(self, cell, **kwargs) -> Union[Dict, List[Dict]]:
    """
    Executes a cell or list of cells
    """
    if isinstance(cell, slice):
        start, stop = self._cell_index(cell.start), self._cell_index(cell.stop)
        if cell.step is not None:
            raise TestbookError('testbook does not support step argument')

        cell = range(start, stop + 1)
    elif isinstance(cell, str) or isinstance(cell, int):
        cell = [cell]

    cell_indexes = cell

    if all(isinstance(x, str) for x in cell):
        cell_indexes = [self._cell_index(tag) for tag in cell]

    executed_cells = []
    for idx in cell_indexes:
        try:
          cell = super().execute_cell(self.nb['cells'][idx], idx, **kwargs)

../../../.local/lib/python3.8/site-packages/testbook/client.py:133:


args = (<testbook.client.TestbookNotebookClient object at 0x7f3b6f30a610>, {'id': '06d4cbc4', 'cell_type': 'code', 'metadata'...erenceServerException\x1b[0m: [StatusCode.NOT_FOUND] Request for unknown model: 'ensemble_model' is not found"]}]}, 31)
kwargs = {}

def wrapped(*args, **kwargs):
  return just_run(coro(*args, **kwargs))

../../../.local/lib/python3.8/site-packages/nbclient/util.py:84:


coro = <coroutine object NotebookClient.async_execute_cell at 0x7f3cdffd8940>

def just_run(coro: Awaitable) -> Any:
    """Make the coroutine run, even if there is an event loop running (using nest_asyncio)"""
    # original from vaex/asyncio.py
    loop = asyncio._get_running_loop()
    if loop is None:
        had_running_loop = False
        try:
            loop = asyncio.get_event_loop()
        except RuntimeError:
            # we can still get 'There is no current event loop in ...'
            loop = asyncio.new_event_loop()
            asyncio.set_event_loop(loop)
    else:
        had_running_loop = True
    if had_running_loop:
        # if there is a running loop, we patch using nest_asyncio
        # to have reentrant event loops
        check_ipython()
        import nest_asyncio

        nest_asyncio.apply()
        check_patch_tornado()
  return loop.run_until_complete(coro)

../../../.local/lib/python3.8/site-packages/nbclient/util.py:62:


self = <_UnixSelectorEventLoop running=False closed=False debug=False>
future = <Task finished name='Task-612' coro=<NotebookClient.async_execute_cell() done, defined at /var/jenkins_home/.local/lib...found\nInferenceServerException: [StatusCode.NOT_FOUND] Request for unknown model: 'ensemble_model' is not found\n')>

def run_until_complete(self, future):
    """Run until the Future is done.

    If the argument is a coroutine, it is wrapped in a Task.

    WARNING: It would be disastrous to call run_until_complete()
    with the same coroutine twice -- it would wrap it in two
    different Tasks and that can't be good.

    Return the Future's result, or raise its exception.
    """
    self._check_closed()
    self._check_running()

    new_task = not futures.isfuture(future)
    future = tasks.ensure_future(future, loop=self)
    if new_task:
        # An exception is raised if the future didn't complete, so there
        # is no need to log the "destroy pending task" message
        future._log_destroy_pending = False

    future.add_done_callback(_run_until_complete_cb)
    try:
        self.run_forever()
    except:
        if new_task and future.done() and not future.cancelled():
            # The coroutine raised a BaseException. Consume the exception
            # to not log a warning, the caller doesn't have access to the
            # local task.
            future.exception()
        raise
    finally:
        future.remove_done_callback(_run_until_complete_cb)
    if not future.done():
        raise RuntimeError('Event loop stopped before Future completed.')
  return future.result()

/usr/lib/python3.8/asyncio/base_events.py:616:


self = <testbook.client.TestbookNotebookClient object at 0x7f3b6f30a610>
cell = {'id': '06d4cbc4', 'cell_type': 'code', 'metadata': {'execution': {'iopub.status.busy': '2022-12-20T19:09:51.091478Z',...1mInferenceServerException\x1b[0m: [StatusCode.NOT_FOUND] Request for unknown model: 'ensemble_model' is not found"]}]}
cell_index = 31, execution_count = None, store_history = True

async def async_execute_cell(
    self,
    cell: NotebookNode,
    cell_index: int,
    execution_count: t.Optional[int] = None,
    store_history: bool = True,
) -> NotebookNode:
    """
    Executes a single code cell.

    To execute all cells see :meth:`execute`.

    Parameters
    ----------
    cell : nbformat.NotebookNode
        The cell which is currently being processed.
    cell_index : int
        The position of the cell within the notebook object.
    execution_count : int
        The execution count to be assigned to the cell (default: Use kernel response)
    store_history : bool
        Determines if history should be stored in the kernel (default: False).
        Specific to ipython kernels, which can store command histories.

    Returns
    -------
    output : dict
        The execution output payload (or None for no output).

    Raises
    ------
    CellExecutionError
        If execution failed and should raise an exception, this will be raised
        with defaults about the failure.

    Returns
    -------
    cell : NotebookNode
        The cell which was just processed.
    """
    assert self.kc is not None

    await run_hook(self.on_cell_start, cell=cell, cell_index=cell_index)

    if cell.cell_type != 'code' or not cell.source.strip():
        self.log.debug("Skipping non-executing cell %s", cell_index)
        return cell

    if self.skip_cells_with_tag in cell.metadata.get("tags", []):
        self.log.debug("Skipping tagged cell %s", cell_index)
        return cell

    if self.record_timing:  # clear execution metadata prior to execution
        cell['metadata']['execution'] = {}

    self.log.debug("Executing cell:\n%s", cell.source)

    cell_allows_errors = (not self.force_raise_errors) and (
        self.allow_errors or "raises-exception" in cell.metadata.get("tags", [])
    )

    await run_hook(self.on_cell_execute, cell=cell, cell_index=cell_index)
    parent_msg_id = await ensure_async(
        self.kc.execute(
            cell.source, store_history=store_history, stop_on_error=not cell_allows_errors
        )
    )
    await run_hook(self.on_cell_complete, cell=cell, cell_index=cell_index)
    # We launched a code cell to execute
    self.code_cells_executed += 1
    exec_timeout = self._get_timeout(cell)

    cell.outputs = []
    self.clear_before_next_output = False

    task_poll_kernel_alive = asyncio.ensure_future(self._async_poll_kernel_alive())
    task_poll_output_msg = asyncio.ensure_future(
        self._async_poll_output_msg(parent_msg_id, cell, cell_index)
    )
    self.task_poll_for_reply = asyncio.ensure_future(
        self._async_poll_for_reply(
            parent_msg_id, cell, exec_timeout, task_poll_output_msg, task_poll_kernel_alive
        )
    )
    try:
        exec_reply = await self.task_poll_for_reply
    except asyncio.CancelledError:
        # can only be cancelled by task_poll_kernel_alive when the kernel is dead
        task_poll_output_msg.cancel()
        raise DeadKernelError("Kernel died")
    except Exception as e:
        # Best effort to cancel request if it hasn't been resolved
        try:
            # Check if the task_poll_output is doing the raising for us
            if not isinstance(e, CellControlSignal):
                task_poll_output_msg.cancel()
        finally:
            raise

    if execution_count:
        cell['execution_count'] = execution_count
  await self._check_raise_for_error(cell, cell_index, exec_reply)

../../../.local/lib/python3.8/site-packages/nbclient/client.py:965:


self = <testbook.client.TestbookNotebookClient object at 0x7f3b6f30a610>
cell = {'id': '06d4cbc4', 'cell_type': 'code', 'metadata': {'execution': {'iopub.status.busy': '2022-12-20T19:09:51.091478Z',...1mInferenceServerException\x1b[0m: [StatusCode.NOT_FOUND] Request for unknown model: 'ensemble_model' is not found"]}]}
cell_index = 31
exec_reply = {'buffers': [], 'content': {'ename': 'InferenceServerException', 'engine_info': {'engine_id': -1, 'engine_uuid': '940d...e, 'engine': '940d40f7-8ad1-482b-8e7c-1395dd84810d', 'started': '2022-12-20T19:09:51.091762Z', 'status': 'error'}, ...}

async def _check_raise_for_error(
    self, cell: NotebookNode, cell_index: int, exec_reply: t.Optional[t.Dict]
) -> None:

    if exec_reply is None:
        return None

    exec_reply_content = exec_reply['content']
    if exec_reply_content['status'] != 'error':
        return None

    cell_allows_errors = (not self.force_raise_errors) and (
        self.allow_errors
        or exec_reply_content.get('ename') in self.allow_error_names
        or "raises-exception" in cell.metadata.get("tags", [])
    )
    await run_hook(self.on_cell_error, cell=cell, cell_index=cell_index)
    if not cell_allows_errors:
      raise CellExecutionError.from_cell_and_msg(cell, exec_reply_content)

E nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
E ------------------
E
E import shutil
E from merlin.systems.triton.utils import run_ensemble_on_tritonserver
E outputs = ensemble.output_schema.column_names
E response = run_ensemble_on_tritonserver(
E "/tmp/output/criteo/ensemble/",workflow.input_schema, batch.fillna(0),
E outputs, "ensemble_model"
E )
E response = [x.tolist()[0] for x in response["label/binary_classification_task"]]
E shutil.rmtree("/tmp/input/criteo", ignore_errors=True)
E shutil.rmtree("/tmp/output/criteo", ignore_errors=True)
E
E ------------------
E
E ---------------------------------------------------------------------------
E InferenceServerException                  Traceback (most recent call last)
E Cell In [13], line 4
E       2 from merlin.systems.triton.utils import run_ensemble_on_tritonserver
E       3 outputs = ensemble.output_schema.column_names
E ----> 4 response = run_ensemble_on_tritonserver(
E       5     "/tmp/output/criteo/ensemble/",workflow.input_schema, batch.fillna(0),
E       6     outputs, "ensemble_model"
E       7 )
E       8 response = [x.tolist()[0] for x in response["label/binary_classification_task"]]
E       9 shutil.rmtree("/tmp/input/criteo", ignore_errors=True)
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/utils.py:140, in run_ensemble_on_tritonserver(tmpdir, schema, df, output_columns, model_name)
E     138 response = None
E     139 with run_triton_server(tmpdir) as client:
E --> 140     response = send_triton_request(
E     141         schema, df, output_columns, client=client, triton_model=model_name
E     142     )
E     144 return response
E
E File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/utils.py:193, in send_triton_request(schema, df, outputs_list, client, endpoint, request_id, triton_model)
E     191 outputs = [grpcclient.InferRequestedOutput(col) for col in outputs_list]
E     192 with client:
E --> 193     response = client.infer(triton_model, inputs, request_id=request_id, outputs=outputs)
E     195 results = {}
E     196 for col in outputs_list:
E
E File /usr/local/lib/python3.8/dist-packages/tritonclient/grpc/__init__.py:1322, in InferenceServerClient.infer(self, model_name, inputs, model_version, outputs, request_id, sequence_id, sequence_start, sequence_end, priority, timeout, client_timeout, headers, compression_algorithm)
E    1320     return result
E    1321 except grpc.RpcError as rpc_error:
E -> 1322     raise_error_grpc(rpc_error)
E
E File /usr/local/lib/python3.8/dist-packages/tritonclient/grpc/__init__.py:62, in raise_error_grpc(rpc_error)
E      61 def raise_error_grpc(rpc_error):
E ---> 62     raise get_error_grpc(rpc_error) from None
E
E InferenceServerException: [StatusCode.NOT_FOUND] Request for unknown model: 'ensemble_model' is not found
E InferenceServerException: [StatusCode.NOT_FOUND] Request for unknown model: 'ensemble_model' is not found

../../../.local/lib/python3.8/site-packages/nbclient/client.py:862: CellExecutionError

During handling of the above exception, another exception occurred:

def test_func():
    with testbook(
        REPO_ROOT / "examples" / "scaling-criteo" / "02-ETL-with-NVTabular.ipynb",
        execute=False,
        timeout=180,
    ) as tb1:
        tb1.inject(
            """
            import os
            os.environ["BASE_DIR"] = "/tmp/input/criteo/"
            os.environ["INPUT_DATA_DIR"] = "/tmp/input/criteo/"
            os.environ["OUTPUT_DATA_DIR"] = "/tmp/output/criteo/"
            os.system("mkdir -p /tmp/input/criteo")
            os.system("mkdir -p /tmp/output/criteo")

            from merlin.datasets.synthetic import generate_data

            train, valid = generate_data("criteo", int(100000), set_sizes=(0.7, 0.3))

            train.to_ddf().compute().to_parquet('/tmp/input/criteo/day_0.parquet')
            valid.to_ddf().compute().to_parquet('/tmp/input/criteo/day_1.parquet')
            """
        )
        tb1.execute()
        assert os.path.isfile("/tmp/output/criteo/train/part_0.parquet")
        assert os.path.isfile("/tmp/output/criteo/valid/part_0.parquet")
        assert os.path.isfile("/tmp/output/criteo/workflow/metadata.json")

    with testbook(
        REPO_ROOT
        / "examples"
        / "scaling-criteo"
        / "03-Training-with-Merlin-Models-TensorFlow.ipynb",
        execute=False,
        timeout=180,
    ) as tb2:
        tb2.inject(
            """
            import os
            os.environ["INPUT_DATA_DIR"] = "/tmp/output/criteo/"
            """
        )
        tb2.execute()
        metrics = tb2.ref("eval_metrics")
        assert set(metrics.keys()) == set(
            [
                "auc",
                "binary_accuracy",
                "loss",
                "precision",
                "recall",
                "regularization_loss",
                "loss_batch",
            ]
        )
        assert os.path.isfile("/tmp/output/criteo/dlrm/saved_model.pb")

    with testbook(
        REPO_ROOT
        / "examples"
        / "scaling-criteo"
        / "04-Triton-Inference-with-Merlin-Models-TensorFlow.ipynb",
        execute=False,
        timeout=180,
    ) as tb3:
        tb3.inject(
            """
            import os
            os.environ["BASE_DIR"] = "/tmp/output/criteo/"
            os.environ["INPUT_FOLDER"] = "/tmp/input/criteo/"
            """
        )
        NUM_OF_CELLS = len(tb3.cells)
        tb3.execute_cell(list(range(0, NUM_OF_CELLS - 5)))
>       tb3.inject(
            """
            import shutil
            from merlin.systems.triton.utils import run_ensemble_on_tritonserver
            outputs = ensemble.output_schema.column_names
            response = run_ensemble_on_tritonserver(
                "/tmp/output/criteo/ensemble/",workflow.input_schema, batch.fillna(0),
                outputs, "ensemble_model"
            )
            response = [x.tolist()[0] for x in response["label/binary_classification_task"]]
            shutil.rmtree("/tmp/input/criteo", ignore_errors=True)
            shutil.rmtree("/tmp/output/criteo", ignore_errors=True)
            """
        )

tests/unit/examples/test_scaling_criteo_merlin_models.py:84:


../../../.local/lib/python3.8/site-packages/testbook/client.py:237: in inject
cell = TestbookNode(self.execute_cell(inject_idx)) if run else TestbookNode(code_cell)


self = <testbook.client.TestbookNotebookClient object at 0x7f3b6f30a610>
cell = [31], kwargs = {}, cell_indexes = [31], executed_cells = [], idx = 31

def execute_cell(self, cell, **kwargs) -> Union[Dict, List[Dict]]:
    """
    Executes a cell or list of cells
    """
    if isinstance(cell, slice):
        start, stop = self._cell_index(cell.start), self._cell_index(cell.stop)
        if cell.step is not None:
            raise TestbookError('testbook does not support step argument')

        cell = range(start, stop + 1)
    elif isinstance(cell, str) or isinstance(cell, int):
        cell = [cell]

    cell_indexes = cell

    if all(isinstance(x, str) for x in cell):
        cell_indexes = [self._cell_index(tag) for tag in cell]

    executed_cells = []
    for idx in cell_indexes:
        try:
            cell = super().execute_cell(self.nb['cells'][idx], idx, **kwargs)
        except CellExecutionError as ce:
>           raise TestbookRuntimeError(ce.evalue, ce, self._get_error_class(ce.ename))

E testbook.exceptions.TestbookRuntimeError: An error occurred while executing the following cell:
E ------------------
E
E import shutil
E from merlin.systems.triton.utils import run_ensemble_on_tritonserver
E outputs = ensemble.output_schema.column_names
E response = run_ensemble_on_tritonserver(
E "/tmp/output/criteo/ensemble/",workflow.input_schema, batch.fillna(0),
E outputs, "ensemble_model"
E )
E response = [x.tolist()[0] for x in response["label/binary_classification_task"]]
E shutil.rmtree("/tmp/input/criteo", ignore_errors=True)
E shutil.rmtree("/tmp/output/criteo", ignore_errors=True)
E
E ------------------
E
E   ---------------------------------------------------------------------------
E   InferenceServerException                 Traceback (most recent call last)
E   Cell In [13], line 4
E         2 from merlin.systems.triton.utils import run_ensemble_on_tritonserver
E         3 outputs = ensemble.output_schema.column_names
E   ----> 4 response = run_ensemble_on_tritonserver(
E         5     "/tmp/output/criteo/ensemble/", workflow.input_schema, batch.fillna(0),
E         6     outputs, "ensemble_model"
E         7 )
E         8 response = [x.tolist()[0] for x in response["label/binary_classification_task"]]
E         9 shutil.rmtree("/tmp/input/criteo", ignore_errors=True)
E   
E   File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/utils.py:140, in run_ensemble_on_tritonserver(tmpdir, schema, df, output_columns, model_name)
E       138 response = None
E       139 with run_triton_server(tmpdir) as client:
E   --> 140     response = send_triton_request(
E       141         schema, df, output_columns, client=client, triton_model=model_name
E       142     )
E       144 return response
E   
E   File ~/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/utils.py:193, in send_triton_request(schema, df, outputs_list, client, endpoint, request_id, triton_model)
E       191 outputs = [grpcclient.InferRequestedOutput(col) for col in outputs_list]
E       192 with client:
E   --> 193     response = client.infer(triton_model, inputs, request_id=request_id, outputs=outputs)
E       195 results = {}
E       196 for col in outputs_list:
E   
E   File /usr/local/lib/python3.8/dist-packages/tritonclient/grpc/__init__.py:1322, in InferenceServerClient.infer(self, model_name, inputs, model_version, outputs, request_id, sequence_id, sequence_start, sequence_end, priority, timeout, client_timeout, headers, compression_algorithm)
E      1320     return result
E      1321 except grpc.RpcError as rpc_error:
E   -> 1322     raise_error_grpc(rpc_error)
E   
E   File /usr/local/lib/python3.8/dist-packages/tritonclient/grpc/__init__.py:62, in raise_error_grpc(rpc_error)
E       61 def raise_error_grpc(rpc_error):
E   ---> 62     raise get_error_grpc(rpc_error) from None
E   
E   InferenceServerException: [StatusCode.NOT_FOUND] Request for unknown model: 'ensemble_model' is not found
E   InferenceServerException: [StatusCode.NOT_FOUND] Request for unknown model: 'ensemble_model' is not found

../../../.local/lib/python3.8/site-packages/testbook/client.py:135: TestbookRuntimeError
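For context on the NOT_FOUND failure: the Triton model table later in this log shows the server loaded `0_transformworkflowtriton`, `1_predicttensorflowtriton`, and `executor_model`, while the notebook requested `"ensemble_model"`. A quick sanity check is to scan the local model repository for the per-model `config.pbtxt` files Triton serves from. The helper below is a hypothetical diagnostic sketch, not part of Merlin or testbook, and it assumes every servable model has an explicit `config.pbtxt` (Triton can also auto-complete configs, so this is a simplification):

```python
import os
import tempfile


def list_triton_models(repo_path):
    """Return model names found in a local Triton model repository.

    Triton treats each immediate subdirectory of the repository that
    contains a config.pbtxt as a servable model.
    """
    models = []
    for entry in sorted(os.listdir(repo_path)):
        model_dir = os.path.join(repo_path, entry)
        if os.path.isdir(model_dir) and os.path.isfile(
            os.path.join(model_dir, "config.pbtxt")
        ):
            models.append(entry)
    return models


# Recreate the repository layout from the failing run and inspect it.
with tempfile.TemporaryDirectory() as repo:
    for name in [
        "0_transformworkflowtriton",
        "1_predicttensorflowtriton",
        "executor_model",
    ]:
        os.makedirs(os.path.join(repo, name, "1"))
        open(os.path.join(repo, name, "config.pbtxt"), "w").close()

    found = list_triton_models(repo)
    print(found)  # -> ['0_transformworkflowtriton', '1_predicttensorflowtriton', 'executor_model']
    print("ensemble_model" in found)  # -> False
```

Running a check like this against `/tmp/output/criteo/ensemble/` before calling `run_ensemble_on_tritonserver` would have surfaced the name mismatch without starting the server.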
----------------------------- Captured stdout call -----------------------------
Signal (2) received.
----------------------------- Captured stderr call -----------------------------
2022-12-20 19:06:58,329 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
2022-12-20 19:06:58,333 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
2022-12-20 19:06:58,342 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
2022-12-20 19:06:58,444 - distributed.preloading - INFO - Import preload module: dask_cuda.initialize
/usr/lib/python3.8/multiprocessing/resource_tracker.py:216: UserWarning: resource_tracker: There appear to be 48 leaked semaphore objects to clean up at shutdown
warnings.warn('resource_tracker: There appear to be %d '
2022-12-20 19:07:18.344247: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-12-20 19:07:22.453833: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 0
2022-12-20 19:07:22.453937: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 8139 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 19:07:22.454663: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 1
2022-12-20 19:07:22.454722: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:1 with 15149 MB memory: -> device: 1, name: Tesla P100-DGXS-16GB, pci bus id: 0000:08:00.0, compute capability: 6.0
2022-12-20 19:07:22.455344: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 2
2022-12-20 19:07:22.455392: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:2 with 15149 MB memory: -> device: 2, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0e:00.0, compute capability: 6.0
2022-12-20 19:07:22.455942: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 3
2022-12-20 19:07:22.455992: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:3 with 15149 MB memory: -> device: 3, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0f:00.0, compute capability: 6.0
Error in atexit._run_exitfuncs:
Traceback (most recent call last):
File "/usr/lib/python3.8/logging/__init__.py", line 2127, in shutdown
h.close()
File "/usr/local/lib/python3.8/dist-packages/absl/logging/__init__.py", line 934, in close
self.stream.close()
File "/usr/local/lib/python3.8/dist-packages/ipykernel/iostream.py", line 438, in close
self.watch_fd_thread.join()
AttributeError: 'OutStream' object has no attribute 'watch_fd_thread'
2022-12-20 19:08:20.171738: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-12-20 19:08:24.201829: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 0
2022-12-20 19:08:24.201928: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 8139 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 19:08:24.202707: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 1
2022-12-20 19:08:24.202763: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:1 with 15149 MB memory: -> device: 1, name: Tesla P100-DGXS-16GB, pci bus id: 0000:08:00.0, compute capability: 6.0
2022-12-20 19:08:24.203430: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 2
2022-12-20 19:08:24.203477: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:2 with 15149 MB memory: -> device: 2, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0e:00.0, compute capability: 6.0
2022-12-20 19:08:24.204118: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 3
2022-12-20 19:08:24.204167: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:3 with 15149 MB memory: -> device: 3, name: Tesla P100-DGXS-16GB, pci bus id: 0000:0f:00.0, compute capability: 6.0
I1220 19:09:51.426606 16056 pinned_memory_manager.cc:240] Pinned memory pool is created at '0x7f4794000000' with size 268435456
I1220 19:09:51.427359 16056 cuda_memory_manager.cc:105] CUDA memory pool is created on device 0 with size 536870912
I1220 19:09:51.433164 16056 model_lifecycle.cc:459] loading: 0_transformworkflowtriton:1
I1220 19:09:51.433242 16056 model_lifecycle.cc:459] loading: 1_predicttensorflowtriton:1
I1220 19:09:51.433291 16056 model_lifecycle.cc:459] loading: executor_model:1
I1220 19:09:51.717761 16056 tensorflow.cc:2536] TRITONBACKEND_Initialize: tensorflow
I1220 19:09:51.717800 16056 tensorflow.cc:2546] Triton TRITONBACKEND API version: 1.10
I1220 19:09:51.717807 16056 tensorflow.cc:2552] 'tensorflow' TRITONBACKEND API version: 1.10
I1220 19:09:51.717813 16056 tensorflow.cc:2576] backend configuration:
{"cmdline":{"auto-complete-config":"true","backend-directory":"/opt/tritonserver/backends","min-compute-capability":"6.000000","version":"2","default-max-batch-size":"4"}}
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.12) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.12) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
I1220 19:10:09.048521 16056 tensorflow.cc:2642] TRITONBACKEND_ModelInitialize: 1_predicttensorflowtriton (version 1)
2022-12-20 19:10:09.050321: I tensorflow/cc/saved_model/reader.cc:43] Reading SavedModel from: /tmp/output/criteo/ensemble/1_predicttensorflowtriton/1/model.savedmodel
2022-12-20 19:10:09.075935: I tensorflow/cc/saved_model/reader.cc:81] Reading meta graph with tags { serve }
2022-12-20 19:10:09.075989: I tensorflow/cc/saved_model/reader.cc:122] Reading SavedModel debug info (if present) from: /tmp/output/criteo/ensemble/1_predicttensorflowtriton/1/model.savedmodel
2022-12-20 19:10:09.076142: I tensorflow/core/platform/cpu_feature_guard.cc:194] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: SSE3 SSE4.1 SSE4.2 AVX
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-12-20 19:10:09.479215: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 0
2022-12-20 19:10:09.479335: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13376 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 19:10:09.534466: I tensorflow/compiler/mlir/mlir_graph_optimization_pass.cc:354] MLIR V1 optimization pass is not enabled
2022-12-20 19:10:09.540798: I tensorflow/cc/saved_model/loader.cc:230] Restoring SavedModel bundle.
2022-12-20 19:10:09.671822: I tensorflow/cc/saved_model/loader.cc:214] Running initialization op on SavedModel bundle at path: /tmp/output/criteo/ensemble/1_predicttensorflowtriton/1/model.savedmodel
2022-12-20 19:10:09.729213: I tensorflow/cc/saved_model/loader.cc:321] SavedModel load for tags { serve }; Status: success: OK. Took 678927 microseconds.
I1220 19:10:09.751535 16056 python_be.cc:1767] TRITONBACKEND_ModelInstanceInitialize: 0_transformworkflowtriton (GPU device 0)
I1220 19:10:16.880647 16056 python_be.cc:1767] TRITONBACKEND_ModelInstanceInitialize: executor_model (GPU device 0)
I1220 19:10:16.883237 16056 model_lifecycle.cc:693] successfully loaded '0_transformworkflowtriton' version 1
2022-12-20 19:10:25.781878: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-12-20 19:10:27.938863: I tensorflow/core/common_runtime/gpu/gpu_process_state.cc:222] Using CUDA malloc Async allocator for GPU: 0
2022-12-20 19:10:27.939034: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 8139 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
I1220 19:10:37.459762 16056 tensorflow.cc:2691] TRITONBACKEND_ModelInstanceInitialize: 1_predicttensorflowtriton (GPU device 0)
I1220 19:10:37.460219 16056 model_lifecycle.cc:693] successfully loaded 'executor_model' version 1
2022-12-20 19:10:37.460382: I tensorflow/cc/saved_model/reader.cc:43] Reading SavedModel from: /tmp/output/criteo/ensemble/1_predicttensorflowtriton/1/model.savedmodel
2022-12-20 19:10:37.474900: I tensorflow/cc/saved_model/reader.cc:81] Reading meta graph with tags { serve }
2022-12-20 19:10:37.474958: I tensorflow/cc/saved_model/reader.cc:122] Reading SavedModel debug info (if present) from: /tmp/output/criteo/ensemble/1_predicttensorflowtriton/1/model.savedmodel
2022-12-20 19:10:37.477035: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13376 MB memory: -> device: 0, name: Tesla P100-DGXS-16GB, pci bus id: 0000:07:00.0, compute capability: 6.0
2022-12-20 19:10:37.512089: I tensorflow/cc/saved_model/loader.cc:230] Restoring SavedModel bundle.
2022-12-20 19:10:37.644963: I tensorflow/cc/saved_model/loader.cc:214] Running initialization op on SavedModel bundle at path: /tmp/output/criteo/ensemble/1_predicttensorflowtriton/1/model.savedmodel
2022-12-20 19:10:37.706079: I tensorflow/cc/saved_model/loader.cc:321] SavedModel load for tags { serve }; Status: success: OK. Took 245704 microseconds.
I1220 19:10:37.706654 16056 model_lifecycle.cc:693] successfully loaded '1_predicttensorflowtriton' version 1
I1220 19:10:37.706805 16056 server.cc:561]
+------------------+------+
| Repository Agent | Path |
+------------------+------+
+------------------+------+

I1220 19:10:37.706923 16056 server.cc:588]
+------------+-----------------------------------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Backend | Path | Config |
+------------+-----------------------------------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| python | /opt/tritonserver/backends/python/libtriton_python.so | {"cmdline":{"auto-complete-config":"true","min-compute-capability":"6.000000","backend-directory":"/opt/tritonserver/backends","default-max-batch-size":"4"}} |
| tensorflow | /opt/tritonserver/backends/tensorflow2/libtriton_tensorflow2.so | {"cmdline":{"auto-complete-config":"true","backend-directory":"/opt/tritonserver/backends","min-compute-capability":"6.000000","version":"2","default-max-batch-size":"4"}} |
+------------+-----------------------------------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+

I1220 19:10:37.707052 16056 server.cc:631]
+---------------------------+---------+--------+
| Model | Version | Status |
+---------------------------+---------+--------+
| 0_transformworkflowtriton | 1 | READY |
| 1_predicttensorflowtriton | 1 | READY |
| executor_model | 1 | READY |
+---------------------------+---------+--------+

I1220 19:10:37.771756 16056 metrics.cc:650] Collecting metrics for GPU 0: Tesla P100-DGXS-16GB
I1220 19:10:37.772611 16056 tritonserver.cc:2214]
+----------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Option | Value |
+----------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| server_id | triton |
| server_version | 2.25.0 |
| server_extensions | classification sequence model_repository model_repository(unload_dependents) schedule_policy model_configuration system_shared_memory cuda_shared_memory binary_tensor_data statistics trace |
| model_repository_path[0] | /tmp/output/criteo/ensemble/ |
| model_control_mode | MODE_NONE |
| strict_model_config | 0 |
| rate_limit | OFF |
| pinned_memory_pool_byte_size | 268435456 |
| cuda_memory_pool_byte_size{0} | 536870912 |
| response_cache_byte_size | 0 |
| min_supported_compute_capability | 6.0 |
| strict_readiness | 1 |
| exit_timeout | 30 |
+----------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+

I1220 19:10:37.774110 16056 grpc_server.cc:4610] Started GRPCInferenceService at localhost:8001
I1220 19:10:37.774317 16056 http_server.cc:3316] Started HTTPService at 0.0.0.0:8000
I1220 19:10:37.815204 16056 http_server.cc:178] Started Metrics Service at 0.0.0.0:8002
W1220 19:10:38.791723 16056 metrics.cc:468] Unable to get energy consumption for GPU 0. Status:Success, value:0
W1220 19:10:39.791932 16056 metrics.cc:468] Unable to get energy consumption for GPU 0. Status:Success, value:0
W1220 19:10:40.811520 16056 metrics.cc:468] Unable to get energy consumption for GPU 0. Status:Success, value:0
I1220 19:11:07.377114 16056 server.cc:262] Waiting for in-flight requests to complete.
I1220 19:11:07.377146 16056 server.cc:278] Timeout 30: Found 0 model versions that have in-flight inferences
I1220 19:11:07.377381 16056 server.cc:293] All models are stopped, unloading models
I1220 19:11:07.377385 16056 tensorflow.cc:2729] TRITONBACKEND_ModelInstanceFinalize: delete instance state
I1220 19:11:07.377436 16056 server.cc:300] Timeout 30: Found 3 live models and 0 in-flight non-inference requests
I1220 19:11:07.377641 16056 tensorflow.cc:2668] TRITONBACKEND_ModelFinalize: delete model state
I1220 19:11:07.407482 16056 model_lifecycle.cc:578] successfully unloaded '1_predicttensorflowtriton' version 1
I1220 19:11:08.377560 16056 server.cc:300] Timeout 29: Found 2 live models and 0 in-flight non-inference requests
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.12) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.12) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
I1220 19:11:09.373346 16056 model_lifecycle.cc:578] successfully unloaded '0_transformworkflowtriton' version 1
I1220 19:11:09.377688 16056 server.cc:300] Timeout 28: Found 1 live models and 0 in-flight non-inference requests
I1220 19:11:09.880976 16056 model_lifecycle.cc:578] successfully unloaded 'executor_model' version 1
I1220 19:11:10.377810 16056 server.cc:300] Timeout 27: Found 0 live models and 0 in-flight non-inference requests
Error in atexit._run_exitfuncs:
Traceback (most recent call last):
File "/usr/lib/python3.8/logging/__init__.py", line 2127, in shutdown
h.close()
File "/usr/local/lib/python3.8/dist-packages/absl/logging/__init__.py", line 934, in close
self.stream.close()
File "/usr/local/lib/python3.8/dist-packages/ipykernel/iostream.py", line 438, in close
self.watch_fd_thread.join()
AttributeError: 'OutStream' object has no attribute 'watch_fd_thread'
=============================== warnings summary ===============================
../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33
/usr/local/lib/python3.8/dist-packages/dask_cudf/core.py:33: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
DASK_VERSION = LooseVersion(dask.__version__)

.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: 34 warnings
/var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/setuptools/_distutils/version.py:346: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
other = LooseVersion(other)

tests/unit/examples/test_z_legacy_notebooks.py::test_movielens_example
tests/unit/examples/test_z_legacy_notebooks.py::test_movielens_example
/var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/nvtabular/tools/data_gen.py:126: DeprecationWarning: np.long is a deprecated alias for np.compat.long. To silence this warning, use np.compat.long by itself. In the likely event your code does not need to work on Python 2 you can use the builtin int for which np.compat.long is itself an alias. Doing this will not modify any behaviour and is safe. When replacing np.long, you may wish to use e.g. np.int64 or np.int32 to specify the precision. If you wish to review your current use, check the release note link for additional information.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
col_size + 1, dtype=np.long, min_val=col.multi_min, max_val=col.multi_max

tests/unit/examples/test_z_legacy_notebooks.py: 12 warnings
/var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/nvtabular/tools/data_gen.py:140: DeprecationWarning: np.long is a deprecated alias for np.compat.long. To silence this warning, use np.compat.long by itself. In the likely event your code does not need to work on Python 2 you can use the builtin int for which np.compat.long is itself an alias. Doing this will not modify any behaviour and is safe. When replacing np.long, you may wish to use e.g. np.int64 or np.int32 to specify the precision. If you wish to review your current use, check the release note link for additional information.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
col_size, dtype=np.long, min_val=col.min_val, max_val=col.cardinality

tests/unit/examples/test_z_legacy_notebooks.py::test_movielens_example
/var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/nvtabular/loader/__init__.py:19: DeprecationWarning: The nvtabular.loader module has moved to a new repository, at https://github.com/NVIDIA-Merlin/dataloader . Support for importing from nvtabular.loader is deprecated, and will be removed in a future version. Please update your imports to refer to merlin.dataloader.
warnings.warn(

tests/unit/examples/test_z_legacy_notebooks.py::test_movielens_example
/var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/__init__.py:23: DeprecationWarning: The merlin.loader package has been moved to merlin.dataloader. Please update your imports, importing from merlin.loader is deprecated and will be removed in a future version
warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Cover

.tox/test-gpu/lib/python3.8/site-packages/merlin/core/init.py 2 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/core/_version.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/core/compat.py 10 4 60%
.tox/test-gpu/lib/python3.8/site-packages/merlin/core/dispatch.py 366 166 55%
.tox/test-gpu/lib/python3.8/site-packages/merlin/core/protocols.py 99 45 55%
.tox/test-gpu/lib/python3.8/site-packages/merlin/core/utils.py 197 98 50%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/init.py 5 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/base_operator.py 121 15 88%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/dictarray.py 55 31 44%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/executors.py 141 26 82%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/graph.py 99 24 76%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/node.py 344 110 68%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/ops/init.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/ops/concat_columns.py 17 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/ops/selection.py 22 1 95%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/ops/subset_columns.py 12 2 83%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/ops/subtraction.py 21 11 48%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dag/selector.py 101 26 74%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dataloader/init.py 2 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dataloader/_version.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dataloader/loader_base.py 471 104 78%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dataloader/tensorflow.py 114 38 67%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dataloader/tf_utils.py 57 27 53%
.tox/test-gpu/lib/python3.8/site-packages/merlin/dataloader/torch.py 66 33 50%
.tox/test-gpu/lib/python3.8/site-packages/merlin/datasets/__init__.py 2 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/datasets/ecommerce/__init__.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/datasets/ecommerce/aliccp/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/datasets/ecommerce/aliccp/dataset.py 141 102 28%
.tox/test-gpu/lib/python3.8/site-packages/merlin/datasets/ecommerce/booking/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/datasets/ecommerce/booking/dataset.py 127 100 21%
.tox/test-gpu/lib/python3.8/site-packages/merlin/datasets/ecommerce/dressipi/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/datasets/ecommerce/dressipi/dataset.py 45 37 18%
.tox/test-gpu/lib/python3.8/site-packages/merlin/datasets/synthetic.py 155 61 61%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/__init__.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/csv.py 57 11 81%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dask.py 181 96 47%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataframe_engine.py 61 36 41%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataframe_iter.py 21 2 90%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset.py 347 123 65%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/dataset_engine.py 37 8 78%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/fsspec_utils.py 127 108 15%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/hugectr.py 45 35 22%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/parquet.py 624 335 46%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/shuffle.py 38 12 68%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/worker.py 80 11 86%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/writer.py 190 65 66%
.tox/test-gpu/lib/python3.8/site-packages/merlin/io/writer_factory.py 18 5 72%
.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/__init__.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/tensorflow.py 1 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/loader/torch.py 1 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/__init__.py 2 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/_version.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/api.py 14 5 64%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/config/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/config/schema.py 62 16 74%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/io.py 17 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/loader/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/loader/backend.py 16 5 69%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/__init__.py 69 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/cross.py 44 28 36%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/dlrm.py 49 8 84%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/experts.py 158 122 23%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/interaction.py 108 55 49%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/mlp.py 117 58 50%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/optimizer.py 173 127 27%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/retrieval/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/retrieval/base.py 175 95 46%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/retrieval/matrix_factorization.py 35 19 46%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/retrieval/two_tower.py 30 4 87%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/sampling/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/sampling/base.py 29 7 76%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/sampling/cross_batch.py 46 31 33%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/sampling/in_batch.py 35 12 66%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/blocks/sampling/queue.py 115 99 14%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/core/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/core/aggregation.py 241 107 56%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/core/base.py 242 108 55%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/core/combinators.py 426 150 65%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/core/encoder.py 182 125 31%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/core/index.py 106 71 33%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/core/prediction.py 50 19 62%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/core/tabular.py 280 71 75%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/distributed/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/distributed/backend.py 9 2 78%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/inputs/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/inputs/base.py 64 39 39%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/inputs/continuous.py 39 4 90%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/inputs/embedding.py 458 154 66%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/loader.py 128 70 45%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/losses/__init__.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/losses/base.py 9 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/losses/listwise.py 13 1 92%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/losses/pairwise.py 115 57 50%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/metrics/__init__.py 2 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/metrics/topk.py 198 82 59%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/models/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/models/base.py 782 350 55%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/models/benchmark.py 16 6 62%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/models/ranking.py 67 43 36%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/models/retrieval.py 78 44 44%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/models/utils.py 10 2 80%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/base.py 123 90 27%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/classification.py 91 51 44%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/contrastive.py 147 107 27%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/regression.py 9 2 78%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/sampling/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/sampling/base.py 78 41 47%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/sampling/in_batch.py 37 22 41%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/sampling/popularity.py 27 17 37%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/outputs/topk.py 98 63 36%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/prediction_tasks/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/prediction_tasks/base.py 207 108 48%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/prediction_tasks/classification.py 68 22 68%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/prediction_tasks/multi.py 7 1 86%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/prediction_tasks/next_item.py 59 33 44%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/prediction_tasks/regression.py 35 19 46%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/prediction_tasks/retrieval.py 73 31 58%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transformers/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transformers/block.py 102 55 46%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transformers/transforms.py 87 29 67%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transforms/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transforms/bias.py 111 77 31%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transforms/features.py 435 346 20%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transforms/noise.py 43 28 35%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transforms/regularization.py 17 6 65%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transforms/sequence.py 302 227 25%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/transforms/tensor.py 165 79 52%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/typing.py 7 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/utils/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/utils/batch_utils.py 85 12 86%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/utils/repr_utils.py 69 48 30%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/utils/search_utils.py 34 22 35%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/tf/utils/tf_utils.py 209 141 33%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/constants.py 3 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/dataset.py 38 18 53%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/dependencies.py 26 19 27%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/doc_utils.py 10 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/example_utils.py 31 8 74%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/misc_utils.py 118 90 24%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/nvt_utils.py 27 24 11%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/registry.py 101 31 69%
.tox/test-gpu/lib/python3.8/site-packages/merlin/models/utils/schema_utils.py 90 39 57%
.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/__init__.py 2 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/io/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/io/proto_utils.py 20 4 80%
.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/io/schema_bp.py 306 7 98%
.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/io/tensorflow_metadata.py 190 33 83%
.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/schema.py 229 55 76%
.tox/test-gpu/lib/python3.8/site-packages/merlin/schema/tags.py 82 5 94%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/__init__.py 6 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/_version.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/__init__.py 4 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/dictarray.py 172 116 33%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ensemble.py 46 3 93%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/node.py 23 2 91%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/op_runner.py 26 19 27%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/__init__.py 11 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/compat.py 32 8 75%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/faiss.py 77 14 82%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/feast.py 118 56 53%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/fil.py 221 125 43%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/operator.py 79 32 59%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/pytorch.py 49 32 35%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/session_filter.py 45 28 38%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/softmax_sampling.py 51 21 59%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/tensorflow.py 73 23 68%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/unroll_features.py 50 21 58%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/ops/workflow.py 39 11 72%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/__init__.py 1 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/base_runtime.py 11 2 82%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/__init__.py 1 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/ops/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/ops/fil.py 95 66 31%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/ops/operator.py 12 1 92%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/ops/pytorch.py 62 37 40%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/ops/tensorflow.py 53 4 92%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/ops/workflow.py 47 15 68%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/dag/runtimes/triton/runtime.py 90 4 96%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/model_registry.py 16 8 50%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/__init__.py 49 17 65%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/conversions.py 143 120 16%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/export.py 268 210 22%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/models/__init__.py 0 0 100%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/triton/utils.py 72 19 74%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/workflow/__init__.py 22 20 9%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/workflow/base.py 113 58 49%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/workflow/hugectr.py 37 29 22%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/workflow/pytorch.py 10 6 40%
.tox/test-gpu/lib/python3.8/site-packages/merlin/systems/workflow/tensorflow.py 32 25 22%

TOTAL 15719 7266 54%

=========================== short test summary info ============================
SKIPPED [1] tests/unit/examples/test_scaling_criteo_merlin_models_hugectr.py:7: could not import 'hugectr': No module named 'hugectr'
======= 2 failed, 4 passed, 1 skipped, 51 warnings in 516.14s (0:08:36) ========
ERROR: InvocationError for command /var/jenkins_home/workspace/merlin_merlin/merlin/.tox/test-gpu/bin/python -m pytest --cov-report term --cov merlin -rxs tests/unit (exited with code 1)
___________________________________ summary ____________________________________
ERROR: test-gpu: commands failed
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://api.github.com/repos/NVIDIA-Merlin/Merlin/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[workspace] $ /bin/bash /tmp/jenkins11468554560138612834.sh

@jperez999 jperez999 merged commit f0842d3 into NVIDIA-Merlin:main Dec 20, 2022