Fix logging in the TransT serverless function #6290
Merged
+35
−61
Motivation and context
The Docker image for this function includes Conda, and uses `conda run` to run the Nuclio processor. Unfortunately, `conda run` buffers the entire output of the child process until that process exits, and since the processor never exits, its logs are never printed (and slowly consume memory). Conda is actually completely useless here, so just get rid of it, which fixes the problem.
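The fix boils down to dropping the `conda run` wrapper from the image's entrypoint. A minimal before/after sketch (the processor path and config file name here are illustrative placeholders, not the exact ones from this PR):

```dockerfile
# Before: conda run buffers the child process's stdout/stderr until it
# exits, so a long-running processor never gets its logs flushed.
# ENTRYPOINT ["conda", "run", "-n", "transt", "processor", "/etc/nuclio/config/processor.yaml"]

# After: run the processor directly; logs stream as they are written,
# and the Conda layers can be removed from the image entirely.
ENTRYPOINT ["processor", "/etc/nuclio/config/processor.yaml"]
```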
In addition, apply a few other improvements:
- Synchronize the PyTorch and TransT versions between the CPU and GPU variants.
- Replace `opencv-python` with `opencv-python-headless`, which has fewer dependencies.
- Use the `ADD` instruction instead of `wget`.

Altogether, these improvements shave ~780 MB off the size of the CPU image (I didn't check the GPU one).
How has this been tested?
Checklist
- [x] I submit my changes into the `develop` branch
- [ ] I have updated the documentation accordingly
- [ ] I have added tests to cover my changes
- [ ] I have linked related issues (see GitHub docs)
- [ ] I have increased versions of npm packages if it is necessary (cvat-canvas, cvat-core, cvat-data and cvat-ui)
License
- [ ] I submit my code changes under the same MIT License that covers the project. Feel free to contact the maintainers if that's a concern.