Fix JsonInput micro-batching support #860

Merged: 8 commits merged into bentoml:master on Jul 2, 2020

Conversation


@bojiang bojiang commented Jul 1, 2020

Description

  • Fix the inconsistency of current JsonInput API for handling single prediction request and handling batch request
  • Added a legacy JSON input class for an easy transition
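
To make the inconsistency concrete, here is a minimal sketch (not BentoML's actual implementation; the function names are illustrative) of the batch-consistent contract this PR moves toward: the user's API function always receives a *list* of parsed JSON objects, whether the call arrived as a single HTTP request or as a micro-batch from the marshal server.

```python
import json
from typing import Any, Callable, List

# The user's API function has ONE signature: List[json] in, List[result] out.
BatchFunc = Callable[[List[Any]], List[Any]]

def handle_single_request(body: bytes, func: BatchFunc) -> Any:
    """Parse one request body and call func with a batch of size 1."""
    parsed = json.loads(body)
    results = func([parsed])  # wrap in a list, consistent with the batch path
    return results[0]         # unwrap the single result for the response

def handle_batch_request(bodies: List[bytes], func: BatchFunc) -> List[Any]:
    """Parse many request bodies and call func once with the whole batch."""
    return func([json.loads(b) for b in bodies])
```

With this shape, the same `func` serves both paths, which is what micro-batching requires.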

Motivation and Context

How Has This Been Tested?

Types of changes

  • Breaking change (fix or feature that would cause existing functionality to change)
  • New feature and improvements (non-breaking change which adds/improves functionality)
  • Bug fix (non-breaking change which fixes an issue)
  • Code Refactoring (internal change which is not user facing)
  • Documentation
  • Test, CI, or build

Component(s) if applicable

  • BentoService (service definition, dependency management, API input/output adapters)
  • Model Artifact (model serialization, multi-framework support)
  • Model Server (micro-batching, dockerisation, logging, OpenAPI, instruments)
  • YataiService gRPC server (model registry, cloud deployment automation)
  • YataiService web server (nodejs HTTP server and web UI)
  • Internal (BentoML's own configuration, logging, utility, exception handling)
  • BentoML CLI

Checklist:

  • My code follows the bentoml code style; both the ./dev/format.sh and
    ./dev/lint.sh scripts have passed
  • My change reduces project test coverage and requires unit tests to be added
  • I have added unit tests covering my code change
  • My change requires a change to the documentation
  • I have updated the documentation accordingly

@bojiang bojiang requested a review from parano July 1, 2020 08:54
```python
raise BadInput(
    "Request content-type must be 'application/json' for this "
    "BentoService API"
)

result = func(parsed_json)
```
Member Author (bojiang):
At least it should be func([parsed_json])

Member:

This will be a regression right? Should we consider something like the LegacyImageHandler?

Member Author (bojiang):

LegacyJsonInput?
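
The legacy-adapter idea raised in this thread can be sketched roughly as follows. This is a hypothetical illustration only (the real adapter lives in bentoml/adapters/legacy_json_input.py and differs in detail): it preserves the old contract, in which the user's function receives one parsed JSON object per request, so existing services are not broken by the new list-based API.

```python
import json
from typing import Any, Callable

class LegacyJsonInputSketch:
    """Illustrative stand-in for the legacy adapter: old single-object contract."""

    def handle_request(self, body: bytes, func: Callable[[Any], Any]) -> Any:
        parsed = json.loads(body)
        return func(parsed)  # old behaviour: one object in, one result out
```

Services written against the old behaviour keep working unchanged; new services opt into the batch-oriented JsonInput.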

@pep8speaks

pep8speaks commented Jul 1, 2020

Hello @bojiang, Thanks for updating this PR.

There are currently no PEP 8 issues detected in this PR. Cheers! 🍻

Comment last updated at 2020-07-01 12:08:00 UTC

@bojiang bojiang changed the title [FIX] JsonInput handle_request Update JsonInput Jul 1, 2020

Member Author (bojiang):

pls help to refine instructions here @parano

@codecov

codecov bot commented Jul 1, 2020

Codecov Report

Merging #860 into master will increase coverage by 1.30%.
The diff coverage is 77.77%.


```
@@            Coverage Diff             @@
##           master     #860      +/-   ##
==========================================
+ Coverage   59.40%   60.71%   +1.30%
==========================================
  Files         117      118       +1
  Lines        7922     7959      +37
==========================================
+ Hits         4706     4832     +126
+ Misses       3216     3127      -89

Impacted Files                          Coverage Δ
bentoml/adapters/json_input.py          56.36% <40.00%>  (+1.00%) ⬆️
bentoml/adapters/legacy_json_input.py   81.08% <81.08%>  (ø)
bentoml/adapters/__init__.py            100.00% <100.00%> (ø)
bentoml/marshal/utils.py                60.27% <100.00%> (ø)
bentoml/saved_bundle/pip_pkg.py         94.53% <0.00%>   (-1.57%) ⬇️
bentoml/utils/alg.py                    34.21% <0.00%>   (+5.26%) ⬆️
bentoml/utils/__init__.py               71.64% <0.00%>   (+7.46%) ⬆️
bentoml/marshal/dispatcher.py           83.67% <0.00%>   (+61.22%) ⬆️
```

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 207d831...ceb0d3d

@parano parano changed the title Update JsonInput Fix JsonInput micro-batching support Jul 1, 2020
@parano parano merged commit 0846b6b into bentoml:master Jul 2, 2020
@bojiang bojiang deleted the fix branch July 2, 2020 04:57
aarnphm pushed a commit to aarnphm/BentoML that referenced this pull request Jul 29, 2022
* [FIX] JsonInput handle_request

* Add LegacyJsonInput

* [STYLE]

* [STYLE] unused "err"

* [STYLE] deprecated decodestring

* [CI] pytest version

* [CI] Add test LegacyJsonInput

* [CI] asyncio tests