
Tags created by components inside combiner don't propagate #1927

Closed
gmattar opened this issue Jun 8, 2020 · 17 comments
Labels: bug, triage (Needs to be triaged and prioritised accordingly)
gmattar commented Jun 8, 2020

[image: inference graph showing the models, the combiner, and the Transformer Output]

Basically, all models create tags and the Transformer Output needs to read them before forwarding the result of Model 3. I'm using Seldon 1.1 with REST requests.

The problem is that I can only see the tags from Model 3 inside Transformer.
These are the steps I took and what I found after inspecting the components:

  • First, I set a breakpoint inside aggregate but could not see the tags; the meta parameter was empty. I then added an aggregate_raw function and was able to see them there. meta is not passed to client_aggregate in this scenario:

    client_response = client_aggregate(user_model, features_list, names_list)

  • I changed SELDON_LOG_LEVEL to DEBUG and checked the response sent by the Combiner component. The tags were passed to the executor.

  • Next, I checked the meta parameter passed to Model 3 and the meta was empty again.

  • Last, I checked the tags in Transformer Output and only the data from Model 3 was available.
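The workaround in the first step above can be sketched as follows, assuming REST payloads where each child response is a dict carrying tags under meta.tags. The exact raw payload shape depends on the wrapper version; the class name and tag keys here are illustrative, not taken from the actual deployment:

```python
# Hypothetical sketch: since `aggregate` never receives `meta`, an
# `aggregate_raw` method can read the tags directly from the raw payloads.

class TagAwareCombiner:
    def aggregate_raw(self, msgs):
        # Merge the tags from every child response.
        merged = {}
        for msg in msgs:
            merged.update(msg.get("meta", {}).get("tags", {}))
        # Forward the first child's data with the merged tags attached.
        return {"data": msgs[0].get("data"), "meta": {"tags": merged}}
```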

Originally posted by @gmattar in #1474 (comment)


@gmattar It seems that the aggregate method is not being passed the meta argument, which contains the tags.
Could you open a new issue for this and add which version of the wrapper you are using?

Originally posted by @RafalSkolasinski in #1474 (comment)


I'm building my own custom image instead of using a wrapper:

FROM continuumio/miniconda3:4.8.2

WORKDIR /microservice

ENV MODEL_NAME BaseComponent
ENV API_TYPE REST
ENV SERVICE_TYPE UNDEFINED
ENV PERSISTENCE 0

RUN pip install seldon-core==1.1.0

EXPOSE 5000

CMD ["sh", "-c", "seldon-core-microservice $MODEL_NAME $API_TYPE --service-type $SERVICE_TYPE --persistence $PERSISTENCE"]
@gmattar gmattar added bug triage Needs to be triaged and prioritised accordingly labels Jun 8, 2020
@RafalSkolasinski (Contributor)

@gmattar thanks for reporting this - I will look into it and get back to you

@RafalSkolasinski RafalSkolasinski self-assigned this Jun 9, 2020
@RafalSkolasinski (Contributor)

@gmattar Would you be so kind as to provide the graph section of your SeldonDeployment manifest?

@RafalSkolasinski (Contributor)

The reason I ask is that, to my knowledge, the only component that can follow a combiner in the request flow is an output transformer, and I am not sure the graph definition corresponds to what you wanted to define.

@gmattar (Author)

gmattar commented Jun 12, 2020

This is the full graph (I simplified it a bit in the issue description):

graph:
  name: response-generator
  type: OUTPUT_TRANSFORMER
  endpoint:
    type: REST
  children:
  - name: model-a
    type: OUTPUT_TRANSFORMER
    endpoint:
      type: REST
    children:
    - name: to-tensor
      type: OUTPUT_TRANSFORMER
      endpoint:
        type: REST
      children:
      - name: image-crop
        type: COMBINER
        endpoint:
          type: REST
        children:
        - name: downloader-1
          type: TRANSFORMER
          endpoint:
            type: REST
          children: []
        - name: downloader-2
          type: TRANSFORMER
          endpoint:
            type: REST
          children:
          - name: to-tensor
            type: TRANSFORMER
            endpoint:
              type: REST
            children:
            - name: model-2
              type: MODEL
              endpoint:
                type: REST
              children: []
Also, this thread in Seldon slack channel has a bit more information about my graph:
https://seldondev.slack.com/archives/C8Y9A8G0Y/p1590116481237900

@RafalSkolasinski (Contributor)

@gmattar Unfortunately I am unable to reproduce the issue.
What I tried is in this notebook.

The request path seems sensible and I always see all tags created in previous steps: both in logs and in returned output.

@RafalSkolasinski (Contributor)

The only thing I can confirm is that inside the aggregate function meta is [] because, as you pointed out, client_aggregate does not receive this argument. However, tags set in the tags method of the combiner node do propagate to further steps.
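For contrast, a minimal sketch of the pattern that does propagate, assuming the standard Python wrapper hooks (an aggregate method plus a separate tags method); the tag values below are made up:

```python
class ImageCrop:
    """Combiner node whose tags() output is merged into the outgoing meta."""

    def aggregate(self, features_list, names_list):
        # meta (and thus upstream tags) is not available here, as noted
        # above; just combine the children's outputs in some way.
        return features_list[0]

    def tags(self):
        # The wrapper attaches this dict to the response meta, so these
        # tags do reach the next node in the graph.
        return {"combined_by": "image-crop"}
```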

@gmattar (Author)

gmattar commented Jun 17, 2020

Do you think this could have been fixed in 1.1.1-rc?
Or maybe it's because you're using seldon-core-s2i-python37 while I have a custom image? Something like this has happened to me before.

Do you have any guide on how to debug the orchestrator?

@ukclivecox (Contributor)

@RafalSkolasinski (Contributor)

As far as I am aware this should work with seldon-core==1.1.0 - you can try installing master in your image just to be sure.

Aggregation and further propagation of tags should now be happening in the Python wrapper.

@RafalSkolasinski (Contributor)

@gmattar Do you have any update on this? As we couldn't reproduce the problem can we close the issue?

As a final check maybe you can try to see if you can reproduce the issue in minimal setup like the notebook I linked?

@gmattar (Author)

gmattar commented Jun 23, 2020

Sorry, I haven't had time yet; end of quarter is messy. I'll update Seldon to 1.2 and test next week.

@RafalSkolasinski (Contributor)

No worries @gmattar, we can keep this one open until you can confirm.

@RafalSkolasinski (Contributor)

@gmattar As I couldn't replicate this issue on my side, can we close it?
If you can provide a minimal example showing the problem, we can re-open it.

@divyadilip91

The reason I ask is that, to my knowledge, the only component that can follow a combiner in the request flow is an output transformer, and I am not sure the graph definition corresponds to what you wanted to define.

#2277 @RafalSkolasinski Hi, I am facing an issue with connecting a combiner to a model. Is this possible? Are there any docs listing which components can and cannot be connected to other components? I was also wondering if I could connect an output transformer to a model? Please do reply.

@RafalSkolasinski (Contributor)

@divyadilip91 let's follow up in another issue if there is a problem, but a few things are worth mentioning:

@divyadilip91

divyadilip91 commented Aug 21, 2020

@RafalSkolasinski Thanks for your reply. Yes, I have ensured that the Seldon Core installed in the cluster and the models use the same version of Seldon.
Thanks for the graph details, but some things are still unclear. I know that a combiner's output can be connected to an output transformer, but my main doubt was whether the output from a combiner can be passed as input to another model or input transformer?

@RafalSkolasinski (Contributor)

The only node that can come after the combiner, AFAIK, is the output transformer, unfortunately.
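In other words, the only valid graph shape after a combiner looks roughly like this abridged sketch (names are placeholders and endpoint sections are omitted):

```yaml
graph:
  name: my-output-transformer
  type: OUTPUT_TRANSFORMER      # the only node type allowed above a COMBINER
  children:
  - name: my-combiner
    type: COMBINER
    children:                   # the combiner merges its children's outputs
    - name: model-a
      type: MODEL
      children: []
    - name: model-b
      type: MODEL
      children: []
```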
