
Is interface validation wrong with import_sdl? #1088

Closed
wants to merge 1 commit

Conversation

mattbaker
Contributor

This is a question/maybe-bug, but I'm submitting it as a PR to include my testing code. Let me know if you'd like me to move this over to an issue.

We recently received a schema addition with a usage of interfaces that's atypical for us. It looks something like this:

    interface PlayerInterface {
      metadata: PlayerMetadataInterface
    }

    interface PlayerMetadataInterface {
      displayName: String
    }

    type HumanPlayer implements PlayerInterface {
      metadata: HumanMetadata
    }

    type HumanMetadata implements PlayerMetadataInterface {
      displayName: String
    }

The thing that caught my eye is that HumanPlayer implements PlayerInterface but uses the type HumanMetadata for the metadata field. Since HumanMetadata implements PlayerMetadataInterface, I think... it... should work? Honestly I'm not sure.

What I'm curious about is a possible discrepancy between defining a schema with import_sdl vs. the DSL.
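
Concretely, the import in my test looks roughly like this (a sketch rather than the exact test module; the module name is made up, and the resolve_type hydration is just an illustrative stub):

    defmodule ImportSdlPlayerSchema do
      use Absinthe.Schema

      # the same SDL as above, plus a root query field
      import_sdl """
      type Query { humanPlayer: HumanPlayer }
      interface PlayerInterface { metadata: PlayerMetadataInterface }
      interface PlayerMetadataInterface { displayName: String }
      type HumanPlayer implements PlayerInterface { metadata: HumanMetadata }
      type HumanMetadata implements PlayerMetadataInterface { displayName: String }
      """

      # resolve_type stubs that just return nil (see the aside further down)
      def hydrate(%Absinthe.Blueprint.Schema.InterfaceTypeDefinition{}, _ancestors) do
        {:resolve_type, &__MODULE__.resolve_type/2}
      end

      def hydrate(_node, _ancestors), do: []

      def resolve_type(_object, _resolution), do: nil
    end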

If I import an SDL with this schema I get an error I'm familiar with:

== Compilation error in file test/absinthe/schema/notation/experimental/import_sdl_test.exs ==
** (Absinthe.Schema.Error) Compilation failed:
---------------------------------------
## Locations
/Users/mbaker/code/absinthe/test/absinthe/schema/notation/experimental/import_sdl_test.exs:23

Type "human_player" does not fully implement interface type "player_interface" for fields [:metadata]

An object type must be a super-set of all interfaces it implements.

* The object type must include a field of the same name for every field
  defined in an interface.
  * The object field must be of a type which is equal to or a sub-type of
    the interface field (covariant).
  * An object field type is a valid sub-type if it is equal to (the same
    type as) the interface field type.
  * An object field type is a valid sub-type if it is an Object type and the
    interface field type is either an Interface type or a Union type and the
    object field type is a possible type of the interface field type.
  * An object field type is a valid sub-type if it is a List type and the
    interface field type is also a List type and the list-item type of the
    object field type is a valid sub-type of the list-item type of the
    interface field type.
  * An object field type is a valid sub-type if it is a Non-Null variant of a
    valid sub-type of the interface field type.
* The object field must include an argument of the same name for every
  argument defined in the interface field.
  * The object field argument must accept the same type (invariant) as the
    interface field argument.
* The object field may include additional arguments not defined in the
  interface field, but any additional argument must not be required.

If I do it via the DSL, Absinthe seems to be OK with it (the sample test passes, at least). This is backed up by the fact that the actual schema this example is derived from lives in an Absinthe service (expressed using the DSL) and compiles just fine. My apologies if this is unclear; it's sort of hard to describe!
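
For comparison, the DSL version of the same schema looks roughly like this (again a sketch, not the exact schema from the test; the module name, root query field, and nil-returning resolve_type stubs are illustrative):

    defmodule DslPlayerSchema do
      use Absinthe.Schema

      query do
        field :human_player, :human_player
      end

      interface :player_interface do
        field :metadata, :player_metadata_interface
        resolve_type fn _, _ -> nil end
      end

      interface :player_metadata_interface do
        field :display_name, :string
        resolve_type fn _, _ -> nil end
      end

      object :human_player do
        interface :player_interface
        field :metadata, :human_metadata
      end

      object :human_metadata do
        interface :player_metadata_interface
        field :display_name, :string
      end
    end

This is the shape that compiles fine for me, even though :human_player declares :metadata as :human_metadata rather than :player_metadata_interface.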

I think my first question is, am I right that we're seeing a discrepancy here? And if there is a discrepancy, is the schema truly valid or truly invalid?

As an aside, I realize the resolve_type functions are just returning nil but I don't think it should matter here, right?

@jerelmiller
Contributor

If I do it via the DSL Absinthe seems to be ok with it

I can confirm this is the case. Our "real one" (as mentioned here) uses the DSL to implement something that is very similar to the example above. We have this particular schema definition deployed successfully to production.

@emeryotopalik
Contributor

So, the spec says "Any type that implements an interface must define all the fields with names and types exactly matching". This would indicate that if PlayerInterface.metadata is of type PlayerMetadataInterface, then HumanPlayer.metadata would also need to be PlayerMetadataInterface, not HumanMetadata.

And I think that is correct. Saying PlayerInterface.metadata returns PlayerMetadataInterface means that it could resolve to any type that implements PlayerMetadataInterface. That is different from saying HumanPlayer.metadata returns HumanMetadata, which would mean that the parent type is informing the resolve_type rather than the type itself.
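
Put differently, under that strict reading the object type would have to declare the field with the interface type itself. In DSL terms it would be something like this (just a sketch of the relevant object):

    object :human_player do
      interface :player_interface
      # under the strict reading, the field's type must be the interface type
      # itself rather than a concrete type that implements it
      field :metadata, :player_metadata_interface
    end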

I don't know if I am making any sense... this is very loopy.

@mattbaker
Contributor Author

Awesome, that reading of the spec makes sense and seems clear now that you've laid it out. I think you're right.

I think I understand what you're saying in the second paragraph too. It's sort of like HumanPlayer is applying an invalid constraint.

That makes me think of an example that might show it: imagine a type called RobotMetadata implementing the metadata interface, and, say, a root query field called humanPlayer that is specifically of type HumanPlayer.

For a query like this:

    humanPlayer {
      metadata {
        ... on HumanMetadata {
          # etc
        }
        ... on RobotMetadata {
          # etc
        }
      }
    }

PlayerInterface says that's valid, because its metadata field is a PlayerMetadataInterface. But HumanPlayer contradicts that: the humanPlayer field is explicitly a HumanPlayer, so the RobotMetadata inline fragment doesn't make sense. How could HumanMetadata ever be RobotMetadata?

So maybe I should close this and the associated issue and open a new one outlining what might be the real problem: the validation when you use the Absinthe DSL is too permissive and should be throwing an error here.

Would you agree @emeryotopalik?

@jerelmiller what do you think about all this?

Man, my brain is tied in a knot 😂

@jerelmiller
Contributor

My brain also hurts from this 🤣. These are solid points and I can definitely agree with all of them.

Let me present a different viewpoint and how I initially interpreted this. The more I read it, the more subjective and open to interpretation it seems 😆

Any type that implements an interface must define all the fields with names and types exactly matching

This makes sense! This constraint allows for polymorphism, which also means consumers don't have to think about handling common fields in different ways. Where this gets really interesting is how I typically think of types in relation to their interfaces.

Typically, when I think of a type implementing an interface, I like to use the phrase "is a". Coming from an object-oriented background, it was drilled into me that you can think of a subtype as also being its supertype (i.e. a Car "is a" Vehicle, where Car is the subtype and Vehicle is the supertype). The same reasoning applies to interfaces: in this case HumanPlayer "is a" PlayerInterface because it shares the interface's fields and behavior. While you can't guarantee that all PlayerInterfaces are HumanPlayers, you can say that all HumanPlayers are PlayerInterfaces.

This way of thinking leads to an interesting way of interpreting the spec:

types exactly matching

I suppose this comes down to how you interpret "exactly matching". This makes a lot of sense when you're talking about concrete types. You wouldn't want a userId field to be both an Int and a String; the types are not interchangeable. You want to be able to guarantee that when you query for a particular field, you are going to get back a value that matches the type.

This gets a bit muddy in my head when you talk about type matching on interface types. Using "is a", you can say that the concrete type is interchangeable with its interface. We can guarantee that HumanPlayer implements PlayerInterface because we can guarantee that its metadata adheres to PlayerMetadataInterface. Here is how you would query it:

    # Let's say there is a `players` field that returns `[PlayerInterface!]!`
    players {
      metadata {
        # note how I can query for displayName directly, because we can guarantee
        # that all players have a metadata field with a `displayName`
        displayName

        ... on HumanMetadata {
          # ...
        }

        ... on RobotMetadata {
          # ...
        }
      }

      ... on HumanPlayer {
        # ...
      }

      ... on RobotPlayer {
        # ...
      }
    }

@mattbaker On the flip side, because we say HumanPlayer has a metadata field with a concrete type of HumanMetadata, your query would actually look like this (assuming humanPlayer returns a HumanPlayer type):

    humanPlayer {
      metadata {
        # any human metadata field
      }
    }

This probably doesn't help with the brain-knot untying, but I figured I'd give a different perspective on how I interpreted this 😄. We broke GraphQL.

@mattbaker
Contributor Author

Yeah, the reason I don't think HumanMetadata is exactly PlayerMetadataInterface is that, to me, the statement that HumanMetadata == PlayerMetadataInterface is an inference.

But! There is a GraphQL reference implementation in JS... 🤔

@jerelmiller — could you recreate your schema in that to see what happens perhaaaaaps? Ellie Poley on our team did this and might even have a skeleton project set up you can use.

@jerelmiller
Contributor

There is a GraphQL reference implementation in JS

Now that's a 💯 idea! I'd be happy to do it and see whether it allows it or not!

@jerelmiller
Contributor

So after trying this out using GraphQL.js, it looks like this is a valid use case. @emeryotopalik and I were able to successfully resolve the fields as expected with various queries.

You can see a reproduction using a similar schema in this repo: https://github.com/jerelmiller/graphql-interface-reproduction

Working with that seems to prove this is a valid use case, which means we should patch import_sdl to match the DSL, since the DSL currently handles this correctly.

@mattbaker
Contributor Author

Nice find! I'm surprised, but the reference implementation seems like a good tiebreaker!

Thanks to both of you for trying out the repro 🙏

@mattbaker
Contributor Author

mattbaker commented Jul 29, 2021

I'm going to close the PR. Mind updating the GitHub issue with the latest findings, @jerelmiller?

mattbaker closed this Jul 29, 2021