Universal specs for JSON-RPC API methods #217
Is it possible to include strict verifications, such as a subset of regular expressions on some data fields? Thinking here about validating that an actual address was returned for example by
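JSON Schema does support regular-expression constraints on strings via its `pattern` keyword, so a constraint of this kind could be expressed directly in the spec. A minimal sketch of such a constraint for an address field (the surrounding layout is illustrative, not the proposed format):

```json
{
  "name": "address",
  "type": "string",
  "pattern": "^0x[0-9a-fA-F]{40}$",
  "description": "20-byte account address, hex-encoded with 0x prefix"
}
```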
Have you seen cpp-ethereum's
This is very exciting to me. Developers need a place they can look up stable methods, and client devs need a smooth way to evolve their interfaces. Other clients that would be nice to add to the tested columns:
And I'm sure others, too. Once we have a strong singular test suite like this, building new RPC providers should become a much more stable process.
Really great to see this happening, @cdetrio, @FlySwatter! Starting with the RPCs makes sense, and I hope that the patterns which are built can be extended to broader specifications over time. We've had discussions along very similar lines for Enterprise using Boardroom. See also http://html5test.com.
Ah, here's an issue on that: ethereum/interfaces#4
There's also https://github.com/ethcore/ethereum-rpc-json, which generates documentation but doesn't include tests to run against clients.
Seems like it measures compatibility against Geth, not against any spec.
Hey @rphmeier. See "Test suite maintenance" at the very top of this issue. The existing tests are incomplete placeholders. The plan is to expand/refine those to become the missing client-neutral executable specification.

An automated maintenance process for the RPC test suite will be described in a forthcoming Meta-EIP. The current plan is to maintain updates to the RPC test suite through a voting contract, with one-developer one-vote where developer identities are verified using devcon2 tokens. (There is a separate plan to expand the Devcon2 tokens into a web of trust, in order to enable participation by any and all Ethereum developers/stakeholders.) Tentatively named EipSignal, this automated governance process is somewhat in the spirit of the Yellow Paper Committee; however, the outcome of EipSignal votes will only mandate changes to the RPC API specification and test suite as described above, and will NOT govern "core/consensus" protocol changes. The work-in-progress on EipSignal can be seen here: MetaMask/IPFS-Ethereum-Hackathon#14.
@bobsummerwill beat me to it, but I'll still reword:
There is no current spec to measure against, which is why this EIP exists. The current plan is to manage this test suite with a form of governance, at which point non-geth-compliant specs may emerge, or even become dominant. Until then, the linked test suite should be treated as a template for specs, not as a proposed spec itself.
A test suite isn't a specification; it just tests conformity to some specification. The schema described here is not sufficient to specify certain primitives which all methods rely upon. This is exactly how you get into the situation of strange corner cases that each client implements differently while still being "to-spec". The places where I see the current documentation lacking the most:
I'm glad that this EIP is working on point 3. But we need strong definitions of the primitives used to build the RPC methods rather than specific test cases which will just lead to overfitting. It's misleading to label one specific implementation of ambiguous methods as "correct".
I think we're actually starting to stray beyond the scope of this issue, but there isn't a better place for it. This issue is really about an RPC test format. We're beginning to talk about the governance process of managing those tests.

I understand the initial alarm; it could sound like someone is trying to hijack what "correct" means, but that's not our intention at all. We're not trying to shame Parity over a specific difference from Geth; we're trying to create a tool for all client devs to better coordinate their implementations.

The (current, in flux) plan is that a given suite of tests would accompany an EIP spec, not replace it. For example, for a given EIP, a variety of accompanying test suites could be nominated, and the most popular one(s?) could be displayed on the compatibility table. Since the most popular test suite for a given EIP would be displayed, the test would not itself be an endorsement of that EIP spec; it would merely be a representation of it, and a way of automatically measuring compliance to it, which better represents the sort of ad-hoc standards process we actually have, where endorsement is almost solely represented through client implementation. It certainly would not be a spec itself, since the table could include distinct, incompatible implementations of the same method, but it would provide a singular place to view and compare those incompatibilities. This table would not be about labeling any one implementation as correct, but instead be a way of highlighting differences of implementation, for the benefit of both dapp and client developers.

I don't think a test suite has to encourage over-fitting, either. In the test format that @cdetrio has proposed, specific values of the response can be used to validate results, so it does not need to blindly enforce bit-conformity. In the case of

I'm very interested in seeing solutions to 1 & 2 as well, and while a test suite may not be a complete solution for them, I think it may still be a useful tool for even those issues, when combined with EIP specifications.
@rphmeier Good points, I agree that pending state and error messages are two important areas to address. EIP #136 addresses RPC error messages, and I think the schema here is sufficient to incorporate those error messages. Pending state is trickier. Do you have any suggestions for how to define the pending state as a primitive? I'm still unsure about this, but one thought is that perhaps an additional testing-specific RPC method would be useful, such as a

On the third point, "lack of consensus between client devs over changes", that will be addressed by a separate EIP/meta-EIP, which will propose a governance/maintenance process. However, I wouldn't mind bootstrapping it with an initial test suite of "compatible/correct" behavior, determined informally by pre-existing standards and rough consensus. The current examples were generated against geth (two incompatible geth branches, in fact), but they are certainly only meant as examples.

The discussion here should focus on the spec or test schema/format. The term "spec" is used loosely, in the sense of an API spec, where the aim is to enable black-box testing of RPC method behavior. We want to settle on a format that is sufficient to specify behavior (at least implicitly if not explicitly) that is currently ambiguous, and test compliance.
What I see as a major hole in these tests is that they don't appear to have a mechanism for setting up starting state. What these tests really should be testing is along the lines of:
This means tests necessarily need to be more complex than a simple "given request, expect result". One way to achieve this (pseudocode only):
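A rough sketch of that first style, where the test drives the node into the desired state through ordinary RPC calls before asserting (all field names here, such as `setup`, `call`, and `expect`, are placeholders rather than part of any proposed format):

```json
{
  "name": "eth_getBalance reflects a mined transfer",
  "setup": [
    {
      "method": "personal_sendTransaction",
      "params": [
        { "from": "0x<funded account>", "to": "0x<recipient>", "value": "0xde0b6b3a7640000" },
        "<passphrase>"
      ]
    },
    { "method": "miner_start", "params": [1] },
    { "waitFor": "newBlock" },
    { "method": "miner_stop", "params": [] }
  ],
  "call": { "method": "eth_getBalance", "params": ["0x<recipient>", "latest"] },
  "expect": { "result": "0xde0b6b3a7640000" }
}
```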
Alternatively, the test could explicitly set up the expected state. For example:
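A sketch of that second style, where the test injects the complete starting state directly (again, the field names are placeholders, not a proposed format):

```json
{
  "name": "eth_getBalance against injected state",
  "state": {
    "genesis": {
      "alloc": {
        "0x1111111111111111111111111111111111111111": { "balance": "0xde0b6b3a7640000" }
      }
    },
    "blocks": []
  },
  "call": {
    "method": "eth_getBalance",
    "params": ["0x1111111111111111111111111111111111111111", "latest"]
  },
  "expect": { "result": "0xde0b6b3a7640000" }
}
```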
Then the test would call one or more RPC methods and assert that it gets back a very well defined response. The key here is that all tests should set up the current full chain state, then call a particular RPC method and assert on the results.

Building a test suite like this is expensive, but is also valuable for multi-implementation compatibility. Note that this test doesn't care how an implementation achieves the result (implementation details), but it is well defined that given a particular state and an RPC call, an expected outcome is achieved.

As to how to codify "setup", I'll leave that up for discussion, but I do think it is critical that COMPLETE setup is included in the test. The test should assume nothing about the current state of the world when it executes.

Personally, I'm a fan of the first style because it is generally easier to write tests when the test defines the path to the expected state, rather than requiring that the thing under test has the ability to set the current state. The first mechanism also allows true black box testing, where the test can be run against an endpoint without knowing its implementation. The second mechanism requires that the test have a back-door mechanism to set up current state.
[It has been brought to my attention](MetaMask/metamask-extension#1557) that this (and in turn MetaMask's) personal_sign param ordering is incorrect. How did this happen? Was it [the lack of a standard RPC test suite?](ethereum/EIPs#217) Was it just my mistake? I'll claim both. The Wiki seems to be accurate in correcting us here, too: https://github.com/ethereum/go-ethereum/wiki/Management-APIs#personal_sign
I just now stumbled upon this when searching for something like swagger to define and generate documentation for our json-rpc apis. Unfortunately, swagger isn't for json-rpc. @cdetrio Have you found any good editors to write/generate the specifications above? Or did you do it by hand? I'd really like to have not only a specification, but also an editor which makes writing/validating the specification intuitive and simple. If it involves writing files by hand, then running some npm script to validate and generate docs based on our custom specification file, then I'm afraid it will just bitrot after a while.
Totally up for this. |
@holiman I think I'd like to pick this up on my team. My thought was that we could add these schemas to the
Do it, @pipermerriam. That would be awesome! When I spoke at the launch event for the EEA back in February 2017, 17 months ago, this was actually one of the areas of protocol specification which I highlighted as being most needed and highest impact - both for public Ethereum and for enterprise scenarios. The lack of clear and testable JSON-RPC specifications really hurts everyone trying to build on top of Ethereum.

What are PegaSys doing for RPCs in Pantheon, @shahankhatch? Are you just trying to be close to Geth? Or did you do some modal thing like Parity have?
@rphmeier What do you see as the best path towards standardization of RPC specs across the various Ethereum clients, Robert? As you said last year, there is a balance here to ensure that we don't "over-fit", and to ensure that any automated tests that we build are encoding unambiguous original specifications. Any thoughts, @gavofyork?
retesteth is capable of running request/response checks against clients via RPC.
There has been no activity on this issue for two months. It will be closed in a week if no further activity occurs. If you would like to move this EIP forward, please respond to any outstanding feedback or add a comment indicating that you have addressed all required feedback and are ready for a review.
This issue was closed due to inactivity. If you are still pursuing it, feel free to reopen it and respond to any feedback or request a review in a comment.
Universal specs for JSON-RPC API methods
This is a pre-draft outline of a Hive-based multi-client test suite for the JSON-RPC API. The specification format is JSON schema (inspired by the Kodi API description).
Existing RPC test suites include ethereum/rpc-tests and bas-vk/hive/tree/rpc. The former is based on mocha.js and the latter on go-ethereum's `ethclient.Client` API. This proposal intends to be a language and client-agnostic test and specification format.

Here's an example JSON schema specification, for eth_getBlockTransactionCountByHash, annotated for explanation:
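A minimal sketch of the general shape such a schema might take (the exact field names and layout here are illustrative guesses, not the proposed format itself):

```json
{
  "name": "eth_getBlockTransactionCountByHash",
  "description": "Returns the number of transactions in a block, given the block's hash.",
  "params": [
    {
      "name": "blockHash",
      "description": "32-byte hash of a block, hex-encoded with 0x prefix",
      "type": "string",
      "pattern": "^0x[0-9a-f]{64}$",
      "required": true
    }
  ],
  "returns": {
    "description": "Number of transactions in the block as a hex-encoded quantity, or null if no matching block is known",
    "type": ["string", "null"]
  }
}
```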
The `description` fields are not used for validation, but for generating RPC API method documentation as seen on the wiki.

The JSON schema specifications are paired with test cases. Here's an example file with test cases for eth_getBlockTransactionCountByHash:
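Again as an illustrative sketch only (the field names are guesses at the general shape; the `asserts` entries are jq filters applied to the full JSON-RPC response):

```json
{
  "method": "eth_getBlockTransactionCountByHash",
  "chainConfig": "rpc-test-chain",
  "tests": [
    {
      "name": "block with two transactions",
      "request": {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "eth_getBlockTransactionCountByHash",
        "params": ["0x<hash of a block in the imported chain>"]
      },
      "asserts": [
        ".result == \"0x2\""
      ]
    },
    {
      "name": "unknown block hash returns null",
      "request": {
        "jsonrpc": "2.0",
        "id": 2,
        "method": "eth_getBlockTransactionCountByHash",
        "params": ["0x0000000000000000000000000000000000000000000000000000000000000000"]
      },
      "asserts": [
        ".result == null"
      ]
    }
  ]
}
```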
The `asserts` array specifies jq programs or "filters", which are used to compare the received response against the expected response.

The `chainConfig` option points to a set of blocks/transactions to import. The current chain configuration is inherited from ethereum/rpc-tests, which in turn borrows from the state tests repo. This configuration file is loaded by the test runner, and imported by the hive clients.

After hive runs the RPC tests against a set of clients, the results are displayed as a compatibility table: http://cdetrio.github.io/eth-compat-table/ (inspired by the table for ES6).
Todos
One remaining task is to specify a test format for the pub/sub methods. One possibility is a `transitionEvent` field to specify an event (e.g. arrival of a block or a pending transaction), and an `onTransition` field to describe the expected notifications.
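For instance, purely as an illustration of that idea (all field values here are invented for the sketch):

```json
{
  "method": "eth_subscribe",
  "tests": [
    {
      "name": "newHeads notification on block arrival",
      "request": {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "eth_subscribe",
        "params": ["newHeads"]
      },
      "transitionEvent": { "type": "importBlock", "block": "0x<rlp of the next block>" },
      "onTransition": {
        "asserts": [
          ".method == \"eth_subscription\"",
          ".params.result.parentHash == \"0x<hash of the previous head>\""
        ]
      }
    }
  ]
}
```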
Test suite maintenance
An automated maintenance process for the RPC test suite will be described in a forthcoming Meta-EIP. The current plan is to maintain updates to the RPC test suite through a voting contract, with one-developer one-vote where developer identities are verified using devcon2 tokens. (There is a separate plan to expand the Devcon2 tokens into a web of trust, in order to enable participation by any and all Ethereum developers/stakeholders.) Tentatively named `EipSignal`, this automated governance process is somewhat in the spirit of the Yellow Paper Committee; however, the outcome of `EipSignal` votes will only mandate changes to the RPC API specification and test suite as described above, and will NOT govern "core/consensus" protocol changes. The work-in-progress on `EipSignal` can be seen here: MetaMask/IPFS-Ethereum-Hackathon#14.