Extract Syntax tests #3644

Closed
5 tasks done
chriseth opened this issue Mar 5, 2018 · 0 comments

This is a sub-task of #3486 which only handles syntax tests (this includes what is now NameAndTypeResolutionTests and ParserTests).

Each test case is an individual file in a (potentially multi-level) directory hierarchy under /test/syntaxTests. The files have the following format:

code
// ----
// Error1: ...
// Error2: ...

code is the code to be compiled. The part after \n// ----\n is the test expectation: a list of error types, each followed by the expected error message.
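
A parser for this format (the first item in the task list below) could be sketched roughly as follows. This is only a sketch; SyntaxTest and parseSyntaxTest are placeholder names rather than the eventual soltest interface, and it assumes the expectation lines follow exactly the "// ErrorType: message" shape shown above.

// Sketch: parse one syntax test file into its source code and the expected
// errors listed after the "// ----" separator. All names are illustrative.
#include <fstream>
#include <string>
#include <utility>
#include <vector>

struct SyntaxTest
{
    std::string source;
    // (error type, error message) pairs, e.g. {"TypeError", "..."}
    std::vector<std::pair<std::string, std::string>> expectations;
};

SyntaxTest parseSyntaxTest(std::string const& _path)
{
    SyntaxTest test;
    std::ifstream file(_path);
    std::string line;
    bool inExpectations = false;
    while (std::getline(file, line))
    {
        if (line == "// ----")
        {
            inExpectations = true;
            continue;
        }
        if (!inExpectations)
            test.source += line + "\n";
        else if (line.rfind("// ", 0) == 0)
        {
            // Expectation lines have the form "// ErrorType: message".
            std::string content = line.substr(3);
            size_t colon = content.find(": ");
            if (colon != std::string::npos)
                test.expectations.emplace_back(content.substr(0, colon), content.substr(colon + 2));
        }
    }
    return test;
}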

The test passes if compiling the code up to the end of the analysis phase yields exactly the given list of errors (with multiplicities, perhaps even in exactly that order).
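
The comparison itself could then be a plain multiset check, for example along these lines; matchesExpectations and ErrorList are illustrative names, and both sides are assumed to be given as (error type, message) pairs such as the ones produced by the parser sketch above.

// Sketch: check that the reported errors match the expectations exactly,
// including multiplicities, but ignoring order.
#include <algorithm>
#include <string>
#include <utility>
#include <vector>

using ErrorList = std::vector<std::pair<std::string, std::string>>;

bool matchesExpectations(ErrorList _expected, ErrorList _reported)
{
    // Sorting both lists makes the comparison order-insensitive while still
    // requiring each expected error to appear the right number of times.
    std::sort(_expected.begin(), _expected.end());
    std::sort(_reported.begin(), _reported.end());
    return _expected == _reported;
}

If the order is also supposed to match exactly, the two sort calls can simply be dropped.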

The tests should be run as part of the usual soltest run. This means that we might need to specify the top-level data file directory for soltest; soltest then dynamically adds the test files to the Boost test hierarchy, which should also result in the appropriate test result reports.
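
One possible way to populate the Boost test tree at runtime is sketched below. It assumes Boost >= 1.59 (where make_test_case takes a name and source location) and boost::filesystem for the traversal; the .sol extension and the runSyntaxTest callback are assumptions, not part of the issue.

// Sketch: walk the syntax test directory and register one Boost test case
// per file. registerSyntaxTests would have to be called from soltest's test
// initialization function, before the test tree is executed.
#include <boost/filesystem.hpp>
#include <boost/test/unit_test.hpp>

namespace fs = boost::filesystem;

void runSyntaxTest(fs::path const& _file); // hypothetical: parses, compiles and compares

void registerSyntaxTests(fs::path const& _testDir)
{
    using namespace boost::unit_test;
    test_suite* suite = BOOST_TEST_SUITE("SyntaxTests");
    for (fs::recursive_directory_iterator it(_testDir), end; it != end; ++it)
        if (fs::is_regular_file(it->path()) && it->path().extension() == ".sol")
        {
            fs::path file = it->path();
            suite->add(make_test_case(
                [file] { runSyntaxTest(file); },
                file.filename().string(),
                file.string(),
                0
            ));
        }
    framework::master_test_suite().add(suite);
}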

There could be a new flag to soltest called --update which causes the following behaviour:

On each failing test (of this new kind), the source code and the difference between the test result and the expectation are displayed. The user is asked to either

  • accept the test result as new expectation (this causes the data file to be changed by soltest)
  • ignore the failure, or
  • open an editor to modify the source file.
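
The interactive part could be a simple prompt on top of the failure information. The sketch below only illustrates the three choices above; the actual rewriting of the data file and the spawning of an editor are left out, and all names are hypothetical.

// Sketch of the --update interaction: print the source and the expectation
// difference, then ask the user what to do.
#include <iostream>
#include <string>

enum class UpdateAction { Accept, Ignore, Edit };

UpdateAction promptForAction(std::string const& _source, std::string const& _diff)
{
    std::cout << _source << "\n" << _diff << "\n";
    while (true)
    {
        std::cout << "(a)ccept new expectation / (i)gnore failure / (e)dit source file? " << std::flush;
        std::string choice;
        if (!std::getline(std::cin, choice))
            return UpdateAction::Ignore;
        if (choice == "a")
            return UpdateAction::Accept; // soltest rewrites the expectation section of the file
        if (choice == "i")
            return UpdateAction::Ignore;
        if (choice == "e")
            return UpdateAction::Edit;   // e.g. open the test file in an editor and re-run it
    }
}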

@todo: When do we automatically add the version pragma?
@todo: How do we parameterize multiple EVM versions?

Tasks:

  • parser for test file format
  • module that traverses the directories and populates the boost test tree
  • module that runs the tests, performs the comparison and can also create a message to display with the Boost test failure (this message should already contain the difference in expectation, but not the source; a sketch follows the task list)
  • module that performs the interactive update
  • gradual conversion of tests into the new format (this is not possible for all tests)
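
For the failure message mentioned in the third task, a simple listing of expected versus obtained errors may be enough. The sketch below builds such a message from (type, message) pairs and deliberately leaves out the source code; formatExpectationDiff is an illustrative name.

// Sketch: build the message attached to a failing syntax test. It lists the
// expected and the obtained errors but omits the source code.
#include <sstream>
#include <string>
#include <utility>
#include <vector>

std::string formatExpectationDiff(
    std::vector<std::pair<std::string, std::string>> const& _expected,
    std::vector<std::pair<std::string, std::string>> const& _obtained
)
{
    std::ostringstream message;
    message << "Expected errors:\n";
    for (auto const& error: _expected)
        message << "  " << error.first << ": " << error.second << "\n";
    message << "Obtained errors:\n";
    for (auto const& error: _obtained)
        message << "  " << error.first << ": " << error.second << "\n";
    return message.str();
}

The resulting string could then be attached to the failing Boost test case, for example via BOOST_ERROR.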