Add broken test cases for spectests #431

Merged (2 commits) on Jul 21, 2020
11 changes: 11 additions & 0 deletions test/smoketests/spectests/CMakeLists.txt
@@ -32,6 +32,16 @@ set_tests_properties(
PASS_REGULAR_EXPRESSION "PASSED 4, FAILED 18, SKIPPED 2"
)

add_test(
NAME fizzy/smoketests/spectests/broken
COMMAND fizzy-spectests ${CMAKE_CURRENT_LIST_DIR}/broken
)
set_tests_properties(
fizzy/smoketests/spectests/broken
PROPERTIES
PASS_REGULAR_EXPRESSION "PASSED 0, FAILED 2, SKIPPED 4"
)

add_test(
NAME fizzy/smoketests/spectests/cli-missing-dir-arg
COMMAND fizzy-spectests
@@ -58,6 +68,7 @@ set_tests_properties(
fizzy/smoketests/spectests/default
fizzy/smoketests/spectests/skipvalidation
fizzy/smoketests/spectests/failures
fizzy/smoketests/spectests/broken
fizzy/smoketests/spectests/cli-missing-dir-arg
fizzy/smoketests/spectests/cli-invalid-arg
PROPERTIES
8 changes: 8 additions & 0 deletions test/smoketests/spectests/broken/broken.json
@@ -0,0 +1,8 @@
{"source_filename": "broken.wast",
"commands": [
{"type": "invalid_command", "line": 0},
{"type": "module", "line": 2, "filename": "missing_file.wasm"},
{"type": "module", "line": 5, "filename": "unparsable.wasm"},
{"type": "assert_return", "line": 10, "action": {"type": "get", "field": "tmp"}, "expected": []},
{"type": "assert_return", "line": 15, "action": {"type": "invalid_action_type"}, "expected": []},
{"type": "assert_trap", "line": 20, "action": {"type": "invalid_action_type"}, "expected": []}]}
1 change: 1 addition & 0 deletions test/smoketests/spectests/broken/unparsable.wasm
@@ -0,0 +1 @@
invalid
19 changes: 5 additions & 14 deletions test/spectests/spectests.cpp
@@ -576,29 +576,20 @@ bool run_tests_from_dir(const fs::path& path, const test_settings& settings)
std::sort(std::begin(files), std::end(files));

test_results total;
bool exception_thrown = false;
for (const auto& f : files)
{
try
{
const auto res = test_runner{settings}.run_from_file(f);
const auto res = test_runner{settings}.run_from_file(f);

total.passed += res.passed;
total.failed += res.failed;
total.skipped += res.skipped;
}
catch (const std::exception& ex)
Member Author: main has the very same catch all.

Collaborator: Without it, it won't output the TOTAL summary string in case of an unexpected exception, but perhaps it's not important.

Member Author: I think if there's an exception we shouldn't try to recover, apart from displaying the exception.
{
std::cerr << "Exception: " << ex.what() << "\n\n";
exception_thrown = true;
}
total.passed += res.passed;
total.failed += res.failed;
total.skipped += res.skipped;
}

std::cout << "TOTAL " << (total.passed + total.failed + total.skipped) << " tests ran from "
<< path << ".\n PASSED " << total.passed << ", FAILED " << total.failed
<< ", SKIPPED " << total.skipped << ".\n";

return (total.failed == 0 && !exception_thrown);
return total.failed == 0;
}

} // namespace