SDK test suites (#258)
* Simple SDK test harness; SDK test module

Yet to come: actual SDK tests.

* Quiet clippy lint about "too many arguments"

* Lints are named with underscores, not dashes...

* Will this make clippy shut up?

* Go nuclear on disabling the `too_many_arguments` lint. Sigh.

* WIP SDK test client, and fix bugs in Rust codegen

Compiling the module_bindings for the sdk-test module revealed two bugs:

- Enums holding structs were generated incorrectly,
  unpacking the struct's fields into the enum's payload (illustrated below).

- Recursive types would cause the codegen to attempt to recursively import
  the current module into itself.
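
For illustration (these type names are hypothetical, not from the sdk-test module), the first bug flattened a struct-typed variant's fields into the enum, where the intended output wraps the struct type itself:

```rust
// Hypothetical module types, for illustration only.
pub struct Point {
    pub x: i32,
    pub y: i32,
}

// Buggy codegen unpacked the struct's fields into the variant:
//     pub enum Shape { Dot { x: i32, y: i32 } }
// The intended (and now generated) form wraps the struct type:
pub enum Shape {
    Dot(Point),
}
```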

* One (1) actual runnable test in the sdk crate

* Exclude test-client from CI

The sdk tests already build this crate (though they don't clippy or fmt it).

Attempting to build, test, fmt or clippy it as-is will fail
because the module_bindings are not committed.
This is intentional, as the SDK test suite wants to generate the module_bindings
during its run.

* Rustfmt ignore generated module_bindings

It turns out `cargo fmt` doesn't actually support the `--exclude` option
the way `cargo clippy` does.

Instead, just `#[rustfmt::skip]` the `mod module_bindings;` decl.
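
Roughly, presumably in the test client's crate root, that looks like this (a sketch; the attribute placement is the point):

```rust
// Keep rustfmt (and `cargo fmt --check` in CI) from descending into the
// generated, uncommitted bindings.
#[rustfmt::skip]
mod module_bindings;
```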

* Actually commit test file...

God, I'm so bad at remembering to commit new files.

Anyway, add a test for deleting rows with primitive unique fields.

* Make CLI tool available in tests workflow

The SDK tests need to run `spacetime start`, `spacetime generate` and `spacetime publish`.

* Test update events with primitive pk types; split test-client into files

* Tests with `Identity` fields in tables

* Tests for reducer callbacks, both successful and failing

* Tests with vecs of stuff, with structs, with enums

* Test that should fail, test that uses a large table with many columns

* Test for resubscribe functionality

* Test of reauth; fix major bug in `TestCounter`

I misread `Condvar::wait_timeout_while` as `Condvar::wait_timeout_until`,
and flipped my predicate.
This led to false negatives (i.e. tests that passed that shouldn't have).
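
A minimal sketch of the corrected wait, assuming the counter tracks outstanding tests behind a `Mutex` (the real `TestCounter` internals differ): `Condvar::wait_timeout_while` blocks *while* the predicate returns true, so the predicate must describe the "not finished yet" state.

```rust
use std::sync::{Condvar, Mutex};
use std::time::Duration;

// Hypothetical stand-in for the real `TestCounter` internals.
struct Counter {
    outstanding: Mutex<usize>,
    cond: Condvar,
}

impl Counter {
    /// Wait until every registered test has reported, or the timeout expires.
    fn wait_for_all(&self, timeout: Duration) -> bool {
        let guard = self.outstanding.lock().unwrap();
        // `wait_timeout_while` keeps waiting WHILE the closure is true,
        // so the closure is "still outstanding", not "all done".
        let (remaining, result) = self
            .cond
            .wait_timeout_while(guard, timeout, |outstanding| *outstanding > 0)
            .unwrap();
        !result.timed_out() && *remaining == 0
    }
}
```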

* A fistful of doc comments

* Avoid race condition running multiple tests with same client project

This commit fixes a race condition which sometimes caused the SDK tests to fail
because multiple `spacetime generate` processes running concurrently
would clobber each other's output,
potentially deleting it while a `cargo build` or `cargo run` process was running.

Now, the test harness will only run `spacetime generate` at most once
for any given directory.
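
The shape of that guard is roughly the following (a sketch with assumed names; the exact `spacetime generate` flags are an assumption and should be checked against `spacetime generate --help`):

```rust
use std::collections::HashSet;
use std::path::{Path, PathBuf};
use std::process::Command;
use std::sync::{Mutex, OnceLock};

// Output directories for which `spacetime generate` has already run.
static GENERATED: OnceLock<Mutex<HashSet<PathBuf>>> = OnceLock::new();

/// Run `spacetime generate` for `out_dir` at most once per test process.
/// The lock is held across the child process, so a concurrent test cannot
/// start a second `generate` into the same directory and clobber its output.
fn ensure_bindings(out_dir: &Path, module_path: &Path) {
    let mut done = GENERATED
        .get_or_init(|| Mutex::new(HashSet::new()))
        .lock()
        .unwrap();
    if done.contains(out_dir) {
        return;
    }
    // Flag names are an assumption for this sketch.
    let status = Command::new("spacetime")
        .arg("generate")
        .args(["--lang", "rust"])
        .arg("--out-dir")
        .arg(out_dir)
        .arg("--project-path")
        .arg(module_path)
        .status()
        .expect("failed to run `spacetime generate`");
    assert!(status.success(), "`spacetime generate` failed");
    done.insert(out_dir.to_path_buf());
}
```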

* Add env_logger to SDK test client

* RUST_LOG=trace when running test clients

* quieter logs in test client: only warn-level and higher
gefjon authored Sep 12, 2023
1 parent c539e0a commit 32ac808
Showing 22 changed files with 3,327 additions and 29 deletions.
8 changes: 6 additions & 2 deletions .github/workflows/ci.yml
@@ -30,13 +30,17 @@ jobs:

- uses: dsherret/rust-toolchain-file@v1

- name: Install CLI tool
run: |
cargo install --path crates/cli

@RReverser (Member) commented on Sep 13, 2023:

One problem with this approach is that it relies on the global spacetimedb command being the latest spacetimedb-cli. This is true in the CI, but is likely not true when running tests locally.

spacetimedb-cli can also be included as a library dependency, and then you can invoke commands via its public API; could we do that in the testing crate instead? That way it would always be up to date and wouldn't clobber the globally installed binary.

@RReverser (Member) commented on Sep 13, 2023:

OTOH quite a bit in the testing framework depends on this being a separate process. Perhaps a better approach is to move binary compilation to build.rs.

@gefjon (Author, Contributor) replied on Sep 13, 2023:

I think it's conceptually possible to run both the Spacetime server and the CLI commands inside the harness process. The plumbing on the former is a bit challenging, as you need a way to spawn a single global instance shared between all tests, but in some ways simpler than what I currently have, as you don't have to worry about closing it before exiting the harness process. The latter is potentially more of a challenge, as the CLI commands are written directly in terms of clap::ArgMatches, but that's not insurmountable.

Building the CLI in build.rs and stashing it in some known location relative to the testing crate is probably easier, though. IMO, actually invoking the CLI makes for more effective integration testing, which is why I wrote the tests to do that.
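
A rough sketch of that build.rs variant (package name, paths, and binary name are assumptions, not this PR's code): build the CLI while compiling the test crate, into a separate target directory so the two cargo invocations don't contend for the same target-dir lock, and hand its location to the tests through an env var.

```rust
// build.rs for the test crate -- a sketch of the approach discussed above.
use std::process::Command;

fn main() {
    // Build the CLI into this crate's OUT_DIR to avoid locking the
    // workspace target directory that the outer cargo already holds.
    let out_dir = std::env::var("OUT_DIR").unwrap();
    let status = Command::new("cargo")
        .args(["build", "-p", "spacetimedb-cli", "--target-dir"])
        .arg(&out_dir)
        .status()
        .expect("failed to run cargo build for the CLI");
    assert!(status.success());

    // Tests read this at compile time via `env!("SPACETIME_CLI_BIN")`.
    // The binary name is assumed here.
    println!("cargo:rustc-env=SPACETIME_CLI_BIN={out_dir}/debug/spacetime");
}
```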

- name: Create /stdb dir
run: |
sudo mkdir /stdb
sudo chmod 777 /stdb
- name: Run cargo test
run: cargo test --all --features odb_rocksdb,odb_sled
run: cargo test --all --exclude test-client --features odb_rocksdb,odb_sled

lints:
name: Lints
@@ -52,7 +56,7 @@ jobs:
run: cargo fmt --all -- --check

- name: Run cargo clippy
run: cargo clippy --all --tests --features odb_rocksdb,odb_sled -- -D warnings
run: cargo clippy --all --exclude test-client --tests --features odb_rocksdb,odb_sled -- -D warnings

wasm_bindings:
name: Build and test wasm bindings
35 changes: 35 additions & 0 deletions Cargo.lock

Some generated files are not rendered by default.

3 changes: 3 additions & 0 deletions Cargo.toml
@@ -19,6 +19,8 @@ members = [
"modules/benchmarks",
"modules/spacetimedb-quickstart",
"modules/quickstart-chat",
"modules/sdk-test",
"crates/sdk/tests/test-client",
]
default-members = ["crates/cli"]
# cargo feature graph resolver. v2 is default in edition2021 but workspace
@@ -79,6 +81,7 @@ dirs = "5.0.1"
duct = "0.13.5"
email_address = "0.2.4"
enum-as-inner = "0.6"
env_logger = "0.10"
flate2 = "1.0.24"
fs2 = "0.4.3"
fs-err = "2.9.0"
10 changes: 3 additions & 7 deletions crates/cli/src/subcommands/generate/mod.rs
@@ -226,24 +226,20 @@ impl GenItem {
match self {
GenItem::Table(table) => {
let code = rust::autogen_rust_table(ctx, table);
let name = table.name.to_case(Case::Snake);
Some((name + ".rs", code))
Some((rust::rust_type_file_name(&table.name), code))
}
GenItem::TypeAlias(TypeAlias { name, ty }) => {
let filename = name.replace('.', "").to_case(Case::Snake);
let filename = filename + ".rs";
let code = match &ctx.typespace[*ty] {
AlgebraicType::Sum(sum) => rust::autogen_rust_sum(ctx, name, sum),
AlgebraicType::Product(prod) => rust::autogen_rust_tuple(ctx, name, prod),
_ => todo!(),
};
Some((filename, code))
Some((rust::rust_type_file_name(name), code))
}
GenItem::Reducer(reducer) if reducer.name == "__init__" => None,
GenItem::Reducer(reducer) => {
let code = rust::autogen_rust_reducer(ctx, reducer);
let name = reducer.name.to_case(Case::Snake);
Some((name + "_reducer.rs", code))
Some((rust::rust_reducer_file_name(&reducer.name), code))
}
}
}
65 changes: 48 additions & 17 deletions crates/cli/src/subcommands/generate/rust.rs
@@ -199,10 +199,15 @@ pub fn autogen_rust_sum(ctx: &GenCtx, name: &str, sum_type: &SumType) -> String

print_file_header(out);

// Pass this file into `gen_and_print_imports` to avoid recursively importing self
// for recursive types.
let file_name = name.to_case(Case::Snake);
let this_file = (file_name.as_str(), name);

// For some reason, deref coercion doesn't work on `&sum_type.variants` here - rustc
// wants to pass it as `&Vec<_>`, not `&[_]`. The slicing index `[..]` forces passing
// as a slice.
gen_and_print_imports(ctx, out, &sum_type.variants[..], generate_imports_variants);
gen_and_print_imports(ctx, out, &sum_type.variants[..], generate_imports_variants, this_file);

out.newline();

@@ -239,16 +244,6 @@ fn write_enum_variant(ctx: &GenCtx, out: &mut Indenter, variant: &SumTypeVariant
// ```
writeln!(out, ",").unwrap();
}
AlgebraicType::Product(ProductType { elements }) => {
// If the contained type is a non-empty product, i.e. this variant is
// struct-like, write it with braces and named fields.
write_struct_type_fields_in_braces(
ctx, out, elements,
// Do not `pub`-qualify fields because enum fields are always public, and
// rustc errors on the redundant `pub`.
false,
);
}
otherwise => {
// If the contained type is not a product, i.e. this variant has a single
// member, write it tuple-style, with parens.
@@ -352,6 +347,16 @@ pub fn autogen_rust_table(ctx: &GenCtx, table: &TableDef) -> String {
// - Complicated because `HashMap` is not `Hash`.
// - others?

pub fn rust_type_file_name(type_name: &str) -> String {
let filename = type_name.replace('.', "").to_case(Case::Snake);
filename + ".rs"
}

pub fn rust_reducer_file_name(type_name: &str) -> String {
let filename = type_name.replace('.', "").to_case(Case::Snake);
filename + "_reducer.rs"
}

const STRUCT_DERIVES: &[&str] = &["#[derive(Serialize, Deserialize, Clone, PartialEq, Debug)]"];

fn print_struct_derives(output: &mut Indenter) {
@@ -363,7 +368,15 @@ fn begin_rust_struct_def_shared(ctx: &GenCtx, out: &mut Indenter, name: &str, el

print_spacetimedb_imports(out);

gen_and_print_imports(ctx, out, elements, generate_imports_elements);
// Pass this file into `gen_and_print_imports` to avoid recursively importing self
// for recursive types.
//
// The file_name will be incorrect for reducer arg structs, but that doesn't matter
// because it's impossible for a reducer arg struct to be recursive.
let file_name = name.to_case(Case::Snake);
let this_file = (file_name.as_str(), name);

gen_and_print_imports(ctx, out, elements, generate_imports_elements, this_file);

out.newline();

@@ -1099,17 +1112,35 @@ fn generate_imports(ctx: &GenCtx, imports: &mut Imports, ty: &AlgebraicType) {
}
}

fn print_imports(out: &mut Indenter, imports: Imports) {
/// Print `use super::` imports for each of the `imports`, except `this_file`.
///
/// `this_file` is passed and excluded for the case of recursive types:
/// without it, the definition for a type like `struct Foo { foos: Vec<Foo> }`
/// would attempt to include `import super::foo::Foo`, which fails to compile.
fn print_imports(out: &mut Indenter, imports: Imports, this_file: (&str, &str)) {
for (module_name, type_name) in imports {
writeln!(out, "use super::{}::{};", module_name, type_name).unwrap();
if (module_name.as_str(), type_name.as_str()) != this_file {
writeln!(out, "use super::{}::{};", module_name, type_name).unwrap();
}
}
}

fn gen_and_print_imports<Roots, SearchFn>(ctx: &GenCtx, out: &mut Indenter, roots: Roots, search_fn: SearchFn)
where
/// Use `search_function` on `roots` to detect required imports, then print them with `print_imports`.
///
/// `this_file` is passed and excluded for the case of recursive types:
/// without it, the definition for a type like `struct Foo { foos: Vec<Foo> }`
/// would attempt to include `import super::foo::Foo`, which fails to compile.
fn gen_and_print_imports<Roots, SearchFn>(
ctx: &GenCtx,
out: &mut Indenter,
roots: Roots,
search_fn: SearchFn,
this_file: (&str, &str),
) where
SearchFn: FnOnce(&GenCtx, &mut Imports, Roots),
{
let mut imports = HashSet::new();
search_fn(ctx, &mut imports, roots);
print_imports(out, imports);

print_imports(out, imports, this_file);
}
4 changes: 4 additions & 0 deletions crates/sdk/Cargo.toml
@@ -27,7 +27,11 @@ tokio.workspace = true
tokio-tungstenite.workspace = true

[dev-dependencies]
# for quickstart-chat and cursive-chat examples
hex.workspace = true
# for cursive-chat example
cursive.workspace = true
futures-channel.workspace = true

# for tests
spacetimedb-testing = { path = "../testing" }
13 changes: 12 additions & 1 deletion crates/sdk/src/client_cache.rs
@@ -203,6 +203,12 @@ impl<T: TableType> TableCache<T> {
);

for (row_hash, row) in prev_subs.into_iter() {
log::trace!(
"Initalizing table {:?}: row previously resident: {:?} hash: {:?}",
T::TABLE_NAME,
row,
row_hash,
);
if diff.insert(row_hash, DiffEntry::Delete(row)).is_some() {
// This should be impossible, but just in case...
log::error!("Found duplicate row in existing `TableCache` for {:?}", T::TABLE_NAME);
@@ -238,7 +244,12 @@
);
}
Ok(row) => {
log::trace!("Initializing table {:?}: got new row {:?}", T::TABLE_NAME, row);
log::trace!(
"Initializing table {:?}: got new row {:?}. Hash: {:?}",
T::TABLE_NAME,
row,
row_pk
);
diff.insert(row_pk, DiffEntry::Insert(row));
}
},
12 changes: 12 additions & 0 deletions crates/sdk/tests/test-client/Cargo.toml
@@ -0,0 +1,12 @@
[package]
name = "test-client"
version = "0.1.0"
edition = "2021"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
spacetimedb-sdk = { path = "../.." }
tokio.workspace = true
anyhow.workspace = true
env_logger.workspace = true
1 change: 1 addition & 0 deletions crates/sdk/tests/test-client/src/.gitignore
@@ -0,0 +1 @@
module_bindings
(Diffs for the remaining changed files are not shown.)

1 comment on commit 32ac808

@github-actions

Benchmark for 32ac808

| Test | Base | PR | % |
| --- | --- | --- | --- |
| empty reducer | 27.8±1.14µs | 24.4±1.37µs | -12.23% |
| filter nonunique random/100 | 36.5±1.45µs | 37.0±1.71µs | +1.37% |
| filter nonunique random/1000 | 38.5±1.91µs | 35.6±3.11µs | -7.53% |
| filter nonunique random/10000 | 39.5±1.79µs | 38.7±2.84µs | -2.03% |
| filter nonunique sequential/100 | 35.5±2.36µs | 34.7±1.69µs | -2.25% |
| filter nonunique sequential/1000 | 39.6±3.68µs | 38.3±2.04µs | -3.28% |
| filter nonunique sequential/10000 | 38.1±3.69µs | 37.2±2.58µs | -2.36% |
| filter nonunique sequential/100000 | 39.7±3.17µs | 39.0±2.73µs | -1.76% |
| filter unique random/100 | 33.6±1.94µs | 33.1±1.92µs | -1.49% |
| filter unique random/1000 | 33.9±1.83µs | 32.8±1.36µs | -3.24% |
| filter unique random/10000 | 33.7±2.10µs | 33.1±2.69µs | -1.78% |
| filter unique sequential/100 | 34.2±1.48µs | 33.5±1.28µs | -2.05% |
| filter unique sequential/1000 | 34.5±2.50µs | 34.4±1.30µs | -0.29% |
| filter unique sequential/10000 | 35.1±1.85µs | 33.7±1.66µs | -3.99% |
| filter unique sequential/100000 | 35.3±2.48µs | 35.9±1.93µs | +1.70% |
| iterator/1_000_000 rows | 541.6±1.97ms | 543.9±2.86ms | +0.42% |
| large input | 1326.3±214.35µs | 1317.2±276.42µs | -0.69% |
| multi insert/10 | 67.4±7.04µs | 61.7±3.97µs | -8.46% |
| multi insert/100 | 308.4±5.41µs | 304.8±7.99µs | -1.17% |
| multi insert/1000 | 2.8±0.05ms | 2.8±0.03ms | 0.00% |
| multi insert/50 | 166.3±5.33µs | 167.3±11.04µs | +0.60% |
| multi insert/500 | 1472.0±10.70µs | 1422.5±13.81µs | -3.36% |
| multiple large arguments | 12.6±0.10ms | 12.3±0.09ms | -2.38% |
| single insert | 54.5±4.91µs | 52.8±4.33µs | -3.12% |
| with existing records/100000 | 52.1±4.63µs | 49.5±5.93µs | -4.99% |
| with existing records/1000000 | 61.3±3.90µs | 60.7±4.04µs | -0.98% |
| with existing records/200000 | 58.5±1.84µs | 52.5±4.52µs | -10.26% |
| with existing records/300000 | 59.4±5.17µs | 52.4±4.70µs | -11.78% |
| with existing records/400000 | 60.1±2.54µs | 58.3±6.45µs | -3.00% |
| with existing records/500000 | 56.6±4.45µs | 58.8±2.16µs | +3.89% |
| with existing records/600000 | 59.0±4.32µs | 56.4±3.59µs | -4.41% |
| with existing records/700000 | 59.1±3.46µs | 56.4±4.05µs | -4.57% |
| with existing records/800000 | 59.8±4.31µs | 58.1±4.21µs | -2.84% |
| with existing records/900000 | 61.0±2.92µs | 55.0±5.73µs | -9.84% |
