test_paginated_single_column_descending is flaky #540

Closed · ahl opened this issue Dec 21, 2021 · 16 comments · Fixed by #553
Labels
Test Flake (Tests that work. Wait, no. Actually yes. Hang on. Something is broken.) · Testing & Analysis (Tests & Analyzers)

Comments

@ahl (Contributor) commented Dec 21, 2021

See: https://buildomat.eng.oxide.computer/wg/0/details/01FQF2458RQ7G2VFKJRVBRRZK9/rdBWzehf1zRLzuNhI9iAokOFh33iGAWV7WqKZQNhv4eh5F4g/01FQF24K9BW4YFS4YKB2P2SMD2



3154 | 2021-12-21T18:24:03.885Z | ---- db::pagination::test::test_paginated_single_column_descending stdout ----
3155 | 2021-12-21T18:24:03.891Z | log file: "/tmp/omicron_nexus-af7cc18609452f5b-test_paginated_single_column_descending.3657.7.log"
3156 | 2021-12-21T18:24:03.896Z | note: configured to log to "/tmp/omicron_nexus-af7cc18609452f5b-test_paginated_single_column_descending.3657.7.log"
3157 | 2021-12-21T18:24:03.902Z | thread 'db::pagination::test::test_paginated_single_column_descending' panicked at 'called `Result::unwrap()` on an `Err` value: Exited', /work/oxidecomputer/omicron/test-utils/src/dev/mod.rs:119:42
3158 | 2021-12-21T18:24:03.908Z | note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
@smklein added the Test Flake and Testing & Analysis labels Dec 22, 2021
@davepacheco (Collaborator) commented:

FWIW, this seems to affect many tests (presumably all of them) that create a test database.

@davepacheco (Collaborator) commented:

We're hitting this unwrap() in setup_database():

118     info!(&log, "cockroach command line: {}", starter.cmdline());
119     let database = starter.start().await.unwrap();

We're unexpectedly getting CockroachStartError::Exited while trying to start CockroachDB. That variant is only used here:

CockroachStartError::Exited,

At this point, we've forked cockroach and we're waiting for it to write out its listen-url file. But instead, it exits.
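
To make the failure mode concrete, here's a minimal sketch (hypothetical code, not the actual omicron implementation) of the fork-and-wait pattern being described: poll for the listen-url file, and if the child exits before writing it, surface a distinct Exited error.

```rust
// Hypothetical sketch of the start-and-wait pattern: after spawning
// `cockroach`, poll for the listen-url file it writes once it's ready.
// If the child process exits first, report that as a distinct error,
// analogous to `CockroachStartError::Exited`.
use std::{path::Path, time::Duration};
use tokio::process::Child;

#[derive(Debug)]
enum StartError {
    Exited,   // cockroach died before writing its listen-url file
    TimedOut, // gave up waiting
    Io(std::io::Error),
}

async fn wait_for_listen_url(
    child: &mut Child,
    listen_url_file: &Path,
) -> Result<String, StartError> {
    for _ in 0..300 {
        // Has cockroach written its URL yet?
        if let Ok(url) = tokio::fs::read_to_string(listen_url_file).await {
            if !url.trim().is_empty() {
                return Ok(url.trim().to_string());
            }
        }
        // Or has the child exited instead? This is the failure mode seen
        // in this issue: an Err(Exited) unwrapped at dev/mod.rs:119.
        if child.try_wait().map_err(StartError::Io)?.is_some() {
            return Err(StartError::Exited);
        }
        tokio::time::sleep(Duration::from_millis(100)).await;
    }
    Err(StartError::TimedOut)
}
```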

I think this may be a slightly different failure mode than we saw last week but I'm not positive.

@davepacheco (Collaborator) commented:

Of course, we'd like to know what CockroachDB emitted to stdout and stderr when this happened. Those outputs are redirected to files in the temporary directory. While we go out of our way to preserve the temporary directory when we time out waiting for CockroachDB, we don't do that when CockroachDB simply exits. I'm fixing this in #545.
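
For context, the preservation trick amounts to leaking the temporary directory instead of deleting it on drop. A minimal sketch using the tempfile crate (the same into_path() approach appears in a diff later in this thread):

```rust
// Sketch: keep the database temporary directory (and the captured
// cockroachdb_stdout / cockroachdb_stderr files inside it) around for
// post-mortem debugging. `TempDir` deletes its directory on drop;
// `into_path()` consumes it *without* deleting, so the files survive.
use std::path::PathBuf;
use tempfile::TempDir;

fn preserve_on_failure(temp_dir: TempDir) -> PathBuf {
    let path = temp_dir.into_path();
    eprintln!("preserving database temp dir for debugging: {}", path.display());
    path
}
```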

@davepacheco (Collaborator) commented:

With #545 (really, from commit f6b7da2 in that PR), I attempted to repro this using:

$ for (( i = 0; ; i++ )) { echo "attempt: $(( i + 1 ))"; cargo test -p omicron-nexus || break; }
...
failures:

---- integration_tests::instances::test_instances_delete_fails_when_running_succeeds_when_stopped stdout ----
log file: "/dangerzone/omicron_tmp/test_all-b54c2bba3f34f365-test_instances_delete_fails_when_running_succeeds_when_stopped.18213.18.log"
note: configured to log to "/dangerzone/omicron_tmp/test_all-b54c2bba3f34f365-test_instances_delete_fails_when_running_succeeds_when_stopped.18213.18.log"
thread 'integration_tests::instances::test_instances_delete_fails_when_running_succeeds_when_stopped' panicked at 'called `Result::unwrap()` on an `Err` value: Exited', /home/dap/omicron/test-utils/src/dev/mod.rs:119:42
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace


failures:
    integration_tests::instances::test_instances_delete_fails_when_running_succeeds_when_stopped

test result: FAILED. 39 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out; finished in 8.84s

error: test failed, to rerun pass '-p omicron-nexus --test test_all'

It took 54 attempts. I wasn't tracking time, but it was less than an hour.

$ echo $i
54

If we follow the log file:

$ bunyan /dangerzone/omicron_tmp/test_all-b54c2bba3f34f365-test_instances_delete_fails_when_running_succeeds_when_stopped.18213.18.log
[2021-12-22T10:49:30.022977969-08:00]  INFO: test_instances_delete_fails_when_running_succeeds_when_stopped/18213 on ivanova: cockroach temporary directory: /dangerzone/omicron_tmp/.tmp1nmMP7
[2021-12-22T10:49:30.023336787-08:00]  INFO: test_instances_delete_fails_when_running_succeeds_when_stopped/18213 on ivanova: cockroach: copying from seed directory
[2021-12-22T10:49:30.040381468-08:00]  INFO: test_instances_delete_fails_when_running_succeeds_when_stopped/18213 on ivanova: cockroach command line: cockroach start-single-node --insecure --http-addr=:0 --store /dangerzone/omicron_tmp/.tmp1nmMP7/data --listen-addr 127.0.0.1:0 --listening-url-file /dangerzone/omicron_tmp/.tmp1nmMP7/listen-url

it points us to the temporary directory:

$ ls -l /dangerzone/omicron_tmp/.tmp1nmMP7/
total 10
-rw-r--r--   1 dap      staff       8976 Dec 22 10:49 cockroachdb_stderr
-rw-r--r--   1 dap      staff          0 Dec 22 10:49 cockroachdb_stdout
drwxr-xr-x   4 dap      staff         18 Dec 22 10:49 data/

Finally we have some stderr!

$ cat /dangerzone/omicron_tmp/.tmp1nmMP7/cockroachdb_stderr 
runtime: marked free object in span 0xfffffc7feea2b2a0, elemsize=208 freeindex=1 (bad use of unsafe.Pointer? try -d=checkptr)
0xc000c16000 alloc marked  
0xc000c160d0 free  unmarked
0xc000c161a0 free  unmarked
0xc000c16270 free  unmarked
0xc000c16340 alloc unmarked
0xc000c16410 free  unmarked
0xc000c164e0 free  unmarked
0xc000c165b0 free  marked   zombie
000000c000c165b0:  0000000000000000  0000000000000000 
000000c000c165c0:  0000000000000000  0000000000000000 
000000c000c165d0:  0000000000000000  0000000000000000 
000000c000c165e0:  0000000000000000  0000000000000000 
000000c000c165f0:  0000000000000000  0000000000000000 
000000c000c16600:  0000000000000000  0000000000000000 
000000c000c16610:  0000000000000000  0000000000000000 
000000c000c16620:  0000000000000000  0000000000000000 
000000c000c16630:  0000000000000000  0000000000000000 
000000c000c16640:  0000000000000000  0000000000000000 
000000c000c16650:  0000000000000000  0000000000000000 
000000c000c16660:  0000000000000000  0000000000000000 
000000c000c16670:  0000000000000000  0000000000000000 
0xc000c16680 alloc marked  
0xc000c16750 alloc marked  
0xc000c16820 alloc unmarked
0xc000c168f0 free  unmarked
0xc000c169c0 alloc marked  
0xc000c16a90 free  marked   zombie
000000c000c16a90:  0000000000000000  0000000000000000 
000000c000c16aa0:  0000000000000000  0000000000000000 
000000c000c16ab0:  0000000000000000  0000000000000000 
000000c000c16ac0:  0000000000000000  0000000000000000 
000000c000c16ad0:  0000000000000000  0000000000000000 
000000c000c16ae0:  0000000000000000  0000000000000000 
000000c000c16af0:  0000000000000000  0000000000000000 
000000c000c16b00:  0000000000000000  0000000000000000 
000000c000c16b10:  0000000000000000  0000000000000000 
000000c000c16b20:  0000000000000000  0000000000000000 
000000c000c16b30:  0000000000000000  0000000000000000 
000000c000c16b40:  0000000000000000  0000000000000000 
000000c000c16b50:  0000000000000000  0000000000000000 
0xc000c16b60 free  unmarked
0xc000c16c30 free  marked   zombie
000000c000c16c30:  0000000000000000  0000000000000000 
000000c000c16c40:  0000000000000000  0000000000000000 
000000c000c16c50:  0000000000000000  0000000000000000 
000000c000c16c60:  0000000000000000  0000000000000000 
000000c000c16c70:  0000000000000000  0000000000000000 
000000c000c16c80:  0000000000000000  0000000000000000 
000000c000c16c90:  0000000000000000  0000000000000000 
000000c000c16ca0:  0000000000000000  0000000000000000 
000000c000c16cb0:  0000000000000000  0000000000000000 
000000c000c16cc0:  0000000000000000  0000000000000000 
000000c000c16cd0:  0000000000000000  0000000000000000 
000000c000c16ce0:  0000000000000000  0000000000000000 
000000c000c16cf0:  0000000000000000  0000000000000000 
0xc000c16d00 free  unmarked
0xc000c16dd0 free  unmarked
0xc000c16ea0 free  marked   zombie
000000c000c16ea0:  0000000000000000  0000000000000000 
000000c000c16eb0:  0000000000000000  0000000000000000 
000000c000c16ec0:  0000000000000000  0000000000000000 
000000c000c16ed0:  0000000000000000  0000000000000000 
000000c000c16ee0:  0000000000000000  0000000000000000 
000000c000c16ef0:  0000000000000000  0000000000000000 
000000c000c16f00:  0000000000000000  0000000000000000 
000000c000c16f10:  0000000000000000  0000000000000000 
000000c000c16f20:  0000000000000000  0000000000000000 
000000c000c16f30:  0000000000000000  0000000000000000 
000000c000c16f40:  0000000000000000  0000000000000000 
000000c000c16f50:  0000000000000000  0000000000000000 
000000c000c16f60:  0000000000000000  0000000000000000 
0xc000c16f70 alloc marked  
0xc000c17040 free  marked   zombie
000000c000c17040:  0000000000000000  0000000000000000 
000000c000c17050:  0000000000000000  0000000000000000 
000000c000c17060:  0000000000000000  0000000000000000 
000000c000c17070:  0000000000000000  0000000000000000 
000000c000c17080:  0000000000000000  0000000000000000 
000000c000c17090:  0000000000000000  0000000000000000 
000000c000c170a0:  0000000000000000  0000000000000000 
000000c000c170b0:  0000000000000000  0000000000000000 
000000c000c170c0:  0000000000000000  0000000000000000 
000000c000c170d0:  0000000000000000  0000000000000000 
000000c000c170e0:  0000000000000000  0000000000000000 
000000c000c170f0:  0000000000000000  0000000000000000 
000000c000c17100:  0000000000000000  0000000000000000 
0xc000c17110 alloc unmarked
0xc000c171e0 free  marked   zombie
000000c000c171e0:  0000000000000000  0000000000000000 
000000c000c171f0:  0000000000000000  0000000000000000 
000000c000c17200:  0000000000000000  0000000000000000 
000000c000c17210:  0000000000000000  0000000000000000 
000000c000c17220:  0000000000000000  0000000000000000 
000000c000c17230:  0000000000000000  0000000000000000 
000000c000c17240:  0000000000000000  0000000000000000 
000000c000c17250:  0000000000000000  0000000000000000 
000000c000c17260:  0000000000000000  0000000000000000 
000000c000c17270:  0000000000000000  0000000000000000 
000000c000c17280:  0000000000000000  0000000000000000 
000000c000c17290:  0000000000000000  0000000000000000 
000000c000c172a0:  0000000000000000  0000000000000000 
0xc000c172b0 alloc unmarked
0xc000c17380 free  unmarked
0xc000c17450 free  unmarked
0xc000c17520 free  unmarked
0xc000c175f0 free  unmarked
0xc000c176c0 free  unmarked
0xc000c17790 free  unmarked
0xc000c17860 free  unmarked
0xc000c17930 free  unmarked
0xc000c17a00 free  unmarked
0xc000c17ad0 free  unmarked
0xc000c17ba0 free  unmarked
0xc000c17c70 free  unmarked
0xc000c17d40 free  unmarked
0xc000c17e10 free  unmarked
0xc000c17ee0 alloc marked  
fatal error: found pointer to free object

runtime stack:
runtime.throw(0x517963a, 0x1c)
	/opt/ooce/go-1.16/src/runtime/panic.go:1117 +0x72
runtime.(*mspan).reportZombies(0xfffffc7feea2b2a0)
	/opt/ooce/go-1.16/src/runtime/mgcsweep.go:614 +0x376
runtime.(*mspan).sweep(0xfffffc7feea2b2a0, 0x0, 0x0)
	/opt/ooce/go-1.16/src/runtime/mgcsweep.go:442 +0x8dc
runtime.(*mcentral).uncacheSpan(0x8d734b8, 0xfffffc7feea2b2a0)
	/opt/ooce/go-1.16/src/runtime/mcentral.go:214 +0xcb
runtime.(*mcache).releaseAll(0xfffffc7fef03ab38)
	/opt/ooce/go-1.16/src/runtime/mcache.go:276 +0x136
runtime.(*mcache).prepareForSweep(0xfffffc7fef03ab38)
	/opt/ooce/go-1.16/src/runtime/mcache.go:310 +0x46
runtime.gcMarkTermination.func4.1(0xc00009e800)
	/opt/ooce/go-1.16/src/runtime/mgc.go:1766 +0x2f
runtime.forEachP(0x57b1170)
	/opt/ooce/go-1.16/src/runtime/proc.go:1499 +0x11b
runtime.gcMarkTermination.func4()
	/opt/ooce/go-1.16/src/runtime/mgc.go:1765 +0x2d
runtime.systemstack(0xfffffc7fe4610000)
	/opt/ooce/go-1.16/src/runtime/asm_amd64.s:379 +0x73
runtime.mstart()
	/opt/ooce/go-1.16/src/runtime/proc.go:1246

goroutine 11 [running]:
runtime.systemstack_switch()
	/opt/ooce/go-1.16/src/runtime/asm_amd64.s:339 fp=0xc00010cd50 sp=0xc00010cd48 pc=0xf50f80
runtime.gcMarkTermination(0x3fed9bbad1c05669)
	/opt/ooce/go-1.16/src/runtime/mgc.go:1764 +0x41f fp=0xc00010cf08 sp=0xc00010cd50 pc=0xefcd1f
runtime.gcMarkDone()
	/opt/ooce/go-1.16/src/runtime/mgc.go:1623 +0x275 fp=0xc00010cf60 sp=0xc00010cf08 pc=0xefc835
runtime.gcBgMarkWorker()
	/opt/ooce/go-1.16/src/runtime/mgc.go:2045 +0x2ab fp=0xc00010cfe0 sp=0xc00010cf60 pc=0xefd94b
runtime.goexit()
	/opt/ooce/go-1.16/src/runtime/asm_amd64.s:1371 +0x1 fp=0xc00010cfe8 sp=0xc00010cfe0 pc=0xf52e61
created by runtime.gcBgMarkStartWorkers
	/opt/ooce/go-1.16/src/runtime/mgc.go:1835 +0x37

goroutine 1 [runnable, locked to thread]:
github.com/aws/aws-sdk-go/aws/endpoints.init()
	/ws/gc/cockroach/cache/gopath/src/github.com/cockroachdb/cockroach/vendor/github.com/aws/aws-sdk-go/aws/endpoints/defaults.go:173 +0x67fac

goroutine 18 [chan receive]:
github.com/cockroachdb/cockroach/pkg/util/log.flushDaemon()
	/ws/gc/cockroach/cache/gopath/src/github.com/cockroachdb/cockroach/pkg/util/log/log_flush.go:75 +0x74
created by github.com/cockroachdb/cockroach/pkg/util/log.init.6
	/ws/gc/cockroach/cache/gopath/src/github.com/cockroachdb/cockroach/pkg/util/log/log_flush.go:41 +0x35

goroutine 19 [chan receive]:
github.com/cockroachdb/cockroach/pkg/util/log.signalFlusher()
	/ws/gc/cockroach/cache/gopath/src/github.com/cockroachdb/cockroach/pkg/util/log/log_flush.go:98 +0x12c
created by github.com/cockroachdb/cockroach/pkg/util/log.init.6
	/ws/gc/cockroach/cache/gopath/src/github.com/cockroachdb/cockroach/pkg/util/log/log_flush.go:42 +0x4d

goroutine 66 [syscall]:
os/signal.signal_recv(0x0)
	/opt/ooce/go-1.16/src/runtime/sigqueue.go:168 +0xa5
os/signal.loop()
	/opt/ooce/go-1.16/src/os/signal/signal_unix.go:23 +0x25
created by os/signal.Notify.func1.1
	/opt/ooce/go-1.16/src/os/signal/signal.go:151 +0x45

goroutine 29 [chan receive]:
github.com/cockroachdb/cockroach/pkg/util/goschedstats.init.0.func1()
	/ws/gc/cockroach/cache/gopath/src/github.com/cockroachdb/cockroach/pkg/util/goschedstats/runnable.go:91 +0x128
created by github.com/cockroachdb/cockroach/pkg/util/goschedstats.init.0
	/ws/gc/cockroach/cache/gopath/src/github.com/cockroachdb/cockroach/pkg/util/goschedstats/runnable.go:79 +0x35

Yowza. I wonder if CockroachDB is finding something invalid about its on-disk state -- if there was something not quite safe about #493.

@smklein (Collaborator) commented Dec 22, 2021

Update after a day of debugging; here's my theory:

  1. Somehow, constructing the seed directory leaves a temporary directory (/tmp/crdb-base/cockroach-tempXXX/) behind.
  2. The ABSOLUTE PATH to this directory is recorded in temp-dirs-record.txt.
  3. /tmp/crdb-base is copied elsewhere during a cargo test invocation.
  4. When cockroach is started from the "new" location, that copy still contains temp-dirs-record.txt, which points at the temporary directory inside crdb-base. CockroachDB attempts to clean up that temporary directory...
  5. ... and this is where bugs manifest. If someone else tries copying from crdb-base at the same time, they might read_dir something that is being concurrently deleted. Alternatively, they may process that temporary directory concurrently, which could plausibly result in corruption. (See the mitigation sketch after the manual test below.)

This can be tested manually with:

$ cargo build -p omicron-nexus --tests
$ ls -d $(cat /tmp/crdb-base/temp-dirs-record.txt)  # This will exist
$ cargo test -p omicron-nexus db::pagination
$ ls -d $(cat /tmp/crdb-base/temp-dirs-record.txt)  # This will no longer exist
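
If this theory holds, one plausible mitigation (a hypothetical sketch; not necessarily what the eventual fix does) is to scrub the stale temp-dirs-record.txt from the seed so that copies of it never point back into crdb-base:

```rust
// Hypothetical mitigation sketch: remove temp-dirs-record.txt from the
// seed directory so that a cockroach started from a *copy* of the seed
// never tries to clean up a temporary directory inside /tmp/crdb-base
// itself. A missing record file is treated as success.
use std::io::ErrorKind;
use std::path::Path;

fn scrub_seed_dir(seed: &Path) -> std::io::Result<()> {
    match std::fs::remove_file(seed.join("temp-dirs-record.txt")) {
        Ok(()) => Ok(()),
        Err(e) if e.kind() == ErrorKind::NotFound => Ok(()),
        Err(e) => Err(e),
    }
}
```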

@smklein closed this as completed Dec 22, 2021
@smklein reopened this Dec 22, 2021
@smklein (Collaborator) commented Dec 22, 2021

(apologies for fat-fingering tab+enter, this issue is definitely not closed yet)

@smklein (Collaborator) commented Dec 22, 2021

I've found sources from CRDB devs indicating that:

Data directories should be movable:

    "moving data should work as you describe. A data directory isn’t tied to its location or the underlying file system (well, shouldn’t be and we’ve never seen any bugs with that regard in our testing)."

Absolute paths in CRDB probably shouldn't exist:

    "(some issue) is likely because we're using an absolute path somewhere. This shouldn't happen as this makes it impossible to move the data directories."

@davepacheco (Collaborator) commented:

My latest understanding is that @smklein has identified what looks like a real bug (apparently in CockroachDB, since they say that what we're doing should basically work). However, I don't see how it explains all of the behavior we've seen, including the output I pasted above.

Here's a bunch more data. I still don't fully understand what's happening.

The output I pasted above suggested to me that cockroach was crashing upon finding some corrupted local state. I can't remember how I got here, but I wanted to know if we were writing to the seed copy during the test suite. We would expect not, since we copy it for all new database instances.

I ran this to watch for writes to the seed directory:

pfexec dtrace -w -q -n 'fbt::zfs_write:entry/strstr(stringof(args[0]->v_path), "/dangerzone/omicron_tmp/crdb-base") != NULL/{ stack(); ustack(); printf("BOOM pid %6d thread %d\n", pid, tid); stop(); exit(0); }'; echo $'\a' $'\a' $'\a' $'\a'

and I ran cargo test in a loop. Eventually, the D script output:


              genunix`fop_write+0x60
              genunix`write+0x2c6
              unix`sys_syscall+0x17d

              libc.so.1`__write+0xa
              cockroach`runtime.asmsysvicall6+0x5a
              cockroach`syscall.write+0x8b
              cockroach`internal/poll.(*FD).Write+0x13a
              cockroach`os.(*File).Write+0x8e
              cockroach`github.com/cockroachdb/cockroach/pkg/util/log.(*fileSink).initializeNewOutputFile+0x1ff
              cockroach`github.com/cockroachdb/cockroach/pkg/util/log.(*syncBuffer).rotateFileLocked+0x1d8
              cockroach`github.com/cockroachdb/cockroach/pkg/util/log.(*fileSink).createFileLocked+0xbb
              cockroach`github.com/cockroachdb/cockroach/pkg/util/log.(*fileSink).output+0x19c
              cockroach`github.com/cockroachdb/cockroach/pkg/util/log.(*loggerT).outputLogEntry+0x434
              cockroach`github.com/cockroachdb/cockroach/pkg/util/log.ApplyConfig+0x4c5
              cockroach`github.com/cockroachdb/cockroach/pkg/cli.setupLogging+0x4a5
              cockroach`github.com/cockroachdb/cockroach/pkg/cli.setupAndInitializeLoggingAndProfiling+0x6e
              cockroach`github.com/cockroachdb/cockroach/pkg/cli.runStart+0x24f
              cockroach`github.com/cockroachdb/cockroach/pkg/cli.runStartSingleNode+0xf3
              cockroach`github.com/cockroachdb/cockroach/pkg/cli.MaybeDecorateGRPCError.func1+0x79
              cockroach`github.com/cockroachdb/cockroach/pkg/cli.maybeShoutError.func1+0x51
              cockroach`github.com/spf13/cobra.(*Command).execute+0x472
              cockroach`github.com/spf13/cobra.(*Command).ExecuteC+0x30b
              cockroach`github.com/cockroachdb/cockroach/pkg/cli.doMain+0x11a
BOOM pid  25047 thread 1

around the exact time that a bunch of tests failed:

attempt: 76
    Finished test [unoptimized + debuginfo] target(s) in 0.29s
     Running unittests (target/debug/deps/omicron_nexus-469bcf46f8ee518e)

running 52 tests
test authn::external::cookies::test::test_parse_cookies_empty_headers ... ok
test authn::external::session_cookie::test::test_get_token_no_header ... ok
test authn::external::cookies::test::test_parse_cookies_two_cookie_headers_same_name ... ok
test authn::external::cookies::test::test_parse_cookies_two_cookie_headers ... ok
test authn::external::cookies::test::test_parse_cookies_two_cookies ... ok
test authn::external::session_cookie::test::test_get_token_other_cookie_present ... ok
test authn::external::session_cookie::test::test_get_token ... ok
test authn::external::cookies::test::test_parse_cookies_ignore_other_headers ... ok
test authn::external::cookies::test::test_parse_cookies_one_cookie ... ok
test authn::external::session_cookie::test::test_session_cookie_value ... ok
test authn::external::spoof::test::test_spoof_header_missing ... ok
test authn::external::spoof::test::test_spoof_header_valid ... ok
test authn::external::spoof::test::test_spoof_reserved_values ... ok
test authn::test::test_internal_users ... ok
test authn::external::spoof::test::test_spoof_header_bad_uuids ... ok
test authn::external::session_cookie::test::test_valid_cookie ... ok
test authn::external::session_cookie::test::test_missing_cookie ... ok
test authn::external::session_cookie::test::test_other_cookie ... ok
test authn::external::session_cookie::test::test_garbage_cookie ... ok
test authn::external::session_cookie::test::test_expired_cookie_idle ... ok
test authn::external::session_cookie::test::test_expired_cookie_absolute ... ok
test config::test::test_config_nonexistent ... ok
test authn::external::test::test_authn_sequence ... ok
test config::test::test_config_empty ... ok
test config::test::test_config_bad_toml ... ok
test db::fixed_data::test::test_builtin_fleet_id_is_valid ... ok
test db::fixed_data::user_builtin::test::test_builtin_user_ids_are_valid ... ok
test db::collection_insert::test::test_verify_query ... ok
test config::test::test_bad_authn_schemes ... ok
test config::test::test_valid ... ok
test db::subnet_allocation::test::test_verify_query ... ok
test external_api::console_api::test::test_find_file_404_on_disallowed_ext ... ok
test external_api::console_api::test::test_find_file_404_on_directory ... ok
test external_api::console_api::test::test_find_file_404_on_nonexistent ... ok
test external_api::console_api::test::test_find_file_404_on_symlink ... ok
test external_api::console_api::test::test_find_file_404_on_nonexistent_nested ... ok
test external_api::console_api::test::test_find_file_finds_file ... ok
test external_api::console_api::test::test_find_file_wont_follow_symlink ... ok
test db::collection_insert::test::test_collection_not_present ... FAILED
test context::test::test_test_context ... FAILED
test db::datastore::test::test_project_creation ... FAILED
test authz::context::test::test_database ... FAILED
test db::pagination::test::test_paginated_single_column_descending ... FAILED
test db::saga_recovery::test::test_failure_during_saga_can_be_recovered ... FAILED
test authz::context::test::test_organization ... FAILED
test db::collection_insert::test::test_collection_present ... FAILED
test db::pagination::test::test_paginated_multicolumn_ascending ... FAILED
test db::pagination::test::test_paginated_single_column_ascending ... FAILED
test context::test::test_background_context ... FAILED
test db::saga_recovery::test::test_successful_saga_does_not_replay_during_recovery ... FAILED
test db::pagination::test::test_paginated_multicolumn_descending ... FAILED
test db::datastore::test::test_session_methods ... FAILED

failures:

---- db::collection_insert::test::test_collection_not_present stdout ----
log file: "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_collection_not_present.25051.3.log"
note: configured to log to "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_collection_not_present.25051.3.log"
thread 'db::collection_insert::test::test_collection_not_present' panicked at 'called `Result::unwrap()` on an `Err` value: Query(DatabaseError(Unknown, "database \"omicron\" does not exist"))', nexus/src/db/collection_insert.rs:546:14
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
WARN: dropped CockroachInstance without cleaning it up first (there may still be a child process running and a temporary directory leaked)

---- context::test::test_test_context stdout ----
log file: "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_background_context.25051.2.log"
note: configured to log to "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_background_context.25051.2.log"
thread 'context::test::test_test_context' panicked at 'called `Result::unwrap()` on an `Err` value: InternalError { internal_message: "unexpected database error: relation \"user_builtin\" does not exist" }', nexus/src/db/datastore.rs:2336:48
WARN: dropped CockroachInstance without cleaning it up first (there may still be a child process running and a temporary directory leaked)

---- db::datastore::test::test_project_creation stdout ----
log file: "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_project_creation.25051.6.log"
note: configured to log to "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_project_creation.25051.6.log"
thread 'db::datastore::test::test_project_creation' panicked at 'called `Result::unwrap()` on an `Err` value: InternalError { internal_message: "unexpected database error: relation \"user_builtin\" does not exist" }', nexus/src/db/datastore.rs:2336:48
WARN: dropped CockroachInstance without cleaning it up first (there may still be a child process running and a temporary directory leaked)

---- authz::context::test::test_database stdout ----
log file: "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_database.25051.0.log"
note: configured to log to "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_database.25051.0.log"
thread 'authz::context::test::test_database' panicked at 'called `Result::unwrap()` on an `Err` value: InternalError { internal_message: "unexpected database error: relation \"user_builtin\" does not exist" }', nexus/src/db/datastore.rs:2336:48
WARN: dropped CockroachInstance without cleaning it up first (there may still be a child process running and a temporary directory leaked)

---- db::pagination::test::test_paginated_single_column_descending stdout ----
log file: "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_paginated_single_column_descending.25051.8.log"
note: configured to log to "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_paginated_single_column_descending.25051.8.log"
thread 'db::pagination::test::test_paginated_single_column_descending' panicked at 'called `Result::unwrap()` on an `Err` value: Query(DatabaseError(Unknown, "no database or schema specified"))', nexus/src/db/pagination.rs:214:14
WARN: dropped CockroachInstance without cleaning it up first (there may still be a child process running and a temporary directory leaked)

---- db::saga_recovery::test::test_failure_during_saga_can_be_recovered stdout ----
log file: "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_failure_during_saga_can_be_recovered.25051.9.log"
note: configured to log to "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_failure_during_saga_can_be_recovered.25051.9.log"
thread 'db::saga_recovery::test::test_failure_during_saga_can_be_recovered' panicked at 'called `Result::unwrap()` on an `Err` value: creating saga record

Caused by:
    0: creating saga record
    1: Internal Error: database error (kind = Unknown): relation "saga" does not exist
       ', nexus/src/db/saga_recovery.rs:503:14
WARN: dropped CockroachInstance without cleaning it up first (there may still be a child process running and a temporary directory leaked)

---- authz::context::test::test_organization stdout ----
log file: "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_database.25051.1.log"
note: configured to log to "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_database.25051.1.log"
thread 'authz::context::test::test_organization' panicked at 'called `Result::unwrap()` on an `Err` value: InternalError { internal_message: "unexpected database error: relation \"user_builtin\" does not exist" }', nexus/src/db/datastore.rs:2336:48
WARN: dropped CockroachInstance without cleaning it up first (there may still be a child process running and a temporary directory leaked)

---- db::collection_insert::test::test_collection_present stdout ----
log file: "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_collection_present.25051.12.log"
note: configured to log to "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_collection_present.25051.12.log"
thread 'db::collection_insert::test::test_collection_present' panicked at 'called `Result::unwrap()` on an `Err` value: Query(DatabaseError(Unknown, "database \"omicron\" does not exist"))', nexus/src/db/collection_insert.rs:546:14
WARN: dropped CockroachInstance without cleaning it up first (there may still be a child process running and a temporary directory leaked)

---- db::pagination::test::test_paginated_multicolumn_ascending stdout ----
log file: "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_paginated_multicolumn_ascending.25051.5.log"
note: configured to log to "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_paginated_multicolumn_ascending.25051.5.log"
thread 'db::pagination::test::test_paginated_multicolumn_ascending' panicked at 'called `Result::unwrap()` on an `Err` value: Query(DatabaseError(Unknown, "no database or schema specified"))', nexus/src/db/pagination.rs:214:14
WARN: dropped CockroachInstance without cleaning it up first (there may still be a child process running and a temporary directory leaked)

---- db::pagination::test::test_paginated_single_column_ascending stdout ----
log file: "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_paginated_single_column_ascending.25051.10.log"
note: configured to log to "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_paginated_single_column_ascending.25051.10.log"
thread 'db::pagination::test::test_paginated_single_column_ascending' panicked at 'called `Result::unwrap()` on an `Err` value: Query(DatabaseError(Unknown, "no database or schema specified"))', nexus/src/db/pagination.rs:214:14
WARN: dropped CockroachInstance without cleaning it up first (there may still be a child process running and a temporary directory leaked)

---- context::test::test_background_context stdout ----
log file: "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_background_context.25051.13.log"
note: configured to log to "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_background_context.25051.13.log"
thread 'context::test::test_background_context' panicked at 'called `Result::unwrap()` on an `Err` value: InternalError { internal_message: "unexpected database error: relation \"user_builtin\" does not exist" }', nexus/src/db/datastore.rs:2336:48
WARN: dropped CockroachInstance without cleaning it up first (there may still be a child process running and a temporary directory leaked)

---- db::saga_recovery::test::test_successful_saga_does_not_replay_during_recovery stdout ----
log file: "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_successful_saga_does_not_replay_during_recovery.25051.11.log"
note: configured to log to "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_successful_saga_does_not_replay_during_recovery.25051.11.log"
thread 'db::saga_recovery::test::test_successful_saga_does_not_replay_during_recovery' panicked at 'called `Result::unwrap()` on an `Err` value: creating saga record

Caused by:
    0: creating saga record
    1: Internal Error: database error (kind = Unknown): relation "saga" does not exist
       ', nexus/src/db/saga_recovery.rs:570:14
WARN: dropped CockroachInstance without cleaning it up first (there may still be a child process running and a temporary directory leaked)

---- db::pagination::test::test_paginated_multicolumn_descending stdout ----
log file: "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_paginated_multicolumn_descending.25051.7.log"
note: configured to log to "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_paginated_multicolumn_descending.25051.7.log"
thread 'db::pagination::test::test_paginated_multicolumn_descending' panicked at 'called `Result::unwrap()` on an `Err` value: Query(DatabaseError(Unknown, "no database or schema specified"))', nexus/src/db/pagination.rs:214:14
WARN: dropped CockroachInstance without cleaning it up first (there may still be a child process running and a temporary directory leaked)

---- db::datastore::test::test_session_methods stdout ----
log file: "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_session_methods.25051.4.log"
note: configured to log to "/dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_session_methods.25051.4.log"
thread 'db::datastore::test::test_session_methods' panicked at 'called `Result::unwrap()` on an `Err` value: InternalError { internal_message: "unexpected database error: relation \"user_builtin\" does not exist" }', nexus/src/db/datastore.rs:2336:48
WARN: dropped CockroachInstance without cleaning it up first (there may still be a child process running and a temporary directory leaked)


failures:
    authz::context::test::test_database
    authz::context::test::test_organization
    context::test::test_background_context
    context::test::test_test_context
    db::collection_insert::test::test_collection_not_present
    db::collection_insert::test::test_collection_present
    db::datastore::test::test_project_creation
    db::datastore::test::test_session_methods
    db::pagination::test::test_paginated_multicolumn_ascending
    db::pagination::test::test_paginated_multicolumn_descending
    db::pagination::test::test_paginated_single_column_ascending
    db::pagination::test::test_paginated_single_column_descending
    db::saga_recovery::test::test_failure_during_saga_can_be_recovered
    db::saga_recovery::test::test_successful_saga_does_not_replay_during_recovery

test result: FAILED. 38 passed; 14 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.89s

error: test failed, to rerun pass '-p omicron-nexus --lib'

They failed in strange ways, all seeming to reflect broken local database state. That could make sense, if something had mucked with the seed copy while they were running. What was it that did that? From the DTrace output above, it was cockroach itself! What else do we know about that cockroach process?

dap@ivanova omicron $ ptree 25047
25047  cockroach start-single-node --insecure --http-addr=:0 --store /dangerzone/omicr

dap@ivanova omicron $ pargs 25047
25047:  cockroach start-single-node --insecure --http-addr=:0 --store /dangerzone/omicr
argv[0]: cockroach
argv[1]: start-single-node
argv[2]: --insecure
argv[3]: --http-addr=:0
argv[4]: --store
argv[5]: /dangerzone/omicron_tmp/crdb-base
argv[6]: --listen-addr
argv[7]: 127.0.0.1:0
argv[8]: --listening-url-file
argv[9]: /dangerzone/omicron_tmp/.tmpfxlogZ/listen-url
dap@ivanova omicron $ ps -ostime,etime -p 25047
   STIME     ELAPSED
14:58:43       08:33
$ pfiles 25047
25047:	cockroach start-single-node --insecure --http-addr=:0 --store /dangerz
  Current rlimit: 65536 file descriptors
   0: S_IFCHR mode:0666 dev:536,0 ino:68157448 uid:0 gid:3 rdev:130,2
      O_RDONLY|O_LARGEFILE
      /devices/pseudo/mm@0:null
      offset:0
   1: S_IFREG mode:0644 dev:259,65543 ino:98341 uid:501 gid:10 size:0
      O_WRONLY|O_CREAT|O_EXCL|O_LARGEFILE
      /dangerzone/omicron_tmp/.tmpfxlogZ/cockroachdb_stdout
      offset:0
   2: S_IFREG mode:0644 dev:259,65543 ino:98342 uid:501 gid:10 size:0
      O_WRONLY|O_CREAT|O_EXCL|O_LARGEFILE
      /dangerzone/omicron_tmp/.tmpfxlogZ/cockroachdb_stderr
      offset:0
   3: S_IFIFO mode:0000 dev:534,0 ino:1149995 uid:501 gid:10 rdev:0,0
      O_RDWR
   4: S_IFREG mode:0644 dev:259,65543 ino:98342 uid:501 gid:10 size:0
      O_WRONLY|O_CREAT|O_EXCL|O_LARGEFILE FD_CLOEXEC
      /dangerzone/omicron_tmp/.tmpfxlogZ/cockroachdb_stderr
      offset:0
   5: S_IFSOCK mode:0666 dev:547,0 ino:752079380 uid:0 gid:0 rdev:0,0
      O_RDWR|O_NONBLOCK
	SOCK_STREAM
	SO_SNDBUF(16384),SO_RCVBUF(5120)
	sockname: AF_UNIX 
	peer: pid 14158 zone: global[0]
   6: S_IFSOCK mode:0666 dev:547,0 ino:754833742 uid:0 gid:0 rdev:0,0
      O_RDWR|O_NONBLOCK
	SOCK_STREAM
	SO_SNDBUF(16384),SO_RCVBUF(5120)
	sockname: AF_UNIX 
	peer: pid 14158 zone: global[0]
   7: S_IFIFO mode:0000 dev:534,0 ino:1149995 uid:501 gid:10 rdev:0,0
      O_RDWR
   8: S_IFPORT mode:0000 dev:548,0 uid:501 gid:10 size:0
   9: S_IFDOOR mode:0444 dev:538,0 ino:52 uid:0 gid:0 rdev:540,0
      O_RDONLY|O_LARGEFILE FD_CLOEXEC  door to nscd[195]
  10: S_IFCHR mode:0644 dev:536,0 ino:91226118 uid:0 gid:3 rdev:174,1
      O_RDONLY|O_NONBLOCK|O_LARGEFILE FD_CLOEXEC
      /devices/pseudo/random@0:urandom
      offset:16
  11: S_IFREG mode:0640 dev:259,65543 ino:98701 uid:501 gid:10 size:111
      O_WRONLY|O_NONBLOCK|O_APPEND|O_CREAT|O_LARGEFILE FD_CLOEXEC
      offset:111
dap@ivanova omicron $ cat /dangerzone/omicron_tmp/.tmpfxlogZ/cockroachdb_stderr
dap@ivanova omicron $ cat /dangerzone/omicron_tmp/.tmpfxlogZ/cockroachdb_stdout
dap@ivanova omicron $ pwdx 25047
25047:  /home/dap/omicron-auth/nexus/test-utils
dap@ivanova omicron $ pgrep -f cockroach
25047

There are a few things worth noting here:

  • the process started at 14:58:43. That's pretty close to the mtime on one of the log files from the tests that failed:
    $ ls -le /dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_collection_not_present.25051.3.log
    -rw-r--r--   1 dap      staff       1135 Dec 22 14:58:43 2021 /dangerzone/omicron_tmp/omicron_nexus-469bcf46f8ee518e-test_collection_not_present.25051.3.log
    
    That was after running the test suite in a loop for several minutes. This suggests to me that this process was created and tried to write to the seed directory right when all the tests failed.
  • This process's store directory is "/dangerzone/omicron_tmp/crdb-base". What! Sean and I have looked through the code and can't see how that could be possible for anything in the test suite.
  • This is the only cockroach process running at this point.
  • The working directory is `nexus/test-utils`. That's a crate with no tests of its own, but it is the one that's responsible for building the seed in the first place.

I searched $TMPDIR (/dangerzone/omicron_tmp) for any output from a test that said it was invoking cockroach with that --store argument, but didn't find any:

$ grep -r 'cockroach command line' /dangerzone/omicron_tmp | grep 'store /dangerzone/omicron_tmp/crdb-base'
$

I really regret that I didn't grab the output of pargs -e from pid 25047. The Cargo environment variables might have told us where that process came from.

grep -r .tmpfxlogZ $TMPDIR also showed no results. I found that surprising.

Note that while repro'ing, I reverted #544 because it seemed like it was making this harder to hit. So I was running at a5713c6, plus a revert of commit e262a05. I applied this local change to attempt to preserve all database temp directories, regardless of failure:

$ git diff
diff --git a/test-utils/src/dev/db.rs b/test-utils/src/dev/db.rs
index 88361210..ee616b40 100644
--- a/test-utils/src/dev/db.rs
+++ b/test-utils/src/dev/db.rs
@@ -140,7 +140,7 @@ impl CockroachStarterBuilder {
      * This is where the database will store all of its on-disk data.  If this
      * isn't specified, CockroachDB will be configured to store data into a
      * temporary directory that will be cleaned up on Drop of
-     * [`CockroachStarter`] or [`CockroachInstance`].
+     * [`CockroachStarter`] or on [`CockroachInstance::cleanup()`].
      */
     pub fn store_dir<P: AsRef<Path>>(mut self, store_dir: P) -> Self {
         self.store_dir.replace(store_dir.as_ref().to_owned());
@@ -540,7 +540,9 @@ impl CockroachInstance {
         }
 
         if let Some(temp_dir) = self.temp_dir.take() {
-            temp_dir.close().context("cleaning up temporary directory")?;
+            // temp_dir.close().context("cleaning up temporary directory")?; //
+            // XXX
+            temp_dir.into_path();
         }
 
         Ok(())
@@ -572,7 +574,8 @@ impl Drop for CockroachInstance {
             }
             #[allow(unused_must_use)]
             if let Some(temp_dir) = self.temp_dir.take() {
-                temp_dir.close();
+                temp_dir.into_path();
+                // temp_dir.close(); // XXX
             }
         }
     }

Then I appeared to reproduce the problem again (although admittedly with a new failure mode):

test authz::context::test::test_database ... ok
test db::pagination::test::test_paginated_single_column_descending ... FAILED
test db::collection_insert::test::test_collection_present ... ok
test db::collection_insert::test::test_collection_not_present ... ok

failures:

---- db::pagination::test::test_paginated_single_column_descending stdout ----
log file: "/dangerzone/omicron_tmp2/omicron_nexus-469bcf46f8ee518e-test_paginated_single_column_descending.23073.9.log"
note: configured to log to "/dangerzone/omicron_tmp2/omicron_nexus-469bcf46f8ee518e-test_paginated_single_column_descending.23073.9.log"
thread 'db::pagination::test::test_paginated_single_column_descending' panicked at 'called `Result::unwrap()` on an `Err` value: Connection(Query(DatabaseError(ClosedConnection, "server closed the connection unexpectedly\n\tThis probably means the server terminated abnormally\n\tbefore or while processing the request.\n")))', nexus/src/db/pagination.rs:237:71
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
WARN: dropped CockroachInstance without cleaning it up first (there may still be a child process running and a temporary directory leaked)


failures:
    db::pagination::test::test_paginated_single_column_descending

test result: FAILED. 51 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out; finished in 1.00s

(At this point I'm using time for (( i = 0; ; i++ )) { echo "attempt: $(( i + 1 ))"; cargo test -p omicron-nexus --lib || break; }; echo $i; echo $'\a' $'\a' $'\a' $'\a' to attempt to reproduce the problem. This one took 8m and 320 iterations.)

I did this with a clean TMPDIR=/dangerzone/omicron_tmp2. I still found no test that appeared to run cockroach with the errant --store argument:

$ grep -r 'cockroach command line' /dangerzone/omicron_tmp2 | grep 'store /dangerzone/omicron_tmp2/crdb-base'
$

At this point, I wonder if cargo is somehow rebuilding test-utils, causing us to regenerate the seed, while the test suite is running. I will try to repro using the D script and look at the environment of the errant process.

@davepacheco (Collaborator) commented:

So, here's another data point, an embarrassing confession, and another problem: since these runs take so long, I was working on #546 in parallel. You guessed it: on the same machine, with the same value of TMPDIR. Now, it's conceivable that while building that clone, I triggered a build of test-utils that kicked off the cockroach process that DTrace found. That said, from the output I have from the other workspace, that didn't happen: the last time I built it was 14:50, which was 8 minutes before the failure. But it's possible that something like rust-analyzer did it. I think it's a legitimate problem that if you were to build from two clones concurrently on the same machine (and with the same TMPDIR), the second one could totally destroy the first one's test run. I'll file a new issue for that.
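
One way to guard against that scenario (a sketch assuming the fs2 crate; this is not a committed fix) would be to serialize seed construction across processes with an advisory file lock in $TMPDIR:

```rust
// Sketch (assumes the `fs2` crate): take an exclusive advisory lock
// before (re)building the shared seed directory, so two workspaces using
// the same TMPDIR can't clobber each other's /tmp/crdb-base. The lock
// file name here is illustrative.
use fs2::FileExt;
use std::fs::File;
use std::path::Path;

fn with_seed_lock<T>(
    tmpdir: &Path,
    build_seed: impl FnOnce() -> std::io::Result<T>,
) -> std::io::Result<T> {
    let lock = File::create(tmpdir.join("crdb-base.lock"))?;
    // Blocks until any other holder (another cargo test run, a build
    // script, rust-analyzer, ...) releases the lock.
    lock.lock_exclusive()?;
    let result = build_seed();
    lock.unlock()?;
    result
}
```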

@davepacheco (Collaborator) commented:

Updated D script:

pfexec dtrace -w -q -n 'BEGIN{ printf("ready\n"); } exec-success/progenyof(23278)/{ printf("%Y pid %6d ppid %6d exec-success %s\n", walltimestamp, pid, ppid, curpsinfo->pr_psargs); } fbt::zfs_write:entry/strstr(stringof(args[0]->v_path), "/dangerzone/omicron_tmp3/crdb-base") != NULL/{ stack(); ustack(); printf("%Y BOOM pid %6d thread %d ppid %d\n", walltimestamp, pid, tid, ppid); stop(); system("pargs %d; pargs -e %d; pwdx %d; pfiles %d; ps -opid,stime,etime -p %d", pid, pid, pid, pid, pid); exit(0); }'; echo $'\a' $'\a' $'\a' $'\a'

Working on reproducing with:

time for (( i = 0; ; i++ )) { echo "attempt: $(( i + 1 ))"; cargo test -p omicron-nexus --lib || break; }; echo $i; echo $'\a' $'\a' $'\a' $'\a'

@davepacheco (Collaborator) commented:

I ran that for a while without reproducing a problem, so I decided to make my repro loop better reflect what happens in CI:

time for (( i = 0; ; i++ )) { echo "attempt: $(( i + 1 ))"; cargo clean || break; rm -rf $TMPDIR || break; mkdir $TMPDIR || break; cargo test || break ;}; echo $i

After 5 iterations (30 minutes), this produced:

attempt: 6
   Compiling proc-macro2 v1.0.33
   Compiling unicode-xid v0.2.2
   Compiling syn v1.0.82
   Compiling libc v0.2.112
   Compiling serde_derive v1.0.131
   Compiling serde v1.0.131
   Compiling autocfg v1.0.1
   Compiling cfg-if v1.0.0
   Compiling version_check v0.9.3
   Compiling memchr v2.4.1
   Compiling ryu v1.0.6
   Compiling serde_json v1.0.72
   Compiling itoa v0.4.8
   Compiling log v0.4.14
   Compiling parking_lot_core v0.8.5
   Compiling typenum v1.14.0
   Compiling lazy_static v1.4.0
   Compiling futures-core v0.3.18
   Compiling ucd-trie v0.1.3
   Compiling once_cell v1.8.0
   Compiling futures-task v0.3.18
   Compiling futures-channel v0.3.18
   Compiling futures-util v0.3.18
   Compiling cc v1.0.72
   Compiling scopeguard v1.1.0
   Compiling hashbrown v0.11.2
   Compiling smallvec v1.7.0
   Compiling ppv-lite86 v0.2.15
   Compiling maplit v1.0.2
   Compiling slog v2.7.0
   Compiling byteorder v1.4.3
   Compiling regex-syntax v0.6.25
   Compiling dyn-clone v1.0.4
   Compiling pin-project-lite v0.2.7
   Compiling httparse v1.5.1
   Compiling fnv v1.0.7
   Compiling schemars v0.8.8
   Compiling remove_dir_all v0.5.3
   Compiling async-trait v0.1.52
   Compiling anyhow v1.0.51
   Compiling futures-sink v0.3.18
   Compiling same-file v1.0.6
   Compiling crossbeam-utils v0.8.5
   Compiling pkg-config v0.3.23
   Compiling slab v0.4.5
   Compiling home v0.5.3
   Compiling unicode-width v0.1.9
   Compiling futures-io v0.3.18
   Compiling pin-utils v0.1.0
   Compiling unicode-segmentation v1.8.0
   Compiling slog-async v2.7.0
   Compiling convert_case v0.4.0
   Compiling openssl v0.10.38
   Compiling encoding_rs v0.8.30
   Compiling paste v1.0.6
   Compiling native-tls v0.2.8
   Compiling semver v0.1.20
   Compiling strsim v0.10.0
   Compiling base64 v0.13.0
   Compiling cpufeatures v0.2.1
   Compiling ident_case v1.0.1
   Compiling subtle v2.4.1
   Compiling rustversion v1.0.6
   Compiling tinyvec_macros v0.1.0
   Compiling bitflags v1.3.2
   Compiling percent-encoding v2.1.0
   Compiling try-lock v0.2.3
   Compiling matches v0.1.9
   Compiling unicode-bidi v0.3.7
   Compiling untrusted v0.7.1
   Compiling siphasher v0.3.7
   Compiling tower-service v0.3.1
   Compiling httpdate v1.0.2
   Compiling spin v0.5.2
   Compiling match_cfg v0.1.0
   Compiling take_mut v0.2.2
   Compiling strsim v0.8.0
   Compiling vec_map v0.8.2
   Compiling ansi_term v0.12.1
   Compiling fallible-iterator v0.2.0
   Compiling mime v0.3.16
   Compiling ipnet v2.3.1
   Compiling foreign-types-shared v0.1.1
   Compiling openssl-probe v0.1.4
   Compiling fixedbitset v0.4.0
   Compiling radium v0.6.2
   Compiling either v1.6.1
   Compiling signal-hook v0.3.12
   Compiling opaque-debug v0.3.0
   Compiling tap v1.0.1
   Compiling difference v2.0.0
   Compiling const-oid v0.6.2
   Compiling funty v1.2.0
   Compiling newline-converter v0.2.0
   Compiling crunchy v0.2.2
   Compiling pq-sys v0.4.6 (https://github.com/oxidecomputer/pq-sys?branch=oxide/omicron#b1194c19)
   Compiling linked-hash-map v0.5.4
   Compiling proc-macro-hack v0.5.19
   Compiling tiny-keccak v2.0.2
   Compiling dtoa v0.4.8
   Compiling precomputed-hash v0.1.1
   Compiling bit-vec v0.6.3
   Compiling half v1.8.2
   Compiling new_debug_unreachable v1.0.4
   Compiling fixedbitset v0.2.0
   Compiling diff v0.1.12
   Compiling lalrpop-util v0.19.6
   Compiling const_fn v0.4.8
   Compiling unindent v0.1.7
   Compiling semver v1.0.4
   Compiling crossbeam-epoch v0.9.5
   Compiling ref-cast v1.0.6
   Compiling omicron-rpaths v0.1.0 (/home/dap/omicron/rpaths)
   Compiling rayon-core v1.9.1
   Compiling predicates-core v1.0.2
   Compiling difflib v0.4.0
   Compiling termtree v0.2.3
   Compiling normalize-line-endings v0.3.0
   Compiling plotters-backend v0.3.2
   Compiling regex-automata v0.1.10
   Compiling downcast v0.11.0
   Compiling fragile v1.0.0
   Compiling hex v0.4.3
   Compiling erased-serde v0.3.16
   Compiling array-init v2.0.0
   Compiling oorandom v11.1.3
   Compiling trybuild v1.0.53
   Compiling glob v0.3.0
   Compiling termcolor v1.1.2
   Compiling utf-8 v0.7.6
   Compiling viona_api v0.1.0 (https://github.com/oxidecomputer/propolis?rev=00ec8cf18f6a2311b0907f0b16b0ff8a327944d1#00ec8cf1)
   Compiling instant v0.1.12
   Compiling tracing-core v0.1.21
   Compiling sharded-slab v0.1.4
   Compiling pest v2.1.3
   Compiling thread_local v1.1.3
   Compiling lock_api v0.4.5
   Compiling generic-array v0.14.4
   Compiling proc-macro-error-attr v1.0.4
   Compiling proc-macro-error v1.0.4
   Compiling standback v0.2.17
   Compiling unicase v2.6.0
   Compiling time v0.2.27
   Compiling cookie v0.15.1
   Compiling num-integer v0.1.44
   Compiling tokio v1.14.0
   Compiling indexmap v1.7.0
   Compiling num-traits v0.2.14
   Compiling memoffset v0.6.5
   Compiling rayon v1.5.1
   Compiling walkdir v2.3.2
   Compiling getopts v0.2.21
   Compiling textwrap v0.11.0
   Compiling rustc_version v0.1.7
   Compiling heck v0.3.3
   Compiling rustls-pemfile v0.2.1
   Compiling tinyvec v1.5.1
   Compiling form_urlencoded v1.0.1
   Compiling phf_shared v0.10.0
   Compiling phf_shared v0.8.0
   Compiling foreign-types v0.3.2
   Compiling ring v0.16.20
   Compiling openssl-sys v0.9.71
   Compiling itertools v0.10.3
   Compiling wyz v0.4.0
   Compiling der v0.4.5
   Compiling expectorate v1.0.4
   Compiling yaml-rust v0.4.5
   Compiling bit-set v0.5.2
   Compiling indoc v1.0.3
   Compiling omicron-nexus v0.1.0 (/home/dap/omicron/nexus)
   Compiling predicates-tree v1.0.4
   Compiling ringbuffer v0.8.2
   Compiling plotters-svg v0.3.1
   Compiling tracing v0.1.29
   Compiling pest_meta v2.1.3
   Compiling semver-parser v0.10.2
   Compiling tracing-subscriber v0.3.3
   Compiling phf v0.10.0
   Compiling newtype_derive v0.1.6
   Compiling unicode-normalization v0.1.19
   Compiling spki v0.4.1
   Compiling want v0.3.0
   Compiling ena v0.14.0
   Compiling aho-corasick v0.7.18
   Compiling csv-core v0.1.10
   Compiling getrandom v0.2.3
   Compiling time v0.1.44
   Compiling thread-id v4.0.0
   Compiling num_cpus v1.13.0
   Compiling signal-hook-registry v1.4.0
   Compiling mio v0.7.14
   Compiling atty v0.2.14
   Compiling dirs-sys-next v0.1.2
   Compiling hostname v0.3.1
   Compiling socket2 v0.4.2
   Compiling subprocess v0.2.8
   Compiling xattr v0.2.2
   Compiling filetime v0.2.15
   Compiling quote v1.0.10
   Compiling crossbeam-channel v0.5.1
   Compiling semver v0.11.0
   Compiling bitvec v0.22.3
   Compiling rustc_version v0.4.0
   Compiling idna v0.2.3
   Compiling stringprep v0.1.2
   Compiling pkcs8 v0.7.6
   Compiling rand_core v0.6.3
   Compiling parking_lot v0.11.2
   Compiling clap v2.34.0
   Compiling dirs-next v2.0.0
   Compiling regex v1.5.4
   Compiling tar v0.4.37
   Compiling mime_guess v2.0.3
   Compiling block-buffer v0.10.0
   Compiling crypto-common v0.1.1
   Compiling digest v0.9.0
   Compiling block-buffer v0.9.0
   Compiling crypto-mac v0.11.1
   Compiling cipher v0.3.0
   Compiling universal-hash v0.4.1
   Compiling aead v0.4.3
   Compiling float-cmp v0.9.0
   Compiling plotters v0.3.1
   Compiling rand_chacha v0.3.1
   Compiling url v2.2.2
   Compiling term v0.7.0
   Compiling cast v0.2.7
   Compiling string_cache v0.8.2
   Compiling scheduled-thread-pool v0.2.5
   Compiling webpki v0.22.0
   Compiling sct v0.7.0
   Compiling webpki v0.21.4
   Compiling ff v0.10.1
   Compiling crossbeam-deque v0.8.1
   Compiling toolchain_find v0.2.0
   Compiling digest v0.10.1
   Compiling signature v1.3.2
   Compiling sha2 v0.9.8
   Compiling sha-1 v0.9.8
   Compiling hmac v0.11.0
   Compiling polyval v0.5.3
   Compiling ctr v0.8.0
   Compiling aes v0.7.5
   Compiling xts-mode v0.4.1
   Compiling predicates v2.1.0
   Compiling rand v0.8.4
   Compiling ascii-canvas v3.0.0
   Compiling r2d2 v0.8.9
   Compiling rustls v0.20.2
   Compiling group v0.10.0
   Compiling webpki-roots v0.21.1
   Compiling md-5 v0.10.0
   Compiling sha2 v0.10.0
   Compiling hmac v0.12.0
   Compiling synstructure v0.12.6
   Compiling serde_derive_internals v0.25.0
   Compiling pest_generator v2.1.3
   Compiling darling_core v0.13.1
   Compiling tempfile v3.2.0
   Compiling spdm v0.1.0 (https://github.com/oxidecomputer/spdm?rev=9742f6e#9742f6ea)
   Compiling criterion-plot v0.4.4
   Compiling thiserror-impl v1.0.30
   Compiling tokio-macros v1.6.0
   Compiling futures-macro v0.3.18
   Compiling zerocopy-derive v0.2.0
   Compiling schemars_derive v0.8.8
   Compiling pest_derive v2.1.0
   Compiling tracing-attributes v0.1.18
   Compiling structopt-derive v0.4.18
   Compiling pin-project-internal v1.0.8
   Compiling structmeta-derive v0.1.4
   Compiling api_identity v0.1.0 (/home/dap/omicron/api_identity)
   Compiling zeroize_derive v1.2.2
   Compiling zone_cfg_derive v0.1.2
   Compiling diesel_derives v2.0.0 (https://github.com/diesel-rs/diesel?rev=ce77c382#ce77c382)
   Compiling time-macros-impl v0.1.2
   Compiling ref-cast-impl v1.0.6
   Compiling oso-derive v0.24.0
   Compiling impl-trait-for-tuples v0.2.1
   Compiling db-macros v0.1.0 (/home/dap/omicron/nexus/src/db/db-macros)
   Compiling serial_test_derive v0.5.1
   Compiling mockall_derive v0.11.0
   Compiling derivative v2.2.0
   Compiling bitstruct_derive v0.1.0
   Compiling nexus-test-utils-macros v0.1.0 (/home/dap/omicron/nexus/test-utils-macros)
   Compiling darling_macro v0.13.1
   Compiling zeroize v1.3.0
   Compiling zerocopy v0.3.0
   Compiling time-macros v0.1.1
   Compiling structopt v0.3.25
   Compiling structmeta v0.1.4
   Compiling serial_test v0.5.1
   Compiling pin-project v1.0.8
   Compiling crypto-bigint v0.2.11
   Compiling curve25519-dalek v4.0.0-pre.1
   Compiling aes-gcm-siv v0.10.3
   Compiling thiserror v1.0.30
   Compiling bitstruct v0.1.1
   Compiling darling v0.13.1
   Compiling parse-display-derive v0.5.3
   Compiling dtrace-parser v0.1.12
   Compiling dof v0.1.5
   Compiling rustfmt-wrapper v0.1.0
   Compiling smf v0.2.1
   Compiling zone v0.1.8
   Compiling serde_with_macros v1.5.1
   Compiling elliptic-curve v0.10.6
   Compiling futures-executor v0.3.18
   Compiling ecdsa v0.12.4
   Compiling futures v0.3.18
   Compiling k256 v0.9.6
   Compiling p256 v0.9.0
   Compiling mockall v0.11.0
   Compiling parse-display v0.5.3
   Compiling bytes v1.1.0
   Compiling uuid v0.8.2
   Compiling serde_tokenstream v0.1.2
   Compiling chrono v0.4.19
   Compiling toml v0.5.8
   Compiling serde_urlencoded v0.7.0
   Compiling ipnetwork v0.18.0
   Compiling macaddr v1.0.1
   Compiling serde_with v1.11.0
   Compiling bincode v1.3.3
   Compiling serde_cbor v0.11.2
   Compiling serde-big-array v0.3.2
   Compiling bstr v0.2.17
   Compiling serde_arrays v0.1.0
   Compiling dropshot_endpoint v0.6.1-dev (https://github.com/oxidecomputer/dropshot?branch=main#72583242)
   Compiling http v0.2.5
   Compiling postgres-protocol v0.6.3
   Compiling input_buffer v0.4.0
   Compiling petgraph v0.6.0
   Compiling serde_yaml v0.8.21
   Compiling petgraph v0.5.1
   Compiling vsss-rs v2.0.0-pre0
   Compiling oximeter-macro-impl v0.1.0 (/home/dap/omicron/oximeter/oximeter-macro-impl)
   Compiling csv v1.1.6
   Compiling slog-term v2.8.0
   Compiling usdt-impl v0.3.1
   Compiling openapiv3 v1.0.0-beta.5
   Compiling slog-json v2.4.0
   Compiling usdt-impl v0.2.1
   Compiling diesel v2.0.0 (https://github.com/diesel-rs/diesel?rev=ce77c382#ce77c382)
   Compiling tinytemplate v1.2.1
   Compiling proc-macro-crate v1.1.0
   Compiling crucible-common v0.0.0 (https://github.com/oxidecomputer/crucible?branch=main#4f02540a)
   Compiling postgres-types v0.2.2
   Compiling openapiv3 v0.5.0
   Compiling lalrpop v0.19.6
   Compiling http-body v0.4.4
   Compiling tungstenite v0.13.0
   Compiling slog-bunyan v2.3.0
   Compiling typify-impl v0.0.6-dev (https://github.com/oxidecomputer/typify#80b510b0)
   Compiling oximeter v0.1.0 (/home/dap/omicron/oximeter/oximeter)
   Compiling usdt-macro v0.3.1
   Compiling usdt-attr-macro v0.3.1
   Compiling num_enum_derive v0.5.4
   Compiling usdt-attr-macro v0.2.1
   Compiling usdt-macro v0.2.1
   Compiling tokio-util v0.6.9
   Compiling tokio-rustls v0.23.1
   Compiling tokio-native-tls v0.3.0
   Compiling backoff v0.3.0
   Compiling steno v0.1.0 (https://github.com/oxidecomputer/steno?branch=main#c6ea85cf)
   Compiling signal-hook-tokio v0.3.0
   Compiling bb8 v0.7.1
   Compiling criterion v0.3.5
   Compiling tokio-tungstenite v0.14.0
   Compiling usdt v0.3.1
   Compiling openapi-lint v0.1.0 (https://github.com/oxidecomputer/openapi-lint?branch=main#ca2947b5)
   Compiling h2 v0.3.9
   Compiling tokio-postgres v0.7.5
   Compiling crucible-protocol v0.0.0 (https://github.com/oxidecomputer/crucible?branch=main#4f02540a)
   Compiling crucible-scope v0.0.0 (https://github.com/oxidecomputer/crucible?branch=main#4f02540a)
   Compiling num_enum v0.5.4
   Compiling usdt v0.2.1
   Compiling typify-macro v0.0.6-dev (https://github.com/oxidecomputer/typify#80b510b0)
   Compiling bhyve_api v0.1.0 (https://github.com/oxidecomputer/propolis?rev=00ec8cf18f6a2311b0907f0b16b0ff8a327944d1#00ec8cf1)
   Compiling dladm v0.1.0 (https://github.com/oxidecomputer/propolis?rev=00ec8cf18f6a2311b0907f0b16b0ff8a327944d1#00ec8cf1)
   Compiling slog-dtrace v0.2.0
   Compiling crucible v0.0.1 (https://github.com/oxidecomputer/crucible?branch=main#4f02540a)
   Compiling hyper v0.14.16
   Compiling propolis v0.1.0 (https://github.com/oxidecomputer/propolis?rev=00ec8cf18f6a2311b0907f0b16b0ff8a327944d1#00ec8cf1)
   Compiling typify v0.0.6-dev (https://github.com/oxidecomputer/typify#80b510b0)
   Compiling progenitor-impl v0.0.0 (https://github.com/oxidecomputer/progenitor#66b41ba3)
   Compiling progenitor-macro v0.0.0 (https://github.com/oxidecomputer/progenitor#66b41ba3)
   Compiling hyper-rustls v0.23.0
   Compiling dropshot v0.6.1-dev (https://github.com/oxidecomputer/dropshot?branch=main#72583242)
   Compiling reqwest v0.11.7
   Compiling polar-core v0.24.0
   Compiling hyper-tls v0.5.0
   Compiling progenitor v0.0.0 (https://github.com/oxidecomputer/progenitor#66b41ba3)
   Compiling omicron-common v0.1.0 (/home/dap/omicron/common)
   Compiling propolis-client v0.1.0 (https://github.com/oxidecomputer/propolis?rev=00ec8cf18f6a2311b0907f0b16b0ff8a327944d1#00ec8cf1)
   Compiling oximeter-db v0.1.0 (/home/dap/omicron/oximeter/db)
   Compiling oximeter-instruments v0.1.0 (/home/dap/omicron/oximeter/instruments)
   Compiling propolis-server v0.1.0 (https://github.com/oxidecomputer/propolis?rev=00ec8cf18f6a2311b0907f0b16b0ff8a327944d1#00ec8cf1)
   Compiling omicron-test-utils v0.1.0 (/home/dap/omicron/test-utils)
   Compiling nexus-client v0.1.0 (/home/dap/omicron/nexus-client)
   Compiling oximeter-client v0.1.0 (/home/dap/omicron/oximeter-client)
   Compiling sled-agent-client v0.1.0 (/home/dap/omicron/sled-agent-client)
   Compiling nexus-test-utils v0.1.0 (/home/dap/omicron/nexus/test-utils)
   Compiling async-bb8-diesel v0.1.0 (https://github.com/oxidecomputer/async-bb8-diesel?rev=c849b717be#c849b717)
   Compiling diesel-dtrace v0.1.0 (https://github.com/oxidecomputer/diesel-dtrace?branch=main#f74c1db5)
   Compiling omicron-sled-agent v0.1.0 (/home/dap/omicron/sled-agent)
   Compiling oximeter-collector v0.1.0 (/home/dap/omicron/oximeter/collector)
   Compiling oximeter-producer v0.1.0 (/home/dap/omicron/oximeter/producer)
   Compiling omicron-package v0.1.0 (/home/dap/omicron/package)
   Compiling oso v0.24.0
    Finished test [unoptimized + debuginfo] target(s) in 4m 20s
     Running unittests (target/debug/deps/db_macros-bb5dcc49c5b724a7)

running 5 tests
test tests::test_derive_metadata_identity_fails_for_enums ... ok
test tests::test_derive_metadata_identity_fails_without_table_name ... ok
test tests::test_derive_metadata_identity_fails_without_embedded_identity ... ok
test tests::test_derive_metadata_identity_fails_with_wrong_table_name_type ... ok
test tests::test_derive_metadata_identity_minimal_example_compiles ... ok

test result: ok. 5 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.01s

     Running unittests (target/debug/deps/omicron_common-90e9358769827c24)

running 19 tests
test api::external::error::test::test_bail_unless ... ok
test api::external::test::test_bytecount ... ok
test api::external::test::test_ip_port_range_from_str ... ok
test api::external::test::test_ip_port_range_into_str ... ok
test api::external::http_pagination::test::test_scan_by_nameid_generic ... ok
test api::external::test::test_name_parse_from_param ... ok
test api::external::test::test_name_parse ... ok
test api::external::test::test_resource_name_parse ... ok
test api::external::http_pagination::test::test_pagination_examples ... ok
test api::external::http_pagination::test::test_scan_by_nameid_name ... ok
test api::external::http_pagination::test::test_scan_by_nameid_id ... ok
test api::external::http_pagination::test::test_scan_by_name ... ok
test api::external::http_pagination::test::test_scan_by_id ... ok
test api::external::test::test_firewall_deserialization ... ok
test config::test::test_example_url ... ok
test config::test::test_bad_url ... ok
test api::external::test::test_role_name_parse ... ok
test api::external::http_pagination::test::test_pagination_schemas ... ok
test api::external::test::test_networktarget_parsing ... ok

test result: ok. 19 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.01s

     Running unittests (target/debug/deps/omicron_nexus-4c6b9fc6ba27886c)

running 52 tests
test authn::external::cookies::test::test_parse_cookies_empty_headers ... ok
test authn::external::cookies::test::test_parse_cookies_two_cookie_headers ... ok
test authn::external::cookies::test::test_parse_cookies_two_cookies ... ok
test authn::external::cookies::test::test_parse_cookies_one_cookie ... ok
test authn::external::cookies::test::test_parse_cookies_two_cookie_headers_same_name ... ok
test authn::external::cookies::test::test_parse_cookies_ignore_other_headers ... ok
test authn::external::session_cookie::test::test_get_token_no_header ... ok
test authn::external::session_cookie::test::test_get_token_other_cookie_present ... ok
test authn::external::session_cookie::test::test_get_token ... ok
test authn::external::session_cookie::test::test_session_cookie_value ... ok
test authn::external::spoof::test::test_spoof_header_missing ... ok
test authn::external::spoof::test::test_spoof_header_valid ... ok
test authn::external::spoof::test::test_spoof_reserved_values ... ok
test authn::external::spoof::test::test_spoof_header_bad_uuids ... ok
test authn::test::test_internal_users ... ok
test config::test::test_config_nonexistent ... ok
test config::test::test_config_bad_toml ... ok
test config::test::test_config_empty ... ok
test authn::external::session_cookie::test::test_other_cookie ... ok
test authn::external::session_cookie::test::test_missing_cookie ... ok
test authn::external::session_cookie::test::test_garbage_cookie ... ok
test authn::external::test::test_authn_sequence ... ok
test authn::external::session_cookie::test::test_expired_cookie_absolute ... ok
test authn::external::session_cookie::test::test_expired_cookie_idle ... ok
test authn::external::session_cookie::test::test_valid_cookie ... ok
test db::fixed_data::test::test_builtin_fleet_id_is_valid ... ok
test db::fixed_data::user_builtin::test::test_builtin_user_ids_are_valid ... ok
test config::test::test_bad_authn_schemes ... ok
test db::collection_insert::test::test_verify_query ... ok
test db::subnet_allocation::test::test_verify_query ... ok
test config::test::test_valid ... ok
test external_api::console_api::test::test_find_file_404_on_directory ... ok
test external_api::console_api::test::test_find_file_404_on_nonexistent ... ok
test external_api::console_api::test::test_find_file_404_on_disallowed_ext ... ok
test external_api::console_api::test::test_find_file_404_on_nonexistent_nested ... ok
test external_api::console_api::test::test_find_file_finds_file ... ok
test external_api::console_api::test::test_find_file_404_on_symlink ... ok
test external_api::console_api::test::test_find_file_wont_follow_symlink ... ok
test db::pagination::test::test_paginated_single_column_ascending ... ok
test db::pagination::test::test_paginated_single_column_descending ... ok
test context::test::test_background_context ... ok
test db::pagination::test::test_paginated_multicolumn_ascending ... ok
test db::pagination::test::test_paginated_multicolumn_descending ... ok
test context::test::test_test_context ... ok
test db::saga_recovery::test::test_successful_saga_does_not_replay_during_recovery ... ok
test db::datastore::test::test_session_methods ... ok
test authz::context::test::test_organization ... ok
test db::saga_recovery::test::test_failure_during_saga_can_be_recovered ... ok
test authz::context::test::test_database ... ok
test db::datastore::test::test_project_creation ... ok
test db::collection_insert::test::test_collection_not_present ... ok
test db::collection_insert::test::test_collection_present ... ok

test result: ok. 52 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 1.14s

     Running unittests (target/debug/deps/nexus-e40f68f4d7783288)

running 0 tests

test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s

     Running tests/test_all.rs (target/debug/deps/test_all-92349072e0ff9019)

running 40 tests
test integration_tests::commands::test_nexus_no_args ... ok
test integration_tests::commands::test_nexus_invalid_config ... ok
test integration_tests::commands::test_nexus_bad_config ... ok
test integration_tests::authn_http::test_authn_spoof_unconfigured ... ok
test integration_tests::authn_http::test_authn_session_cookie ... ok
test integration_tests::authn_http::test_authn_spoof_allowed ... ok
test integration_tests::commands::test_nexus_openapi_internal ... ok
test integration_tests::commands::test_nexus_openapi ... ok
Dec 23 06:19:59.544 ERRO slog-async: logger dropped messages due to channel overflow, count: 341, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.544 ERRO slog-async: logger dropped messages due to channel overflow, count: 1, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.544 ERRO slog-async: logger dropped messages due to channel overflow, count: 41, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.545 ERRO slog-async: logger dropped messages due to channel overflow, count: 121, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.574 ERRO slog-async: logger dropped messages due to channel overflow, count: 113, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.582 ERRO slog-async: logger dropped messages due to channel overflow, count: 59, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.583 ERRO slog-async: logger dropped messages due to channel overflow, count: 155, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.583 ERRO slog-async: logger dropped messages due to channel overflow, count: 61, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.584 ERRO slog-async: logger dropped messages due to channel overflow, count: 63, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.585 ERRO slog-async: logger dropped messages due to channel overflow, count: 17, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.585 ERRO slog-async: logger dropped messages due to channel overflow, count: 4, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.596 ERRO slog-async: logger dropped messages due to channel overflow, count: 1, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.597 ERRO slog-async: logger dropped messages due to channel overflow, count: 1, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.597 ERRO slog-async: logger dropped messages due to channel overflow, count: 52, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.612 ERRO slog-async: logger dropped messages due to channel overflow, count: 129, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.613 ERRO slog-async: logger dropped messages due to channel overflow, count: 5, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.615 ERRO slog-async: logger dropped messages due to channel overflow, count: 37, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.625 ERRO slog-async: logger dropped messages due to channel overflow, count: 293, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.626 ERRO slog-async: logger dropped messages due to channel overflow, count: 9, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.627 ERRO slog-async: logger dropped messages due to channel overflow, count: 20, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.627 ERRO slog-async: logger dropped messages due to channel overflow, count: 30, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.628 ERRO slog-async: logger dropped messages due to channel overflow, count: 27, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.628 ERRO slog-async: logger dropped messages due to channel overflow, count: 1, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
test integration_tests::console_api::test_absolute_static_dir ... ok
Dec 23 06:19:59.633 ERRO slog-async: logger dropped messages due to channel overflow, count: 151, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.640 ERRO slog-async: logger dropped messages due to channel overflow, count: 27, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.640 ERRO slog-async: logger dropped messages due to channel overflow, count: 215, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.640 ERRO slog-async: logger dropped messages due to channel overflow, count: 295, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:19:59.649 ERRO slog-async: logger dropped messages due to channel overflow, count: 43, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
test integration_tests::datasets::test_dataset_put_bad_zpool_returns_not_found ... ok
test integration_tests::console_api::text_login_form ... ok
test integration_tests::authz::test_authz_basic ... ok
test integration_tests::console_api::test_assets ... ok
test integration_tests::console_api::test_console_pages ... ok
test integration_tests::basic::test_basic_failures ... ok
test integration_tests::console_api::test_sessions ... ok
test integration_tests::basic::test_sleds_list ... ok
Dec 23 06:20:00.018 ERRO slog-async: logger dropped messages due to channel overflow, count: 250, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.071 ERRO slog-async: logger dropped messages due to channel overflow, count: 65, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.073 ERRO slog-async: logger dropped messages due to channel overflow, count: 1, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.074 ERRO slog-async: logger dropped messages due to channel overflow, count: 39, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.074 ERRO slog-async: logger dropped messages due to channel overflow, count: 246, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.076 ERRO slog-async: logger dropped messages due to channel overflow, count: 58, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.077 ERRO slog-async: logger dropped messages due to channel overflow, count: 113, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.077 ERRO slog-async: logger dropped messages due to channel overflow, count: 88, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.080 ERRO slog-async: logger dropped messages due to channel overflow, count: 4, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.081 ERRO slog-async: logger dropped messages due to channel overflow, count: 81, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.082 ERRO slog-async: logger dropped messages due to channel overflow, count: 225, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.085 ERRO slog-async: logger dropped messages due to channel overflow, count: 201, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.085 ERRO slog-async: logger dropped messages due to channel overflow, count: 12, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.086 ERRO slog-async: logger dropped messages due to channel overflow, count: 183, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.086 ERRO slog-async: logger dropped messages due to channel overflow, count: 52, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.087 ERRO slog-async: logger dropped messages due to channel overflow, count: 95, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.087 ERRO slog-async: logger dropped messages due to channel overflow, count: 100, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.087 ERRO slog-async: logger dropped messages due to channel overflow, count: 54, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.088 ERRO slog-async: logger dropped messages due to channel overflow, count: 87, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.088 ERRO slog-async: logger dropped messages due to channel overflow, count: 99, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.089 ERRO slog-async: logger dropped messages due to channel overflow, count: 11, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.090 ERRO slog-async: logger dropped messages due to channel overflow, count: 83, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.090 ERRO slog-async: logger dropped messages due to channel overflow, count: 90, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.091 ERRO slog-async: logger dropped messages due to channel overflow, count: 95, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.091 ERRO slog-async: logger dropped messages due to channel overflow, count: 105, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.092 ERRO slog-async: logger dropped messages due to channel overflow, count: 120, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.092 ERRO slog-async: logger dropped messages due to channel overflow, count: 41, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.092 ERRO slog-async: logger dropped messages due to channel overflow, count: 34, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.093 ERRO slog-async: logger dropped messages due to channel overflow, count: 2, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.093 ERRO slog-async: logger dropped messages due to channel overflow, count: 101, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.093 ERRO slog-async: logger dropped messages due to channel overflow, count: 83, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.095 ERRO slog-async: logger dropped messages due to channel overflow, count: 2, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.095 ERRO slog-async: logger dropped messages due to channel overflow, count: 108, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.096 ERRO slog-async: logger dropped messages due to channel overflow, count: 101, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.096 ERRO slog-async: logger dropped messages due to channel overflow, count: 323, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
test integration_tests::datasets::test_dataset_put_success ... ok
Dec 23 06:20:00.109 ERRO slog-async: logger dropped messages due to channel overflow, count: 64, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.111 ERRO slog-async: logger dropped messages due to channel overflow, count: 505, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.189 ERRO slog-async: logger dropped messages due to channel overflow, count: 13, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.272 ERRO slog-async: logger dropped messages due to channel overflow, count: 162, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.306 ERRO slog-async: logger dropped messages due to channel overflow, count: 700, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.353 ERRO slog-async: logger dropped messages due to channel overflow, count: 459, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.360 ERRO slog-async: logger dropped messages due to channel overflow, count: 461, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.361 ERRO slog-async: logger dropped messages due to channel overflow, count: 28, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.367 ERRO slog-async: logger dropped messages due to channel overflow, count: 923, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.381 ERRO slog-async: logger dropped messages due to channel overflow, count: 10, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.390 ERRO slog-async: logger dropped messages due to channel overflow, count: 195, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:00.433 ERRO slog-async: logger dropped messages due to channel overflow, count: 171
test integration_tests::instances::test_instances_access_before_create_returns_not_found ... ok
test integration_tests::basic::test_projects_basic ... ok
test integration_tests::instances::test_instances_delete_fails_when_running_succeeds_when_stopped ... ok
Dec 23 06:20:01.227 ERRO slog-async: logger dropped messages due to channel overflow, count: 4, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.237 ERRO slog-async: logger dropped messages due to channel overflow, count: 271, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.238 ERRO slog-async: logger dropped messages due to channel overflow, count: 1, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.238 ERRO slog-async: logger dropped messages due to channel overflow, count: 53, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.242 ERRO slog-async: logger dropped messages due to channel overflow, count: 22, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.242 ERRO slog-async: logger dropped messages due to channel overflow, count: 821, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.242 ERRO slog-async: logger dropped messages due to channel overflow, count: 9, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.243 ERRO slog-async: logger dropped messages due to channel overflow, count: 37, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.244 ERRO slog-async: logger dropped messages due to channel overflow, count: 168, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.245 ERRO slog-async: logger dropped messages due to channel overflow, count: 41, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.249 ERRO slog-async: logger dropped messages due to channel overflow, count: 26, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.250 ERRO slog-async: logger dropped messages due to channel overflow, count: 391, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.251 ERRO slog-async: logger dropped messages due to channel overflow, count: 253, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.252 ERRO slog-async: logger dropped messages due to channel overflow, count: 185, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.253 ERRO slog-async: logger dropped messages due to channel overflow, count: 315, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.255 ERRO slog-async: logger dropped messages due to channel overflow, count: 214, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.255 ERRO slog-async: logger dropped messages due to channel overflow, count: 235, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.255 ERRO slog-async: logger dropped messages due to channel overflow, count: 427, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.255 ERRO slog-async: logger dropped messages due to channel overflow, count: 11, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.256 ERRO slog-async: logger dropped messages due to channel overflow, count: 1, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.256 ERRO slog-async: logger dropped messages due to channel overflow, count: 7, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.256 ERRO slog-async: logger dropped messages due to channel overflow, count: 12, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.256 ERRO slog-async: logger dropped messages due to channel overflow, count: 195, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.257 ERRO slog-async: logger dropped messages due to channel overflow, count: 117, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.257 ERRO slog-async: logger dropped messages due to channel overflow, count: 121, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.257 ERRO slog-async: logger dropped messages due to channel overflow, count: 19, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.259 ERRO slog-async: logger dropped messages due to channel overflow, count: 158, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.260 ERRO slog-async: logger dropped messages due to channel overflow, count: 155, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.260 ERRO slog-async: logger dropped messages due to channel overflow, count: 109, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.260 ERRO slog-async: logger dropped messages due to channel overflow, count: 107, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.261 ERRO slog-async: logger dropped messages due to channel overflow, count: 757, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.261 ERRO slog-async: logger dropped messages due to channel overflow, count: 194, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.265 ERRO slog-async: logger dropped messages due to channel overflow, count: 365, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.265 ERRO slog-async: logger dropped messages due to channel overflow, count: 1, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.266 ERRO slog-async: logger dropped messages due to channel overflow, count: 102, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.268 ERRO slog-async: logger dropped messages due to channel overflow, count: 18, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.268 ERRO slog-async: logger dropped messages due to channel overflow, count: 15, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:01.273 ERRO slog-async: logger dropped messages due to channel overflow, count: 43
test integration_tests::oximeter::test_oximeter_database_records ... ok
test integration_tests::instances::test_instances_invalid_creation_returns_bad_request ... ok
test integration_tests::instances::test_instances_create_reboot_halt ... ok
Dec 23 06:20:01.914 ERRO slog-async: logger dropped messages due to channel overflow, count: 21, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
test integration_tests::disks::test_disks ... ok
test integration_tests::basic::test_projects_list ... ok
Dec 23 06:20:02.053 ERRO slog-async: logger dropped messages due to channel overflow, count: 354, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
test integration_tests::organizations::test_organizations ... ok
Dec 23 06:20:02.074 ERRO slog-async: logger dropped messages due to channel overflow, count: 25, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:02.075 ERRO slog-async: logger dropped messages due to channel overflow, count: 47, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:02.076 ERRO slog-async: logger dropped messages due to channel overflow, count: 165, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:02.078 ERRO slog-async: logger dropped messages due to channel overflow, count: 97, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:02.079 ERRO slog-async: logger dropped messages due to channel overflow, count: 245, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
test integration_tests::users_builtin::test_users_builtin ... ok
test integration_tests::router_routes::test_router_routes ... ok
test integration_tests::projects::test_projects ... ok
test integration_tests::subnet_allocation::test_subnet_allocation ... ok
test integration_tests::roles_builtin::test_roles_builtin ... ok
test integration_tests::vpc_firewall::test_vpc_firewall ... ok
test integration_tests::zpools::test_zpool_put_bad_sled_returns_not_found ... ok
test integration_tests::vpc_routers::test_vpc_routers ... ok
test integration_tests::vpc_subnets::test_vpc_subnets ... ok
test integration_tests::zpools::test_zpool_put_success ... ok
test integration_tests::vpcs::test_vpcs ... ok
test integration_tests::timeseries::test_timeseries_schema ... ok
Dec 23 06:20:05.115 ERRO slog-async: logger dropped messages due to channel overflow, count: 37, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
Dec 23 06:20:05.116 ERRO slog-async: logger dropped messages due to channel overflow, count: 1, producer_id: a6458b7d-87c3-4483-be96-854d814c20de, component: collection-task, collector_id: 39e6175b-4df2-4730-b11d-cbc1e60a2e78, component: oximeter-agent
test integration_tests::oximeter::test_oximeter_reregistration ... ok

test result: ok. 40 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 7.85s

     Running unittests (target/debug/deps/omicron_package-57562b69ed190396)

running 0 tests

test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s

     Running unittests (target/debug/deps/omicron_rpaths-2ca4e60864e6bc23)

running 3 tests
test internal::tests::test_emit_rpath ... ok
test internal::tests::test_configure_from_path ... ok
test internal::tests::test_configure_rpaths_from_bad_envvar - should panic ... ok

test result: ok. 3 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s

     Running unittests (target/debug/deps/omicron_sled_agent-e7cde18ebafae4ba)

running 27 tests
test bootstrap::multicast::test::test_multicast_ipv6 ... ignored
test bootstrap::trust_quorum::rack_secret::tests::secret_splitting_fails_with_threshold_larger_than_total_shares ... ok
test common::instance::test::test_destroy_from_running_stops_first ... ok
test common::instance::test::test_destroy_from_stopped_destroys_immediately ... ok
test common::instance::test::test_reboot ... ok
test common::instance::test::test_reboot_skip_starting_converges_to_running ... ok
test common::instance::test::test_reboot_skip_stopping_converges_to_running ... ok
test common::instance::test::test_running_from_creating ... ok
test illumos::zpool::test::test_parse_zpool ... ok
test illumos::zpool::test::test_parse_zpool_missing_column ... ok
Dec 23 06:20:05.740 INFO instance_ensure e398c5d5-5059-4e55-beac-3a1071083aaa -> InstanceRuntimeStateRequested { run_state: Running }
Dec 23 06:20:05.740 INFO new instance
Dec 23 06:20:05.740 INFO instance_ensure e398c5d5-5059-4e55-beac-3a1071083aaa -> InstanceRuntimeStateRequested { run_state: Running }
Dec 23 06:20:05.740 INFO instance already exists
test bootstrap::spdm::tests::test_recv_timeout ... ok
Dec 23 06:20:05.741 INFO instance_ensure e398c5d5-5059-4e55-beac-3a1071083aaa -> InstanceRuntimeStateRequested { run_state: Running }
Dec 23 06:20:05.741 INFO instance already exists
test instance_manager::test::ensure_instance_repeatedly ... ok
Dec 23 06:20:05.743 INFO instance_ensure e398c5d5-5059-4e55-beac-3a1071083aaa -> InstanceRuntimeStateRequested { run_state: Running }
Dec 23 06:20:05.743 INFO new instance
test instance_manager::test::ensure_instance ... ok
Dec 23 06:20:05.745 INFO Instance::new w/initial HW: InstanceHardware { runtime: InstanceRuntimeState { run_state: Creating, sled_uuid: 5f1d3a89-d54e-4ad1-962d-c3934bda2de1, propolis_uuid: ed895b13-55d5-4e0b-88e9-3f4e74d0d936, ncpus: InstanceCpuCount(2), memory: ByteCount(536870912), hostname: "myvm", gen: Generation(1), time_updated: 2021-12-23T14:20:05.745295901Z }, nics: [] }
Dec 23 06:20:05.745 INFO transition to InstanceRuntimeStateRequested { run_state: Running }; action: Run, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.745 INFO Taking action: Run, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
test sim::collection::test::test_sim_disk_transition_to_detached_states ... ok
test sim::collection::test::test_sim_instance_creating_to_stop ... ok
test instance::test::transition_before_start - should panic ... ok
Dec 23 06:20:05.747 INFO Instance::new w/initial HW: InstanceHardware { runtime: InstanceRuntimeState { run_state: Creating, sled_uuid: a42ae7d3-0026-42bb-950a-8ba2414ccf0f, propolis_uuid: ed895b13-55d5-4e0b-88e9-3f4e74d0d936, ncpus: InstanceCpuCount(2), memory: ByteCount(536870912), hostname: "myvm", gen: Generation(1), time_updated: 2021-12-23T14:20:05.747043900Z }, nics: [] }
test sim::collection::test::test_sim_disk_attach_then_fault ... ok
Dec 23 06:20:05.747 INFO Configured propolis zone: oxz_propolis_instance_ed895b13-55d5-4e0b-88e9-3f4e74d0d936, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.747 INFO Cloned child zone: oxz_propolis_instance_ed895b13-55d5-4e0b-88e9-3f4e74d0d936, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.747 INFO Booted zone: oxz_propolis_instance_ed895b13-55d5-4e0b-88e9-3f4e74d0d936, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.747 INFO Network milestone ready for oxz_propolis_instance_ed895b13-55d5-4e0b-88e9-3f4e74d0d936, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.747 INFO Created address 127.0.0.1/24 for zone: oxz_propolis_instance_ed895b13-55d5-4e0b-88e9-3f4e74d0d936, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.747 INFO Started propolis in zone: oxz_propolis_instance_ed895b13-55d5-4e0b-88e9-3f4e74d0d936, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
test sim::collection::test::test_sim_instance_running_then_destroyed ... ok
test sim::collection::test::test_sim_disk_attach_then_destroy ... ok
test sim::collection::test::test_sim_instance_preempt_transition ... ok
test bootstrap::trust_quorum::share_distribution::tests::write_and_read ... ok
Dec 23 06:20:05.755 INFO GET request to http://127.0.0.1:12400/instances/00000000-0000-0000-0000-000000000000, propolis_client address: 127.0.0.1:12400, component: propolis-client, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
test bootstrap::spdm::requester::tests::negotiation ... ok
Dec 23 06:20:05.756 WARN failed to await http server (Request failed: error sending request for url (http://127.0.0.1:12400/instances/00000000-0000-0000-0000-000000000000): error trying to connect: tcp connect error: Connection refused (os error 146)), will retry in 188.247084ms, error: Reqwest(reqwest::Error { kind: Request, url: Url { scheme: "http", cannot_be_a_base: false, username: "", password: None, host: Some(Ipv4(127.0.0.1)), port: Some(12400), path: "/instances/00000000-0000-0000-0000-000000000000", query: None, fragment: None }, source: hyper::Error(Connect, ConnectError("tcp connect error", Os { code: 146, kind: ConnectionRefused, message: "Connection refused" })) }), instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.758 INFO listening, local_addr: 127.0.0.1:12400
test bootstrap::trust_quorum::server::test::send_share ... ok
test sim::collection::test::test_sim_instance_reboot ... ok
test bootstrap::trust_quorum::rack_secret::tests::create_and_verify ... ok
test bootstrap::trust_quorum::rack_secret::tests::combine_deserialized_shares ... ok
Dec 23 06:20:05.946 INFO GET request to http://127.0.0.1:12400/instances/00000000-0000-0000-0000-000000000000, propolis_client address: 127.0.0.1:12400, component: propolis-client, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.946 INFO accepted connection, remote_addr: 127.0.0.1:36089, local_addr: 127.0.0.1:12400
Dec 23 06:20:05.947 INFO request completed, error_message_external: Internal Server Error, error_message_internal: No matching instance, response_code: 500, uri: /instances/00000000-0000-0000-0000-000000000000, method: GET, req_id: 411aa102-430f-426d-a27c-0b161b1d32bf, remote_addr: 127.0.0.1:36089, local_addr: 127.0.0.1:12400
Dec 23 06:20:05.947 INFO Sending ensure request to propolis: InstanceEnsureRequest { properties: InstanceProperties { id: ed895b13-55d5-4e0b-88e9-3f4e74d0d936, name: "myvm", description: "Test description", image_id: 00000000-0000-0000-0000-000000000000, bootrom_id: 00000000-0000-0000-0000-000000000000, memory: 512, vcpus: 2 }, nics: [], disks: [] }, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.947 INFO PUT request to http://127.0.0.1:12400/instances/ed895b13-55d5-4e0b-88e9-3f4e74d0d936, propolis_client address: 127.0.0.1:12400, component: propolis-client, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.948 INFO request completed, response_code: 201, uri: /instances/ed895b13-55d5-4e0b-88e9-3f4e74d0d936, method: PUT, req_id: 1381aa8f-f404-475c-9b01-72f631ce2848, remote_addr: 127.0.0.1:36089, local_addr: 127.0.0.1:12400
Dec 23 06:20:05.948 INFO GET request to http://127.0.0.1:12400/instances/ed895b13-55d5-4e0b-88e9-3f4e74d0d936/state-monitor, propolis_client address: 127.0.0.1:12400, component: propolis-client, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.948 INFO request completed, response_code: 200, uri: /instances/ed895b13-55d5-4e0b-88e9-3f4e74d0d936/state-monitor, method: GET, req_id: 7d2288ac-15c4-46ca-b7ca-3c4028af7007, remote_addr: 127.0.0.1:36089, local_addr: 127.0.0.1:12400
Dec 23 06:20:05.949 INFO Observing new propolis state: Creating, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.949 INFO New state: Creating, action: None, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.949 INFO GET request to http://127.0.0.1:12400/instances/ed895b13-55d5-4e0b-88e9-3f4e74d0d936/state-monitor, propolis_client address: 127.0.0.1:12400, component: propolis-client, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.949 INFO transition to InstanceRuntimeStateRequested { run_state: Running }; action: Run, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.950 INFO accepted connection, remote_addr: 127.0.0.1:63927, local_addr: 127.0.0.1:12400
Dec 23 06:20:05.950 INFO Taking action: Run, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.950 INFO PUT request to http://127.0.0.1:12400/instances/ed895b13-55d5-4e0b-88e9-3f4e74d0d936/state, propolis_client address: 127.0.0.1:12400, component: propolis-client, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.950 INFO request completed, response_code: 204, uri: /instances/ed895b13-55d5-4e0b-88e9-3f4e74d0d936/state, method: PUT, req_id: ca45d7ed-b20d-4478-affa-c83f317b571c, remote_addr: 127.0.0.1:63927, local_addr: 127.0.0.1:12400
Dec 23 06:20:05.950 INFO request completed, response_code: 200, uri: /instances/ed895b13-55d5-4e0b-88e9-3f4e74d0d936/state-monitor, method: GET, req_id: d058a2be-b324-4b13-b58a-a61c5cc67f9c, remote_addr: 127.0.0.1:36089, local_addr: 127.0.0.1:12400
Dec 23 06:20:05.950 INFO Observing new propolis state: Running, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.951 INFO New state: Running, action: None, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.951 INFO GET request to http://127.0.0.1:12400/instances/ed895b13-55d5-4e0b-88e9-3f4e74d0d936/state-monitor, propolis_client address: 127.0.0.1:12400, component: propolis-client, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.951 INFO transition to InstanceRuntimeStateRequested { run_state: Stopped }; action: Stop, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.951 INFO Taking action: Stop, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.951 INFO PUT request to http://127.0.0.1:12400/instances/ed895b13-55d5-4e0b-88e9-3f4e74d0d936/state, propolis_client address: 127.0.0.1:12400, component: propolis-client, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.951 INFO request completed, response_code: 204, uri: /instances/ed895b13-55d5-4e0b-88e9-3f4e74d0d936/state, method: PUT, req_id: 257a202d-1960-49a6-98ca-23fad91643fd, remote_addr: 127.0.0.1:36089, local_addr: 127.0.0.1:12400
Dec 23 06:20:05.952 INFO Observing new propolis state: Destroyed, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.952 INFO New state: Stopped, action: Some(Destroy), instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.952 INFO request completed, response_code: 200, uri: /instances/ed895b13-55d5-4e0b-88e9-3f4e74d0d936/state-monitor, method: GET, req_id: 4914cb3b-288d-49fb-be33-fc6c0b8f9168, remote_addr: 127.0.0.1:63927, local_addr: 127.0.0.1:12400
Dec 23 06:20:05.952 INFO Taking action: Destroy, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.952 INFO take_action: Taking the Destroy action, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.952 WARN Halting and removing zone: oxz_propolis_instance_ed895b13-55d5-4e0b-88e9-3f4e74d0d936, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
Dec 23 06:20:05.953 INFO State monitoring task complete, instance id: e398c5d5-5059-4e55-beac-3a1071083aaa
test instance::test::start_then_stop ... ok

test result: ok. 26 passed; 0 failed; 1 ignored; 0 measured; 0 filtered out; finished in 0.22s

     Running unittests (target/debug/deps/sled_agent-42a856d69b5ea236)

running 0 tests

test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s

     Running unittests (target/debug/deps/sled_agent_overlay_files-18fd0631ed22e4f3)

running 0 tests

test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s

     Running unittests (target/debug/deps/sled_agent_sim-b1c4fbeb91e7a4e1)

running 0 tests

test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s

     Running tests/test_commands.rs (target/debug/deps/test_commands-14036e0356174d82)

running 4 tests
test test_sled_agent_no_args ... ok
test test_sled_agent_sim_no_args ... ok
test test_sled_agent_openapi_bootagent ... ok
test test_sled_agent_openapi_sled ... ok

test result: ok. 4 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.04s

     Running unittests (target/debug/deps/omicron_test_utils-8a5efcdf130a2dbd)

running 13 tests
test dev::clickhouse::tests::test_clickhouse_in_path ... ok
test dev::db::test::test_make_pg_config_ok ... ok
test dev::db::test::test_make_pg_config_fail ... ok
test dev::clickhouse::tests::test_discover_local_listening_port_timeout ... ok
test dev::clickhouse::tests::test_discover_local_listening_port ... ok
test dev::db::test::test_starter_tmpdir ... ok
test dev::clickhouse::tests::test_discover_local_listening_port_slow_write ... ok
test dev::db::test::test_bad_cmd ... ok
test dev::db::test::test_database_start_hang ... ok
test dev::db::test::test_cmd_fails ... ok
test dev::db::test::test_setup_database_overridden_dir ... ok
test dev::db::test::test_database_concurrent ... ok
test dev::db::test::test_setup_database_default_dir ... ok

test result: ok. 13 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 4.61s

     Running unittests (target/debug/deps/omicron_dev-f0ec8fcf380dd98e)

running 0 tests

test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s

     Running tests/test_omicron_dev.rs (target/debug/deps/test_omicron_dev-b3896961082d1e35)

running 6 tests
test test_omicron_dev_db_wipe_no_args ... ok
test test_omicron_dev_db_populate_no_args ... ok
test test_omicron_dev_bad_cmd ... ok
test test_omicron_dev_no_args ... ok
test test_db_run ... FAILED
test test_db_killed ... ok

failures:

---- test_db_run stdout ----
will run: bash -c '( set -o monitor; /home/dap/omicron/target/debug/omicron-dev db-run --listen-port 0)'
waiting for stdout from child process
found temporary directory: /dangerzone/omicron_tmp3/.tmpV6Hc4O
found database pid: 13428
found postgres listen URL: postgresql://root@127.0.0.1:60619/omicron?sslmode=disable
thread 'test_db_run' panicked at 'unexpected EOF from child process stdout', test-utils/tests/test_omicron_dev.rs:76:17
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace


failures:
    test_db_run

test result: FAILED. 5 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out; finished in 2.34s

error: test failed, to rerun pass '-p omicron-test-utils --test test_omicron_dev'

real    28m58.797s
user    239m39.684s
sys     42m51.236s
5

I believe this "unexpected EOF" is one of the failures we saw in CI when Sean and I looked last Friday. (Unlike a lot of what I was chasing yesterday, which I'm afraid was a string of unrelated, self-inflicted issues.) Unfortunately, the directory doesn't exist:

$ ls -ld /dangerzone/omicron_tmp3/.tmpV6Hc4O
/dangerzone/omicron_tmp3/.tmpV6Hc4O: No such file or directory

That's even though I'm building from a5713c6, which includes #545. In this case, we don't know why omicron-dev exited early; we should probably save more of its stdout and stderr.
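
Something along these lines would at least preserve that output for post-mortem debugging -- a minimal sketch, not the actual test-utils code, with illustrative file names:

```
use std::fs::File;
use std::path::Path;
use std::process::{Child, Command, Stdio};

// Redirect the child's stdout/stderr into files in the (preserved)
// temporary directory so they survive an unexpected early exit.
fn spawn_with_captured_output(tmpdir: &Path) -> std::io::Result<Child> {
    let stdout = File::create(tmpdir.join("omicron_dev_stdout"))?;
    let stderr = File::create(tmpdir.join("omicron_dev_stderr"))?;
    Command::new("omicron-dev")
        .args(["db-run", "--listen-port", "0"])
        .stdout(Stdio::from(stdout))
        .stderr(Stdio::from(stderr))
        .spawn()
}
```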

@davepacheco
Collaborator

I ran another loop that failed after 16 iterations in 82m with the error that @ahl originally saw:

failures:

---- db::pagination::test::test_paginated_single_column_descending stdout ----
log file: "/dangerzone/omicron_tmp3/omicron_nexus-4c6b9fc6ba27886c-test_paginated_single_column_descending.29507.10.log"
note: configured to log to "/dangerzone/omicron_tmp3/omicron_nexus-4c6b9fc6ba27886c-test_paginated_single_column_descending.29507.10.log"
thread 'db::pagination::test::test_paginated_single_column_descending' panicked at 'called `Result::unwrap()` on an `Err` value: Exited', /home/dap/omicron/test-utils/src/dev/mod.rs:143:42
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace


failures:
    db::pagination::test::test_paginated_single_column_descending

test result: FAILED. 51 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out; finished in 1.00s

error: test failed, to rerun pass '-p omicron-nexus --lib'

$ bunyan "/dangerzone/omicron_tmp3/omicron_nexus-4c6b9fc6ba27886c-test_paginated_single_column_descending.29507.10.log"
[2021-12-23T12:21:50.701423666-08:00]  INFO: test_paginated_single_column_descending/29507 on ivanova: cockroach temporary directory: /dangerzone/omicron_tmp3/.tmpdnKGbp
[2021-12-23T12:21:50.702063553-08:00]  INFO: test_paginated_single_column_descending/29507 on ivanova: cockroach: copying from seed directory (/dangerzone/omicron_tmp3/crdb-base) to storage directory (/dangerzone/omicron_tmp3/.tmpdnKGbp/data)
[2021-12-23T12:21:50.717789211-08:00]  INFO: test_paginated_single_column_descending/29507 on ivanova: cockroach command line: cockroach start-single-node --insecure --http-addr=:0 --store /dangerzone/omicron_tmp3/.tmpdnKGbp/data --listen-addr 127.0.0.1:0 --listening-url-file /dangerzone/omicron_tmp3/.tmpdnKGbp/listen-url

Fortunately, we now have the database directory, which includes CockroachDB's stderr:

$ ls -l /dangerzone/omicron_tmp3/.tmpdnKGbp
total 7
-rw-r--r--   1 dap      staff       1186 Dec 23 12:21 cockroachdb_stderr
-rw-r--r--   1 dap      staff          0 Dec 23 12:21 cockroachdb_stdout
drwxr-xr-x   5 dap      staff         19 Dec 23 12:21 data/
$ cat /dangerzone/omicron_tmp3/.tmpdnKGbp/cockroachdb_stderr
*
* WARNING: ALL SECURITY CONTROLS HAVE BEEN DISABLED!
* 
* This mode is intended for non-production testing only.
* 
* In this mode:
* - Your cluster is open to any client that can access 127.0.0.1.
* - Intruders with access to your machine or network can observe client-server traffic.
* - Intruders can log in without password and read or write any data in the cluster.
* - Intruders can consume all your server's resources and cause unavailability.
*
*
* INFO: To start a secure server without mandating TLS for clients,
* consider --accept-sql-without-tls instead. For other options, see:
* 
* - https://go.crdb.dev/issue-v/53404/v21.1
* - https://www.cockroachlabs.com/docs/v21.1/secure-a-cluster.html
*
*
* ERROR: ERROR: could not cleanup temporary directories from record file: could not lock temporary directory /dangerzone/omicron_tmp3/crdb-base/cockroach-temp900142307, may still be in use: resource temporarily unavailable
*
ERROR: could not cleanup temporary directories from record file: could not lock temporary directory /dangerzone/omicron_tmp3/crdb-base/cockroach-temp900142307, may still be in use: resource temporarily unavailable
Failed running "start-single-node"

This looks to me like a variant of the issue Sean mentioned above. In his example, one cockroach instance started from a copy of the seed reached back into the seed directory and removed a temporary directory that a concurrently-running test was trying to copy. In this case, the victim was a second cockroach process whose copied store still recorded a temp directory inside the seed: when it tried to lock that directory for cleanup, it found it already locked by another process. I'm hopeful that #553 will resolve this case.
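
For intuition about the failure mode: each CockroachDB store keeps a temp-dirs-record.txt listing the absolute paths of its temporary directories, and on startup CockroachDB tries to lock and remove each recorded path. Roughly, in Rust (purely illustrative -- CockroachDB itself is Go and its locking details differ):

```
use nix::fcntl::{flock, FlockArg};
use std::fs;
use std::os::unix::io::AsRawFd;
use std::path::Path;

// Attempt to clean up every temp directory named in the store's record
// file. If another process still holds a lock on one of them -- here, a
// concurrent test's cockroach that recorded the same seed path -- this
// fails with "resource temporarily unavailable".
fn cleanup_recorded_temp_dirs(store: &Path) -> anyhow::Result<()> {
    let record = fs::read_to_string(store.join("temp-dirs-record.txt"))?;
    for dir in record.lines().filter(|l| !l.trim().is_empty()) {
        let handle = fs::File::open(dir)?;
        flock(handle.as_raw_fd(), FlockArg::LockExclusiveNonblock).map_err(|e| {
            anyhow::anyhow!("could not lock temporary directory {}: {}", dir, e)
        })?;
        fs::remove_dir_all(dir)?;
    }
    Ok(())
}
```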

@davepacheco
Collaborator

After another 6 iterations in 33m, I ran into another one:

failures:

---- db::collection_insert::test::test_collection_not_present stdout ----
log file: "/dangerzone/omicron_tmp3/omicron_nexus-4c6b9fc6ba27886c-test_collection_not_present.15713.3.log"
note: configured to log to "/dangerzone/omicron_tmp3/omicron_nexus-4c6b9fc6ba27886c-test_collection_not_present.15713.3.log"
thread 'db::collection_insert::test::test_collection_not_present' panicked at 'called `Result::unwrap()` on an `Err` value: Exited', /home/dap/omicron/test-utils/src/dev/mod.rs:143:42
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace


failures:
    db::collection_insert::test::test_collection_not_present

test result: FAILED. 51 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.97s

error: test failed, to rerun pass '-p omicron-nexus --lib'

Same failure mode:

dap@ivanova omicron $ bunyan "/dangerzone/omicron_tmp3/omicron_nexus-4c6b9fc6ba27886c-test_collection_not_present.15713.3.log"
[2021-12-23T13:27:54.634863443-08:00]  INFO: test_collection_not_present/15713 on ivanova: cockroach temporary directory: /dangerzone/omicron_tmp3/.tmpAEYv3f
[2021-12-23T13:27:54.635823478-08:00]  INFO: test_collection_not_present/15713 on ivanova: cockroach: copying from seed directory (/dangerzone/omicron_tmp3/crdb-base) to storage directory (/dangerzone/omicron_tmp3/.tmpAEYv3f/data)
[2021-12-23T13:27:54.661521060-08:00]  INFO: test_collection_not_present/15713 on ivanova: cockroach command line: cockroach start-single-node --insecure --http-addr=:0 --store /dangerzone/omicron_tmp3/.tmpAEYv3f/data --listen-addr 127.0.0.1:0 --listening-url-file /dangerzone/omicron_tmp3/.tmpAEYv3f/listen-url
dap@ivanova omicron $ ls -l /dangerzone/omicron_tmp3/.tmpAEYv3f
total 7
-rw-r--r--   1 dap      staff       1186 Dec 23 13:27 cockroachdb_stderr
-rw-r--r--   1 dap      staff          0 Dec 23 13:27 cockroachdb_stdout
drwxr-xr-x   5 dap      staff         19 Dec 23 13:27 data/
dap@ivanova omicron $ cat /dangerzone/omicron_tmp3/.tmpAEYv3f/cockroachdb_stderr 
*
* WARNING: ALL SECURITY CONTROLS HAVE BEEN DISABLED!
* 
* This mode is intended for non-production testing only.
* 
* In this mode:
* - Your cluster is open to any client that can access 127.0.0.1.
* - Intruders with access to your machine or network can observe client-server traffic.
* - Intruders can log in without password and read or write any data in the cluster.
* - Intruders can consume all your server's resources and cause unavailability.
*
*
* INFO: To start a secure server without mandating TLS for clients,
* consider --accept-sql-without-tls instead. For other options, see:
* 
* - https://go.crdb.dev/issue-v/53404/v21.1
* - https://www.cockroachlabs.com/docs/v21.1/secure-a-cluster.html
*
*
* ERROR: ERROR: could not cleanup temporary directories from record file: could not lock temporary directory /dangerzone/omicron_tmp3/crdb-base/cockroach-temp859679150, may still be in use: resource temporarily unavailable
*
ERROR: could not cleanup temporary directories from record file: could not lock temporary directory /dangerzone/omicron_tmp3/crdb-base/cockroach-temp859679150, may still be in use: resource temporarily unavailable
Failed running "start-single-node"

I will probably put this aside for the time being and pick up trying to reproduce it in the new year once #553 is landed. Alternatively, at that point I think it would be fair to wait until we hit this organically in CI again, now that we expect to have more debugging information in the CI artifacts.

@davepacheco
Collaborator

Some hopeful news: with commit 3d266ba (from #553), after 63 iterations over 5h18m, the test suite has passed every time. That's much longer than it went before.

smklein added a commit that referenced this issue Dec 27, 2021
This should fix #540 by ensuring that no "temporary files" pointing to the seed directory exist after shutdown.

Additionally, adds an assertion that the temp-dirs-record.txt file is empty (as suggested in cockroachdb/cockroach#74231)
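
For illustration, that assertion could look something like this (a hedged sketch; the actual check in #553 may differ):

```
use std::path::Path;

// After gracefully shutting down the seed database, verify that the
// store's temp-dirs record is empty, so copies of the seed won't reach
// back into the seed directory on their next startup.
fn assert_seed_temp_dirs_cleaned(seed_dir: &Path) -> anyhow::Result<()> {
    let record = seed_dir.join("temp-dirs-record.txt");
    if record.exists() {
        let contents = std::fs::read_to_string(&record)?;
        anyhow::ensure!(
            contents.trim().is_empty(),
            "seed database still records temp dirs: {:?}",
            contents
        );
    }
    Ok(())
}
```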
@smklein
Collaborator

smklein commented Dec 27, 2021

With #553 merged, I'm considering this closed. Please feel free to re-open if we see any other flake related to DB setup / teardown / access!

@davepacheco
Collaborator

In retrospect, the "found pointer to free object" failure looks similar to the one reported under #1223.

smklein added a commit that referenced this issue Oct 27, 2024
This one has some history!

While working on some database-related APIs, I noticed that my tests
using CockroachDB were a **lot** slower than other tests. To some
degree I expect that, but these tests were spending ~5-10 seconds each
doing very little other than CockroachDB setup and teardown.

After a little profiling, I noticed that tests took several seconds to
perform teardown, and significantly longer if any schema changes had
occurred.

Why does teardown take so long? Well, it turns out we are sending
`SIGTERM` to the CockroachDB process to gracefully terminate it, instead
of `SIGKILL`, which would terminate it much more abruptly.

This is where the history comes in: Gracefully terminating CockroachDB
was a choice we made a few years ago to avoid a test flake:
#540. Basically, when
creating the "seed database" -- a database where we apply schema changes
that are shared between many tests -- we want to terminate gracefully to
avoid leaving the database in a "dirty state", where it might still need
to flush work and clean up intermediate state. In the case of #540, that
"dirty intermediate state" was an absolute path, which meant copies of
that seed database trampled on each other if graceful shutdown did not
complete.

Our approach was to apply graceful termination to all CockroachDB
teardown invocations, but this was overkill. Only the seed database
expects its storage to still be in use after the call to `cleanup` --
all other test-only invocations remove their storage immediately. They
don't need to terminate gracefully, and arguably should exit as quickly
as they can.

This PR changes the disposition:
- `cleanup_gracefully` uses `SIGTERM`, and waits for graceful cleanup.
This is still used when constructing the seed db.
- `cleanup` uses `SIGKILL`, and kills the database immediately. This is
now used for all other use-cases.
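
In code terms, the split looks roughly like this -- a minimal sketch assuming a `tokio::process::Child` for the CockroachDB process and the `nix` crate for signaling, not the literal omicron implementation:

```
use nix::sys::signal::{kill, Signal};
use nix::unistd::Pid;
use tokio::process::Child;

/// Graceful teardown: only the seed database needs this, because its
/// on-disk state must be consistent (and its temp dirs cleaned up)
/// after exit.
async fn cleanup_gracefully(db: &mut Child) -> anyhow::Result<()> {
    let pid = Pid::from_raw(db.id().expect("child already reaped") as i32);
    // SIGTERM asks CockroachDB to drain and flush before exiting.
    kill(pid, Signal::SIGTERM)?;
    db.wait().await?;
    Ok(())
}

/// Fast teardown: the storage directory is deleted immediately
/// afterwards, so there is no on-disk state worth preserving.
async fn cleanup(db: &mut Child) -> anyhow::Result<()> {
    // SIGKILL terminates the process without any flush or cleanup work.
    db.kill().await?;
    Ok(())
}
```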

As an example in the performance difference, here's a comparison for
some datastore tests:

## Before

```
  SETUP PASS [      1/1] crdb-seed: cargo run -p crdb-seed --profile test
        PASS [   6.996s] nexus-db-queries db::datastore::db_metadata::test::ensure_schema_is_current_version
        PASS [   7.344s] nexus-db-queries db::datastore::db_metadata::test::schema_version_subcomponents_save_progress
        PASS [   8.609s] nexus-db-queries db::datastore::db_metadata::test::concurrent_nexus_instances_only_move_forward
------------
     Summary [  11.386s] 3 tests run: 3 passed, 228 skipped
```

## After

```
  SETUP PASS [      1/1] crdb-seed: cargo run -p crdb-seed --profile test
        PASS [   2.087s] nexus-db-queries db::datastore::db_metadata::test::ensure_schema_is_current_version
        PASS [   3.054s] nexus-db-queries db::datastore::db_metadata::test::schema_version_subcomponents_save_progress
        PASS [   4.442s] nexus-db-queries db::datastore::db_metadata::test::concurrent_nexus_instances_only_move_forward
------------
     Summary [   7.550s] 3 tests run: 3 passed, 228 skipped
```