Deprecate json_import and json_export and switch testing to databroker-pack #479

Add the unpack step in the GitHub Actions workflow, as sketched below.
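
For illustration only, a hypothetical workflow step that restores a packed catalog before the tests run; the path, catalog name, and step layout here are assumptions, not the project's actual workflow:

```yaml
# Hypothetical step in a .github/workflows/*.yml file.
- name: Unpack test catalog
  run: |
    pip install databroker-pack
    databroker-unpack inplace ./tests/packed_data apstools_test
```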

We've done some of this in …

That's a good example of a problem this project may also face. Likely it is how GitHub Actions provides a working directory. Perhaps the YAML file is placed in a directory that GitHub discards after pytest finishes?

In a directory containing the exported test data files:

```py
from apstools.utils import json_export, json_import


def get_db(json_file, zip_file):
    """Build a temporary databroker catalog from the exported JSON data."""
    import databroker

    db = databroker.temp()
    datasets = json_import(json_file, zip_file)
    insert_docs(db, datasets)
    return db


def insert_docs(db, datasets, verbose=False):
    """Insert each run's documents through the catalog's v1 interface."""
    db = db.v1
    for i, h in enumerate(datasets):
        if verbose:
            print(f"{i+1}/{len(datasets)} : {len(h)} documents")
        for k, doc in h:
            db.insert(k, doc)
```

Then, in that same directory:

```py
from apstools.utils import listruns

dcat = get_db("data.json", "bluesky_data.zip")
listruns(db=dcat)
```
with this output:

```
catalog name: temp
========= ========================== ======= ======= ========================================
short_uid date/time exit scan_id command
========= ========================== ======= ======= ========================================
3e89a55 2019-05-24 10:47:11.731741 success 131 count(detectors=['adsimdet'], num=1)
2edf5d0 2019-04-12 12:58:41.239802 success 2 scan(detectors=['noisy_det'], num=8, ...
ffb80ba 2019-05-06 15:03:00.163182 success 102 count(detectors=['noisy'], num=100)
0e8188e 2019-05-06 15:02:56.365410 success 101 count(detectors=['noisy'], num=100)
a729093 2019-05-06 16:39:06.248241 success 127 count(detectors=['scaler'], num=1)
67b7ef3 2019-05-06 15:22:21.472708 success 107 count(detectors=['scaler', 'noisy'], ...
22db858 2019-05-06 15:02:51.853075 success 100 count(detectors=['noisy'], num=100)
f0a39ab 2019-04-11 16:05:49.970579 success 1 scan(detectors=['noisy_det'], num=8, ...
0a87c46 2019-05-06 15:25:21.014055 success 109 count(detectors=['scaler'], num=5)
7ef69ea 2019-04-11 15:51:39.778569 success 2 scan(detectors=['noisy_det'], num=8, ...
4d41c06 2019-04-11 15:59:06.373829 success 1 scan(detectors=['noisy_det'], num=8, ...
837ffac 2019-05-06 15:02:46.729177 success 99 count(detectors=['noisy'], num=100)
50cd05b 2019-04-11 16:05:50.739003 success 3 scan(detectors=['noisy_det'], num=8, ...
64d4ed4 2019-05-06 15:59:29.680473 success 113 count(detectors=['scaler_channels_ch ...
75f68f4 2019-05-06 15:01:16.722168 success 94 count(detectors=['noisy'], num=10)
389cf14 2019-04-12 12:58:40.786560 success 1 scan(detectors=['noisy_det'], num=8, ...
bb7e048 2019-04-12 12:59:16.131039 success 2 scan(detectors=['noisy_det'], num=8, ...
616de31 2019-04-12 10:25:34.788890 success 2 scan(detectors=['noisy_det'], num=8, ...
9af10cf 2019-05-06 15:01:25.681316 success 95 count(detectors=['noisy'], num=100)
2551749 2019-04-12 10:25:34.315701 success 1 scan(detectors=['noisy_det'], num=8, ...
========= ========================== ======= ======= ========================================
```

Similar for the USAXS test data:

```py
import json
import zipfile

import databroker

from apstools.utils import listruns


def get_test_data(json_file, zip_file):
    """Get document streams as a dict from a zip file."""
    with zipfile.ZipFile(zip_file, "r") as fp:
        buf = fp.read(json_file).decode("utf-8")
    return json.loads(buf)


objs = get_test_data("usaxs_docs.json.txt", "usaxs_docs.json.zip")
ucat = databroker.temp()
insert_docs(ucat, objs.values(), verbose=True)  # insert_docs() defined above
print(f"{ucat.v2.name = }")
print(f"{len(ucat.v2) = }")
listruns(db=ucat)
```

with this output:

```
1/10 : 4 documents
2/10 : 7 documents
3/10 : 7 documents
4/10 : 37 documents
5/10 : 7 documents
6/10 : 37 documents
7/10 : 41 documents
8/10 : 27 documents
9/10 : 37 documents
10/10 : 7 documents
ucat.v2.name = 'temp'
len(ucat.v2) = 10
catalog name: temp
========= ========================== ======= ======= ========================================
short_uid date/time exit scan_id command
========= ========================== ======= ======= ========================================
2ffe4d8 2019-05-02 17:45:33.937294 success 108 tune_mr()
3554003 2019-05-02 15:38:37.612823 success 103 tune_ar()
fdf496e 2019-04-23 14:52:04.605015 success 27 run_Excel_file()
1996598 2019-05-02 17:48:29.729382 success 110 Flyscan(pos_X=60, pos_Y=160, thickne ...
e5d2cbd 2019-05-02 18:17:58.932330 success 1 snapshot()
6cfeb21 2019-05-02 15:38:30.190181 success 102 tune_m2rp()
ddffefc 2019-05-02 17:48:20.934118 success 109 measure_USAXS_Transmission(detectors ...
99fe9e0 2019-04-23 16:09:54.520233 success 2 TuneAxis.tune()
b0aa643 2019-05-02 15:38:56.536864 success 104 tune_a2rp()
555a604 2019-05-02 16:53:31.423197 success 2 count(detectors=['scaler0'], num=1)
========= ========================== ======= ======= ========================================
```

Then, created catalog configuration YAML files, with steps such as the sketch below.
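
For illustration only: one plausible shape for such a configuration file, assuming the msgpack-backed catalog driver; the source name and path here are placeholders, not the files actually created.

```yaml
# Hypothetical databroker catalog configuration (name and path are made up).
sources:
  apstools_test:
    driver: bluesky-msgpack-catalog
    args:
      paths:
        - "/path/to/exported/documents/*.msgpack"
```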
Now have these catalogs available, which can be confirmed as sketched below.
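
A quick check of the registered catalogs; the names shown in the comment are hypothetical examples:

```py
import databroker

# Catalog names discovered from the catalog configuration YAML files.
print(list(databroker.catalog))  # e.g. ['apstools_test', 'usaxs_test'] (placeholders)
```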
Pack each using `databroker-pack`, with commands like the sketch below, then make …
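
A hedged sketch of the pack step, using the documented `databroker-pack CATALOG --all DIRECTORY` form; the catalog names and output directories are placeholders:

```bash
# Export every run from each catalog into a portable directory of documents.
databroker-pack apstools_test --all ./packed/apstools_test
databroker-pack usaxs_test --all ./packed/usaxs_test
```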
Refactor the unit tests to use `databroker-pack` and `databroker-unpack`.

Originally posted by @prjemian in #475 (comment)
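
To make the proposal concrete, a minimal sketch of how a test could load packed data, assuming a packed directory is committed at a hypothetical `tests/packed_data/` path and that it contains the `catalog.yml` that `databroker-pack` writes alongside the documents:

```py
from pathlib import Path

import intake
import pytest

# Hypothetical location of data produced by `databroker-pack ... --all`.
PACKED = Path(__file__).parent / "packed_data"


@pytest.fixture()
def cat():
    # Open the packed catalog directly from its generated catalog file,
    # without registering it in the global databroker configuration.
    return intake.open_catalog(str(PACKED / "catalog.yml"))


def test_catalog_has_runs(cat):
    # Placeholder assertion; real tests would query known runs by uid or scan_id.
    assert len(list(cat)) > 0
```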