cargo sqlx prepare expecting DATABASE_URL in workspace scenario #1223
Comments
I have been looking into fixing this and found the following mechanics. The caveat is that, in all that process, it also accesses DATABASE_URL to determine the type of database, which it then adds to the top-level object as the `db` key. Question: what is that being used for? Can I safely bypass it? I have manually merged those files with a small Python script, omitting the `db` key. Insight into the implications of the things I'm touching here from a developer of this project would be greatly appreciated; fumbling along the surface is very different from having a deeper understanding of the underpinning motivations and reasons.
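For reference, this is the shape the merged file appears to end up with, going by the scripts later in this thread; the hash and metadata below are made-up placeholders:

```python
# Illustrative structure of sqlx-data.json (entries are hypothetical):
# "db" is the top-level key that `cargo sqlx prepare` derives from DATABASE_URL;
# every other key is a query hash whose value is that query's cached metadata,
# i.e. the contents of the matching target/sqlx/query-<hash>.json file.
sqlx_data = {
    "db": "PostgreSQL",
    "0123abcd": {"...": "..."},  # hypothetical hash -> cached query metadata
}
```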
Ok, so the only workaround I can think of is to have individual sqlx-data.json files per subcrate. Pointers on where to start, to save time?
For the sake of completeness, a current workaround:

```python
import json
import glob

# Merge the per-query files that the macros emit into target/sqlx
# into a single sqlx-data.json, keyed by query hash.
data = [json.load(open(path, "r")) for path in glob.glob("target/sqlx/query-*.json")]
merged = {v["hash"]: v for v in data}
json.dump(merged, open("sqlx-data.json", "w"), indent=4)
```

The checking against offline for individual subcrates just works. It's the workspace-level merge that doesn't.
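If it helps anyone reproduce that, a minimal sketch of the per-subcrate offline check, assuming a hypothetical crate name:

```python
import os
import subprocess

# With SQLX_OFFLINE=true the sqlx macros read the cached offline data
# instead of connecting to a live database.
env = {**os.environ, "SQLX_OFFLINE": "true"}
# "my-subcrate" is a placeholder crate name.
subprocess.run(["cargo", "check", "-p", "my-subcrate"], env=env, check=True)
```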
I just ran into this same issue and adopted your script. Here's what I now have committed:

```python
#!/usr/bin/env python3
import json
import glob

# Same merge as above, but with the top-level "db" key that
# `cargo sqlx prepare` would normally derive from DATABASE_URL.
data = [json.load(open(path, "r")) for path in glob.glob("target/sqlx/query-*.json")]
merged = {"db": "PostgreSQL", **{v["hash"]: v for v in data}}
json.dump(merged, open("sqlx-data.json", "w"), indent=4)
```

There isn't actually a need to copy the merged file into each subcrate.
Updated script that fully replaces `cargo sqlx prepare`:

```python
#!/usr/bin/env python3
import json
import glob
import os
import shutil
import subprocess

# Start from a clean slate so stale query files don't leak into the merge.
shutil.rmtree("target/sqlx", ignore_errors=True)

# Force the macros to hit the live databases and re-emit the query files;
# check=True aborts before writing a bogus file if the build fails.
os.environ["SQLX_OFFLINE"] = "false"
subprocess.run(["cargo", "check", "--workspace"], check=True)

# Merge the per-query files into a single workspace-level sqlx-data.json.
data = [json.load(open(path, "r")) for path in glob.glob("target/sqlx/query-*.json")]
merged = {"db": "PostgreSQL", **{v["hash"]: v for v in data}}
json.dump(merged, open("sqlx-data.json", "w"), indent=4)
```
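One caveat with the script above, inferred from the `touch` in the shell variant below: if nothing changed since the last build, `cargo check` won't recompile anything, the macros never re-run, and target/sqlx stays empty. A sketch of the equivalent pre-step in Python, assuming the crates/*/src layout used below:

```python
import glob
import pathlib

# Bump source mtimes so `cargo check` actually recompiles each crate
# and the query macros re-emit their files into target/sqlx.
for src in glob.glob("crates/*/src/*.rs"):
    pathlib.Path(src).touch()
```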
Here is a shell script using `jq`:

```bash
#!/usr/bin/env bash
rm -rf target/sqlx
# Bump mtimes so `cargo check` actually recompiles and re-runs the macros.
touch crates/*/src/*.rs
# Unset DATABASE_URL so each subcrate's macros fall back to its own .env.
env -u DATABASE_URL SQLX_OFFLINE=false cargo check --workspace
# Slurp every query file into one array, key it by hash, and prepend "db".
jq -s '{"db": "MySQL"} + INDEX(.hash)' target/sqlx/query-*.json > sqlx-data.json
```
Thanks @arlyon. For reproducibility I think it's a good idea to make sure the hashes are sorted; it also reduces the probability of useless VCS diffs.

I can live with that workaround, but it's still a pretty big hack. 😞
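If it's useful, a minimal sketch of the sorted variant of the Python merge above (assuming PostgreSQL, as in the earlier scripts); the jq version can get the same effect with its `-S`/`--sort-keys` flag:

```python
import glob
import json

data = [json.load(open(path)) for path in glob.glob("target/sqlx/query-*.json")]
merged = {"db": "PostgreSQL", **{v["hash"]: v for v in data}}

# sort_keys=True makes the key order independent of glob order,
# which avoids spurious VCS diffs when queries are re-prepared.
with open("sqlx-data.json", "w") as f:
    json.dump(merged, f, indent=4, sort_keys=True)
```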
This doesn't seem to be working for me?
Same. I can't manually locate the target/sqlx directory. Did you find a solution?
I did not test this further; however, I found the following.
Hello, I wanted to make sure I understand correctly: doesn't sqlx support a workspace with crates that use different databases?
The issue, as I understand it, is that running `cargo sqlx prepare` at the workspace root expects a single DATABASE_URL. I get around this issue by running it in each subcrate separately.
You can also solve this by loading the subcrate's .env file, so the right DATABASE_URL is set, before running the command.
The original issue:

When generating sqlx-data.json offline information for CI/CD within a workspace whose subcrates target multiple databases, each containing its own valid .env file, `cargo sqlx prepare` expects DATABASE_URL to be set.

`cargo check` works as expected, with the macros fetching their respective DATABASE_URL from each workspace subcrate's .env.

The expected behavior would be for `cargo sqlx prepare` to do the same at the workspace root, or alternatively to not confuse the different DATABASE_URLs when generating the files independently in their respective subcrates for separation. `cargo sqlx prepare --merged`, if it is supposed to enable workspace-level sqlx-data.json merging, does not change this behaviour.

Spun out of #121.