[BUG]: jsonb always inserted as a json string when using postgres-js. #724
Comments
I was working on a fix there; it's kinda stale, as it would need different behavior for pg, which expects arrays to be passed as a string: #666
same issue here
same issue here +1
+1 - json/jsonb shouldn't be listed in the docs as a supported type while this bug exists. My workaround was to use db.execute and pass objects into the sql`` template:

```ts
const statement = sql`
  INSERT INTO wikidata_article (wikidata_id, category, grade, en_raw_length, en_url_title, labels, sitelinks)
  VALUES (${articleDetails.wikiDataId}, ${articleDetails.category}, ${articleDetails.grade}, ${articleDetails.enBytes}, ${articleDetails.enUrlTitle}, ${articleDetails.labels}, ${articleDetails.sitelinks})
  ON CONFLICT (wikidata_id) DO UPDATE
  SET
    category = ${articleDetails.category},
    grade = ${articleDetails.grade},
    en_raw_length = ${articleDetails.enBytes},
    en_url_title = ${articleDetails.enUrlTitle},
    labels = ${articleDetails.labels},
    sitelinks = ${articleDetails.sitelinks}
`;
await db.execute(statement);
```
same issue. :(
This works well as a temporary solution! You could also do a hybrid approach where you call

```ts
await db
  .insert(table)
  .values({
    id: id,
    // ...
  })
```

and then call

```ts
await db.execute(
  sql`UPDATE table SET property = ${property} WHERE id = ${id}`
)
```

afterward to handle the value that's jsonb. Note that because this would be two separate calls, I would only recommend this if you're inserting a lot of columns at once or constantly changing your schema and want to maximize type safety (if you rename your columns in …).
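A hedged sketch of that hybrid approach wrapped in a drizzle transaction, so a failure between the two statements can't leave a row with a missing jsonb value (`table`, `property`, and `id` are the placeholder names from the comment above, not real schema objects):

```ts
import { sql } from "drizzle-orm";

await db.transaction(async (tx) => {
  // Typed insert for every column except the jsonb one
  await tx.insert(table).values({ id: id /* , ...other columns */ });
  // Raw SQL for the jsonb column, bypassing the broken serialization
  await tx.execute(
    sql`UPDATE table SET property = ${property} WHERE id = ${id}`,
  );
});
```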
Isn't the arbitrary string failing to be detected as json/jsonb by postgres? I stumbled over this today when I realized that somehow all values are escaped and the entire object is treated as a scalar. I have a workaround with …

Looking at the complexity of postgres json handling, one can appreciate drizzle's design of going down to plain sql/functions rather than abstracting it away or leaving it out entirely as other "ORMs" do. I end up doing something like this: …
I too was able to bypass this bug by wrapping the jsonb value in my values call with sql``:

```ts
const product = await tx
  .insert(products)
  .values({
    entityId: input.entityId,
    eventType: input.eventType,
    payload: sql`${input.payload}::jsonb`,
  })
```
The workarounds discussed here will fail when using an array instead of an object. My column type is … This is the code I am using:

```ts
const example = { identifier: "xyz", hostnames: ["foo.bar.com", "bar.foo.com"] };

await db.insert(assets).values(input).onConflictDoUpdate({
  target: assets.identifier,
  set: input,
});
return db.execute(
  sql`UPDATE assets SET hostnames = ${input.hostnames} WHERE identifier = ${input.identifier}`,
);
```

The constructed query looks like this: … which is of course wrong, because it uses one argument for each array element. Any suggestions?

Edit: Using this custom jsonb type fixed it for me: #666 (comment)
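For the array case, a hedged sketch that builds on the `::jsonb` cast used earlier in the thread: stringifying first makes the whole array a single text parameter that Postgres casts to jsonb, rather than the driver expanding the JS array into one parameter per element.

```ts
import { sql } from "drizzle-orm";

// Sketch only, reusing the commenter's `assets` table and `input` value.
await db.execute(
  sql`UPDATE assets
      SET hostnames = ${JSON.stringify(input.hostnames)}::jsonb
      WHERE identifier = ${input.identifier}`,
);
```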
@rverton same issue with me
I used raw sql, but it didn't work for me because it stores the array as an object. I found a workaround using a custom type here and it worked for me. I don't need to use raw sql, and the array still gets stored as an array.
Any updates? Would you accept a PR?
I am curious why this was implemented this way in the first place and hasn't been patched to the expected behaviour for months.
A better workaround: #666 (comment)
This solution works great, but I had a problem with it: I insert an array into a jsonb field, and for some reason when I passed this array and it had 2+ elements, the elements were destructured.

Query:

```ts
const array_for_jsonb_field = [{ key1: 1 }, { key2: 2 }];
sql`INSERT INTO "my_table" VALUES (${id}, ${val}, ${array_for_jsonb_field})`;
```

Result:

```sql
INSERT INTO "my_table" VALUES ($1, $2, ($3, $4))
-- $3 and $4 should be a single array, but the 2-element array got destructured into 2 params
```

In case someone encounters the same problem, you can do the following: wrap the array with `new Param` (exported from drizzle-orm):

```ts
import { Param, sql } from "drizzle-orm";

const array_for_jsonb_field = [{ key1: 1 }, { key2: 2 }];
sql`INSERT INTO "my_table" VALUES (${id}, ${val}, ${new Param(array_for_jsonb_field)})`;
```

Result:

```sql
INSERT INTO "my_table" VALUES ($1, $2, $3)
```
My use case: table schema … But if I use … then the API response is the same, but TablePlus … Using this #666 (comment) also fixed the issue, I can confirm.
I just contributed to the bounty on this issue. Each contribution to this bounty has an expiry time and will be auto-refunded to the contributor if the issue is not solved before then. To make this a public bounty or have a reward split, the maintainer can reply to this comment.

I just contributed to the bounty on this issue: https://until.dev/bounty/drizzle-team/drizzle-orm/724 The current bounty for completing it is $70.00 if it is closed within 27 days, and decreases after that. Others can also contribute to the bounty to increase it.
@MariuzM I don't think this is an issue; it's by design. JSONB stores JSON data in a binary representation, which eliminates whitespace, duplicate keys, and key ordering. If key ordering is important to you, use JSON instead, but you'll lose all the other benefits of JSONB.
@MariuzM as indicated by @pfurini, it's postgres's jsonb design. Quoting the Postgres docs: "By contrast, jsonb does not preserve white space, does not preserve the order of object keys, and does not keep duplicate object keys. If duplicate keys are specified in the input, only the last value is kept."
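A small hedged demo of the behavior described above (assuming `db.execute` returns rows for a plain select, as with the postgres-js driver):

```ts
import { sql } from "drizzle-orm";

// json keeps the input text verbatim; jsonb normalizes it on input.
const rows = await db.execute(sql`
  SELECT '{"b": 1, "a": 2, "a": 3}'::json  AS as_json,
         '{"b": 1, "a": 2, "a": 3}'::jsonb AS as_jsonb
`);
// as_json  -> {"b": 1, "a": 2, "a": 3}   (whitespace, key order, duplicates preserved)
// as_jsonb -> {"a": 3, "b": 1}           (keys reordered; only the last duplicate kept)
```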
This diff:

```diff
diff --git a/node_modules/drizzle-orm/postgres-js/driver.js b/node_modules/drizzle-orm/postgres-js/driver.js
index 7e48e8c..219e0a0 100644
--- a/node_modules/drizzle-orm/postgres-js/driver.js
+++ b/node_modules/drizzle-orm/postgres-js/driver.js
@@ -12,6 +12,8 @@ function drizzle(client, config = {}) {
     client.options.parsers[type] = transparentParser;
     client.options.serializers[type] = transparentParser;
   }
+  client.options.serializers['114'] = transparentParser;
+  client.options.serializers['3802'] = transparentParser;
   const dialect = new PgDialect();
   let logger;
   if (config.logger === true) {
```
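A hedged sketch of the same idea applied from application code instead of editing node_modules. It pokes the same postgres-js internals (`client.options.serializers`; 114 and 3802 are Postgres's built-in type OIDs for json and jsonb), so treat it as a stopgap that depends on implementation details:

```ts
import postgres from "postgres";
import { drizzle } from "drizzle-orm/postgres-js";

const client = postgres(process.env.DATABASE_URL!);

// Pass json/jsonb parameters through untouched so postgres-js doesn't
// JSON.stringify a value that drizzle has already serialized.
const transparentParser = (val: unknown) => val;
for (const oid of ["114", "3802"]) {
  (client.options.serializers as any)[oid] = transparentParser;
}

export const db = drizzle(client);
```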
This urgently needs to be fixed.
I just wanted to point out that this issue breaks creating and editing … ☝️ Those of you who applied …
Out of curiosity, would this not introduce a possibility for SQL injection? Or does doing …
There has got to be a solution. A major version change if needed, but the current state is really broken!
@tonyxiao Agreed, this is a major issue that needs to be addressed.
Any updates on this?
I resolved this issue by writing a custom wrapper which handles jsonb conversion regardless of the shape of the data.
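The commenter's wrapper isn't shown in the thread; below is a minimal hedged sketch of one such helper, generalizing the `::jsonb` cast workarounds from earlier comments so it handles objects and arrays alike (arrays would otherwise be split into one parameter per element):

```ts
import { sql, type SQL } from "drizzle-orm";

// Hypothetical helper: serialize once ourselves, bind the result as a
// single text parameter, and let Postgres cast it to jsonb.
export function toJsonb(value: unknown): SQL {
  return sql`${JSON.stringify(value)}::jsonb`;
}

// Usage, e.g. in an insert:
// await db.insert(logs).values({ line: toJsonb({ level: "info", msg: "ok" }) });
```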
Finally, after hours of searching for a good solution, yours worked for me. Cheers!
Glad that it works. Cheers!
Can this be closed now that #1785 is merged?
It was merged; it's fixed by patching the client you are providing to drizzle.
It is available in … Before latest, I will prepare a guide on how to fix an existing database, and will reuse and mention some of the comments from this issue and the PR I've merged.
Should be fixed in …
@gczh the …
What version of drizzle-orm are you using?
0.26.5

What version of drizzle-kit are you using?
0.18.1
Describe the Bug
Inserting an object into a postgres jsonb field with db.insert only inserts a string when using the postgres-js adapter.
Expected behavior
With the pg package, an object is inserted using the code below, which is the expected behavior.
With the postgres-js package, a string is inserted into the table using the same code.
Environment & setup
drizzle packages as above, plus "pg": "8.11.0" and "postgres": "3.3.5"
schema.ts:

```ts
import { pgTable, jsonb } from "drizzle-orm/pg-core";

export const logs = pgTable("log", {
  line: jsonb("line").$type(),
});
```
load.ts:

```ts
// rl is a readline interface over the log file (setup omitted by the reporter)
let lines: { line: object }[] = [];
let n = 0;
for await (const line of rl) {
  const lineObj = JSON.parse(line);
  lines.push({ line: lineObj });
}
await runFunction(lines);
```

runFunction:

```ts
runFunction: async (lines) => {
  await db.insert(logs).values(lines);
}
```