[BUG]: jsonb type on postgres implement incorrectly #1511
Comments
Is this an accepted bug, or did I implement it incorrectly? I already tried to create a customType jsonb, but I still cannot create jsonb data that I can access with the ->> operator in Postgres. |
Hello @primadi, regarding your second question:

```ts
console.log(
  await db.query.tbl01.findMany({
    columns: {
      id: true,
      jsonb_col: true,
      // how do I add field01 and field02 here??
    },
  })
)
```

You can add an `extras` field:

```ts
console.log(
  await db.query.tbl01.findMany({
    columns: {
      id: true,
      jsonb_col: true,
    },
    extras: {
      field01: sql<number>`${tbl01.jsonb_col}->>'field01'`.as('field01'), // incorrect: returns null, it should be "100"
    },
  })
)
```

Though I'm having the same issue regarding accessing jsonb keys... |
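A side note on the extras sketch above: in Postgres, -> returns jsonb while ->> returns text, so even with correctly stored data a sql<number> extra comes back as a string unless it is mapped. A minimal sketch, assuming a db created with { schema } (so the relational query API knows tbl01) and the tbl01 table shown later in this thread:

```ts
import { sql } from 'drizzle-orm';

// ->> yields text; mapWith(Number) converts the "100" string back to 100 client-side.
const rows = await db.query.tbl01.findMany({
  columns: { id: true, jsonb_col: true },
  extras: {
    field01: sql<number>`${tbl01.jsonb_col}->>'field01'`.mapWith(Number).as('field01'),
  },
});
```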
I attempted this directly in Postgres, without drizzle involved, and I couldn't get it to work. |
Hi @Angelelz

```sql
select id, jsonb_col, jsonb_col->>'field01' as field01 from tbl01
```

It incorrectly returns null for field01; it should be 100 with the above data. But if we remove the double quotes (") at the start and end of the jsonb_col value using pgAdmin, it returns the result correctly. Thank you. |
@thomas-ndlss it works for the second question, thanks. |
I just ran the following query in both a local PG database and in the supabase console, and all I get is null. Please correct any mistake I might have:

```sql
CREATE TABLE IF NOT EXISTS public.tbl01
(
    id text NOT NULL primary key,
    jsonb_col jsonb NOT NULL
);
insert into "tbl01" ("id", "jsonb_col") values ('id01', '"{\"field01\":100,\"field02\":\"string 100\"}"') returning "id", "jsonb_col";
select id, jsonb_col, jsonb_col->>'field01' as field01, jsonb_path_exists(jsonb_col, '$.field01') from "tbl01";
```

Edit: My result from supabase:
|
Hi @Angelelz, this is the correct insert:

```sql
insert into "tbl01" ("id", "jsonb_col") values ('id02', '{"field01":100,"field02":"string 100"}') returning "id", "jsonb_col";
```

With the correct insert, this query returns the correct result:

```sql
select id, jsonb_col, jsonb_col->>'field01' as field01, jsonb_path_exists(jsonb_col, '$.field01') from "tbl01";
```

field01: 100 |
So the problem is actually on insert. I thought that if you didn't have valid JSON, the database wouldn't let you insert it?

```ts
console.log(
  await db
    .insert(tbl01)
    .values({ id: "id01", jsonb_col: JSON.stringify({ field01: 100, field02: "string 100" }) })
    .returning()
)
```
|
It produces this data in jsonb_col:

field01: null |
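The key detail here, as an aside: the double-encoded value is still valid JSON, just a JSON string instead of an object, so Postgres happily accepts the insert, and ->> then returns null because a scalar string has no keys. A small sketch of the double encoding (plain Node, no drizzle involved):

```ts
// A driver that serializes jsonb parameters itself will stringify the value
// again if it is already a string:
const obj = { field01: 100, field02: "string 100" };

const once = JSON.stringify(obj);   // '{"field01":100,...}'       -> stored as a jsonb object
const twice = JSON.stringify(once); // '"{\\"field01\\":100,...}"' -> stored as a jsonb string

console.log(typeof JSON.parse(once));  // "object"
console.log(typeof JSON.parse(twice)); // "string" -- no keys for ->> to find
```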
I tried:

```ts
console.log(
  await db
    .insert(tbl01)
    .values({
      id: "id02",
      jsonb_col: sql`'${JSON.stringify({
        field01: 100,
        field02: "string 100",
      })}'`,
    })
    .returning()
)
```

but it produces an error (presumably because drizzle's sql template binds the interpolated value as a parameter, so the surrounding single quotes make the statement invalid):
|
I had a similar issue. To get around it, I created a customJsonb type where JSON.stringify is skipped:

```ts
const customJsonb = <TData>(name: string) =>
  customType<{ data: TData; driverData: string }>({
    dataType() {
      return 'jsonb';
    },
    toDriver(value: TData) {
      return value;
    },
  })(name);
```

Drizzle throws a TypeScript error in the editor, but it works, and all the JSON functions work, ->> as well. I did some more checking in the logs of my server and see that Drizzle sends an SQL statement with the following parameter when using the jsonb type provided by Drizzle: |
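A usage sketch for the helper above, with an illustrative value type (the table and type names are assumptions, not from the comment):

```ts
import { customType, pgTable, text } from 'drizzle-orm/pg-core';

// Hypothetical table definition using the customJsonb helper:
const tbl01 = pgTable('tbl01', {
  id: text('id').primaryKey().notNull(),
  jsonb_col: customJsonb<{ field01: number; field02: string }>('jsonb_col').notNull(),
});
```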
@cbasje your solution works, thanks. I hope the Drizzle team can fix this. |
There might be an error in your code @primadi. I've changed the code, and I get:

```
[
{
id: 'id01',
jsonb_col: { field01: 100, field02: 'string 100' },
field01: 100,
field02: 'string 100'
}
]
```

Here's my full code:

```ts
import 'dotenv/config';
import { sql } from 'drizzle-orm';
import { drizzle as drizzleORM } from 'drizzle-orm/node-postgres';
import { jsonb, pgTable, text } from 'drizzle-orm/pg-core';
import { exit } from 'node:process';
import postgres from 'pg';
const tbl01 = pgTable('tbl01', {
id: text('id').primaryKey().notNull(),
jsonb_col: jsonb('jsonb_col').notNull(),
});
export const pool = new postgres.Pool({
connectionString: process.env.DATABASE_URL,
});
export const db = drizzleORM(pool);
async function main() {
console.log('Clear all');
await db.delete(tbl01);
console.log('INSERT INTO tbl01');
console.log(
await db
.insert(tbl01)
.values({
id: 'id01',
jsonb_col: { field01: 100, field02: 'string 100' },
})
.returning()
);
console.log('SELECT QUERY FROM tbl01');
console.log(
await db
.select({
id: tbl01.id,
jsonb_col: tbl01.jsonb_col,
field01: sql`${tbl01.jsonb_col}->'field01'`, // 100
field02: sql`${tbl01.jsonb_col}->'field02'`, // "string 100"
})
.from(tbl01)
);
}
try {
await main();
} catch (error) {
console.error(error.message);
exit(1);
}
exit(0);
```
|
The code by @rogiervandenberg above does work, because I figured out that this problem is not present when using node-postgres. For some reason, it is only present in the connection with PostgresJS. I see from the related PR that it is an issue with PostgresJS itself: porsager/postgres#392. So maybe a better solution for now is to use node-postgres instead of my customType solution 😅. |
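A minimal sketch of that driver swap, assuming an existing postgres-js setup:

```ts
// Before (postgres-js), where the jsonb double-encoding shows up:
//   import { drizzle } from 'drizzle-orm/postgres-js';
//   import postgres from 'postgres';
//   const db = drizzle(postgres(process.env.DATABASE_URL!));

// After (node-postgres):
import { drizzle } from 'drizzle-orm/node-postgres';
import pg from 'pg';

const pool = new pg.Pool({ connectionString: process.env.DATABASE_URL });
const db = drizzle(pool);
```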
I've written a helper function to deal with nested jsonb fields in a type-safe way. Maybe it can be helpful for someone.
You can use it like this:
All the arguments will be type safe. (This assumes that you defined the type of the jsonb field in the schema declaration using the `$type` helper.) |
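The helper itself isn't reproduced in this thread; purely as a hypothetical sketch of the idea (the name jsonbField and its signature are illustrative, not the author's code):

```ts
import { sql, type AnyColumn, type SQL } from 'drizzle-orm';

// Hypothetical: read a top-level jsonb key with ->>. Postgres returns text,
// so the caller chooses the TypeScript type (and any client-side mapping).
function jsonbField<T>(column: AnyColumn, key: string): SQL<T> {
  return sql<T>`${column}->>${key}`;
}

// e.g. db.select({ field01: jsonbField<string>(tbl01.jsonb_col, 'field01') }).from(tbl01)
```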
It was merged and fixed by patching the client you are providing to drizzle.
It is available in beta. Before it lands in latest, I will prepare a guide on how to fix an existing database, and will reuse and mention some of the comments from this issue and a PR I've merged. |
Should be fixed in |
What version of drizzle-orm are you using?
0.29.0

What version of drizzle-kit are you using?
0.20.1

Describe the Bug

Expected behavior

The jsonb type incorrectly saves data as a JSON string, so we cannot query data fields using the ->> operator in Postgres.
Environment & setup
No response