
[Pg] Fix jsonb field escaping data before insert #666

Closed
wants to merge 15 commits

Conversation


@LeonAlvarez LeonAlvarez commented May 30, 2023

Right now the jsonb column is incorrectly escaping data before inserting it, which ends up as "{\"external_id\":40}" instead of {"external_id": 40}. Because of that, queries using jsonb syntax are not working, for example the following query:

SELECT * FROM categories
WHERE (meta_data->>'external_id')::int = 164;

With the current drizzle behaviour we are forced to first cast the column to jsonb, since the escaped data is just a string (it can be stored in a jsonb column, but it really isn't jsonb):

SELECT * FROM categories
WHERE ((meta_data #>> '{}')::jsonb->>'external_id')::int = 164;
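The double encoding can be reproduced outside the ORM. A minimal sketch, assuming nothing but plain JSON.stringify (no drizzle involved): the driver already serializes JS objects, so stringifying first sends a JSON string instead of a JSON object.

```typescript
// Minimal repro of the double encoding (no drizzle involved).
const value = { external_id: 40 };

// What the driver should receive for a jsonb column:
const once = JSON.stringify(value);   // {"external_id":40}

// What it receives if the ORM also stringifies before the driver does:
const twice = JSON.stringify(once);   // "{\"external_id\":40}"

console.log(once);
console.log(twice);
```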

fix issue with escaping data on jsonb pg
@LeonAlvarez LeonAlvarez changed the title Fix jsonb field escaping data before insert [Pg] Fix jsonb field escaping data before insert May 30, 2023
Contributor

@dankochetov dankochetov left a comment


The integration tests fail with this change.

@LeonAlvarez
Author

LeonAlvarez commented Jun 4, 2023

The integration tests fail with this change.

I have fixed the postgresjs test. Now, should I extend the json insert test with a proper jsonb query, like:

test.serial('json insert', async (t) => {
	const { db } = t.context;

	await db.insert(usersTable).values({ name: 'John', jsonb: ['foo', 'bar'] });
	const result = await db.select({
		id: usersTable.id,
		name: usersTable.name,
		jsonb: usersTable.jsonb,
	}).from(usersTable);

	t.deepEqual(result, [{ id: 1, name: 'John', jsonb: ['foo', 'bar'] }]);
	const foo = await db.select({
		id: usersTable.id,
		name: usersTable.name,
		jsonb: usersTable.jsonb,
	}).from(usersTable).where(
		and(
			eq(usersTable.name, 'John'),
			sql.raw(`jsonb->>0 = 'foo'`),
		)
	);
	t.deepEqual(foo, [{ id: 1, name: 'John', jsonb: ['foo', 'bar'] }]);
});

Or is it better to leave just the jsonb query?

test.serial('json insert', async (t) => {
	const { db } = t.context;
	
	await db.insert(usersTable).values({ name: 'John', jsonb: ['foo', 'bar'] });
	const result = await db.select({
		id: usersTable.id,
		name: usersTable.name,
		jsonb: usersTable.jsonb,
	}).from(usersTable).where(
		and(
			eq(usersTable.name, 'John'),
			sql.raw(`jsonb->>0 = 'foo'`),
		)
	);
	
	t.deepEqual(result, [{ id: 1, name: 'John', jsonb: ['foo', 'bar'] }]);
});

This test will fail on my branch for the pg driver. Is there a way to do mapToDriver per driver?

@dankochetov
Contributor

dankochetov commented Jun 7, 2023

now should I extend insertjson with a proper jsonb query

sure, go ahead @LeonAlvarez

@oswaldoacauan

great! jsonb support is a must <3

@LeonAlvarez
Author

For now I'm solving it in userland, as there is no simple way to change the behaviour of a type per driver in core.

If you are using postgresjs and jsonb, simply create a custom jsonb type and use it instead of the one from pg-core:

import { customType } from 'drizzle-orm/pg-core';

const jsonb = customType<{ data: any }>({
  dataType() {
    return 'jsonb';
  },
  toDriver(val) {
    return val as any;
  },
  fromDriver(value) {
    if (typeof value === 'string') {
      try {
        return JSON.parse(value) as any;
      } catch {}
    }
    return value as any;
  },
});

export default jsonb;
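The parse-with-fallback in fromDriver above can be exercised standalone. A sketch under the assumption that some drivers return jsonb as text while others return it already parsed (the function name mirrors the hook, but nothing here depends on drizzle):

```typescript
// Standalone version of the fromDriver fallback above.
function fromDriver(value: unknown): unknown {
  if (typeof value === 'string') {
    try {
      // Drivers that return jsonb as text get parsed here...
      return JSON.parse(value);
    } catch {
      // ...while non-JSON strings fall through untouched.
    }
  }
  // Drivers that already parse jsonb pass through unchanged.
  return value;
}

console.log(fromDriver('{"external_id":40}')); // parsed object
console.log(fromDriver({ external_id: 40 }));  // passed through as-is
```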

@stunaz

stunaz commented Jun 22, 2023

Using your workaround, @LeonAlvarez, it's working great.

@ivanfeli

This is what I'm using, based on @LeonAlvarez's solution, with TS types, following Drizzle's own custom types example: https://orm.drizzle.team/docs/custom-types (except that driverData should be TData, not string, for it to work).

I assume it will work with 'jsonb' too.

export const customJson = <TData,>(name: string) =>
    customType<{ data: TData; driverData: TData }>({
        dataType() {
            return 'json'
        },
        toDriver(val: TData): TData {
            return val
        },
        fromDriver(value): TData {
            if (typeof value === 'string') {
                try {
                    return JSON.parse(value) as TData
                } catch {}
            }
            return value as TData
        },
    })(name)

@nikosantis

Any updates? 🥇

@realverse

Has anyone managed to use the customType or an alternative approach to insert when the column type is jsonb[]? The customType approach worked for jsonb, but I'm stuck on jsonb[]...

create table test (
  id serial primary key,
  data jsonb,
  list jsonb[]
);

export const test = pgTable("test", {
  id: serial("id").primaryKey().notNull(),
  data: jsonb("data"),
  list: jsonb("list").array(),
});

await db.insert(test).values({
  data: {
    name: 'Ali',
    age: 34,
  },
  list: [
    { animal: 'pig', color: 'pink' },
    { animal: 'bear', color: 'brown' },
  ],
});

I can write/read data fine. But getting below on list...

Query: insert into "test" ("id", "data", "list") values (default, $1, $2, $3) -- params: [{"name":"Ali","age":34}, "{[object Object],[object Object]}"]
invalid input syntax for type json
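The `{[object Object],[object Object]}` in the params hints that the array elements are hitting plain string coercion instead of JSON serialization. One hedged workaround sketch: a Postgres jsonb[] array literal needs each element JSON-stringified, escaped, and double-quoted individually. The helper below is hypothetical, not a drizzle API; it only illustrates the literal format.

```typescript
// Hypothetical helper: build a Postgres jsonb[] array literal.
// Each element is JSON-stringified, then backslashes and quotes are
// escaped for the {"...","..."} array syntax.
function toJsonbArrayLiteral(items: unknown[]): string {
  const quoted = items.map((item) => {
    const json = JSON.stringify(item);
    return '"' + json.replace(/\\/g, '\\\\').replace(/"/g, '\\"') + '"';
  });
  return '{' + quoted.join(',') + '}';
}

console.log(toJsonbArrayLiteral([
  { animal: 'pig', color: 'pink' },
  { animal: 'bear', color: 'brown' },
]));
```

The resulting string could then be bound as the jsonb[] parameter; whether the driver accepts it this way is something to verify against your setup.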

@xlc

xlc commented Nov 14, 2023

I don't know if there is a better way, but this works fine for me:

export const customJsonb = <TData>(name: string) =>
  customType<{ data: TData; driverData: TData }>({
    dataType() {
      return 'jsonb'
    },
    toDriver(val: TData) {
      return sql`(((${JSON.stringify(val)})::jsonb)#>> '{}')::jsonb`
    },
    fromDriver(value): TData {
      return value as TData
    },
  })(name)

and this can convert all the stringified JSON already in the table:

UPDATE table_name
SET column1 = (column1#>> '{}')::jsonb,
    column2 = (column2#>> '{}')::jsonb
;

@Angelelz
Collaborator

I was going to work on this issue, but I noticed this PR. What's the status, @LeonAlvarez? Let me know if you can complete the fix, or if you'd prefer me to take over.

@Angelelz
Collaborator

Closing this in favor of #1641
