Activity
istarion commented on Dec 19, 2019
You can obtain a connection with the ConnectionFactoryUtils helper method:
val connection = ConnectionFactoryUtils.getConnection(connectionFactory).awaitSingle()
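For illustration, a minimal Java/Reactor sketch of the same idea that also releases the connection afterwards — the ConnectionFactoryUtils package shown (spring-r2dbc) and the foo table are assumptions, not something confirmed in this thread:

import io.r2dbc.spi.ConnectionFactory;
import io.r2dbc.spi.Result;
import org.springframework.r2dbc.connection.ConnectionFactoryUtils;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

class RawBatchExample {

    // Borrow the (possibly transaction-bound) connection, run an unparametrized
    // Batch on it, and release the connection once the batch has completed.
    Mono<Void> insertTwoRows(ConnectionFactory connectionFactory) {
        return Mono.usingWhen(
                ConnectionFactoryUtils.getConnection(connectionFactory),
                connection -> Flux.from(connection.createBatch()
                                .add("INSERT INTO foo VALUES ('one')")
                                .add("INSERT INTO foo VALUES ('two')")
                                .execute())
                        .flatMap(Result::getRowsUpdated)
                        .then(),
                connection -> ConnectionFactoryUtils.releaseConnection(connection, connectionFactory));
    }
}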
deblockt commented on Dec 19, 2019
Thanks. So far I have used ConnectionAccessor to get the same connection as DatabaseClient.
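A minimal sketch of that approach, assuming spring-data-r2dbc's ConnectionAccessor (the later Spring Framework DatabaseClient exposes the same inConnection/inConnectionMany methods directly); the foo table is illustrative:

import io.r2dbc.spi.Result;
import org.springframework.data.r2dbc.core.ConnectionAccessor;
import reactor.core.publisher.Flux;

class ConnectionAccessorBatchExample {

    // Runs a raw, unparametrized batch on the connection the client manages,
    // so it participates in the surrounding transaction without manual release.
    Flux<Result> batchInsert(ConnectionAccessor accessor) {
        return accessor.inConnectionMany(connection -> Flux.from(connection.createBatch()
                        .add("INSERT INTO foo VALUES ('one')")
                        .add("INSERT INTO foo VALUES ('two')")
                        .execute())
                .cast(Result.class));
    }
}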
mp911de commented on Jan 22, 2020
Currently, we don't have support for batching. There are two types of batches that R2DBC supports:
Unparametrized via Connection.createBatch()
Parametrized via Statement.add()
Unparametrized batches could be supported through execute(List<String>). However, the execute interfaces expose bind(…) methods that do not seem appropriate in that context, and we would have to introduce another set of interfaces.
Parametrized statements seem straightforward, but there is a caveat due to named parameter resolution. Named parameter support checks whether a bound value is a collection type. If so, the SQL expansion creates a parameter placeholder for each element of the Collection.
Example:
client.execute("INSERT INTO foo VALUES(:my_param)")
    .bind("my_param", "a-value")
Resulting SQL (Postgres syntax):
INSERT INTO foo VALUES($1)
client.execute("INSERT INTO foo VALUES(:my_param)")
    .bind("my_param", Arrays.asList("one", "two"))
Resulting SQL (Postgres syntax):
INSERT INTO foo VALUES($1, $2)
For batching, this example makes little sense, as binding a collection to an INSERT has very little use. The point I'm trying to make is that if the parameter multiplicity changes across bindings while named parameter processing is enabled, the resulting SQL is no longer the same and we cannot use batching:
client.execute("INSERT INTO foo VALUES(:my_param)")
    .bind("my_param", Arrays.asList("one"))
    .add()
    .bind("my_param", Arrays.asList("one", "two"))
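For comparison, the parametrized flavor described above is already available on the raw R2DBC SPI, where add() saves the current binding and starts a new one; a minimal Java sketch (Postgres-style placeholders, illustrative table and values):

import io.r2dbc.spi.Connection;
import io.r2dbc.spi.Result;
import reactor.core.publisher.Flux;

class ParametrizedBatchSketch {

    // One SQL string, several bindings: add() seals the current binding and
    // creates the next one; execute() then runs the statement once per binding.
    Flux<Result> insertRows(Connection connection) {
        return Flux.from(connection.createStatement("INSERT INTO foo VALUES ($1)")
                        .bind(0, "one")
                        .add()
                        .bind(0, "two")
                        .execute())
                .cast(Result.class);
    }
}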
gjgarryuan commented on Jun 18, 2020
@mp911de
I tried the code sample you provided above:
client.execute("INSERT INTO foo VALUES(:my_param)")
    .bind("my_param", Arrays.asList("one"))
    .add()
    .bind("my_param", Arrays.asList("one", "two"))
I assume the client is the DatabaseClient, where the DatabaseClient#bind method returns a BindSpec, not a Statement object on which Statement#add is available.
Is it currently possible to use parameterized statements to form batches via DatabaseClient?
In addition, is it possible to use a parameterized insert statement with multiple values? For instance, INSERT INTO foo VALUES ("foo", "bar"), ("FOO", "BAR"). Can I do something like:
spachip commented on Jul 15, 2020
Hi, is there any update on this? I couldn't find support for batch operations using DatabaseClient. Can you please confirm whether the current batch operation support is limited to the Statement and Connection objects?
abhinaba-chakraborty-by commented on Sep 15, 2020
Hey @mp911de,
I have a scenario where my table has an auto-generated id column and I need to bulk insert items into the database and fetch the generated ids. Is there any way I can achieve that?
This is my table:
CREATE TABLE test_table (
  `id` SERIAL NOT NULL,
  `name` VARCHAR(100) NOT NULL,
  `created_date` DATETIME NOT NULL,
  PRIMARY KEY (`id`)
);
To save a list of items, the code I am using:
The generated SQL statement (from DEBUG Logs):
I even tried using ConnectionFactory, still no clue.
mp911de commented on Sep 22, 2020
Depending on the database type, you need to tell the database to return generated keys (see the Postgres documentation on RETURNING). Then, extract the generated keys by consuming Result.map((row, metadata) -> row.get("name_of_id_column", Long.class)).
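A minimal sketch of how that can look for the bulk-insert case above on the raw SPI — the $1 placeholder, NOW(), and driver support for returnGeneratedValues are assumptions; the table and column names come from the earlier example:

import io.r2dbc.spi.Connection;
import reactor.core.publisher.Flux;

class GeneratedKeysSketch {

    // Insert two rows in one parametrized statement and stream back the
    // generated ids. On Postgres, appending "RETURNING id" to the SQL is the
    // equivalent of returnGeneratedValues("id").
    Flux<Long> insertAndReturnIds(Connection connection) {
        return Flux.from(connection.createStatement(
                                "INSERT INTO test_table (name, created_date) VALUES ($1, NOW())")
                        .bind(0, "first")
                        .add()
                        .bind(0, "second")
                        .returnGeneratedValues("id")
                        .execute())
                .flatMap(result -> result.map((row, metadata) ->
                        // Adjust the Java type to whatever your driver maps the id column to.
                        row.get("id", Long.class)));
    }
}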
luiccn commented on Dec 17, 2020
Any updates on this? It's quite an important feature to be overlooked for so long. Tomorrow is the one-year birthday of no batch inserts :(
mp911de commented on Dec 17, 2020
Thanks @luiccn for reminding us. Meanwhile, DatabaseClient went into Spring Framework, and we need to move this ticket there.
Keep in mind that contributing to open source is essential if you want something to happen sooner than it would take the regular way. We'd be more than happy to work with you on the final design if you find the time to come up with a proposal and a pull request.
aoudiamoncef commented on Jun 7, 2021
Hi @mp911de,
I'm interested in working on this issue. Could you please give me more information about it?
Thanks
markusheiden commented on Jul 9, 2023
Is there still no batching possible, except directly via the Connection?
Using raw batches on the connection is risky because all escaping has to be done by hand.
Batches perform way better than multiple inserts, even when using Statement.add().add()....execute(). I wonder why Statement.add() does not use a batch under the hood?