[Add] Multiple insert with logging implemented #302
The logged_query function isn't well suited to cases where the query is a multi-row INSERT statement, as one would have to call it once per row.
Recently I had to populate a new table from the rows of an already existing one, and a function like this would have come in handy, especially as it takes exactly the kind of argument (a sequence of tuples) that a fetch after a SELECT on another table returns. A pure SQL INSERT INTO ... SELECT is the perfect solution when you don't have to handle the values before inserting, but sometimes you need to build a new list of tuples with modified values, or with additional values that couldn't be produced by the SELECT. So it makes sense to have a function like this for such cases.
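For example, the scenario above might look roughly like this; the table and column names, and the multi-row insert helper called at the end, are hypothetical (the helper itself is sketched after the next paragraph):

```python
# Fetch the source rows, then build a new list of tuples with values
# modified or extended in Python before inserting them.
cr.execute("SELECT id, name FROM res_partner_old")
rows = [
    (partner_id, name, name.upper())  # e.g. add a derived column
    for partner_id, name in cr.fetchall()
]
# One call inserts all rows in a single statement
# (logged_multi_insert is an illustrative name, sketched below).
logged_multi_insert(
    cr, "res_partner_new", ("legacy_id", "name", "display_name"), rows
)
```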
I used the cr.mogrify() + cr.execute() approach instead of cr.executemany(), as the former is generally considered to perform better.
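A minimal sketch of that approach, assuming a plain psycopg2 cursor; the function name and signature are illustrative, not the actual code of this PR:

```python
import logging

_logger = logging.getLogger(__name__)


def logged_multi_insert(cr, table, columns, rows):
    """Insert many rows in a single INSERT statement and log the row count.

    ``rows`` is a sequence of tuples, e.g. the result of ``cr.fetchall()``
    on another table. Table and column names are interpolated directly,
    so they must come from trusted code, as in a migration script.
    """
    if not rows:
        return 0
    # Build one "(%s, %s, ...)" placeholder group per row width.
    placeholders = "(%s)" % ", ".join(["%s"] * len(columns))
    # mogrify escapes each tuple safely; joining them yields a single
    # multi-row VALUES clause.
    values_sql = b", ".join(cr.mogrify(placeholders, row) for row in rows)
    query = "INSERT INTO %s (%s) VALUES %s" % (
        table, ", ".join(columns), values_sql.decode()
    )
    _logger.debug("Running query: %s", query)
    cr.execute(query)
    _logger.debug("%s rows inserted into %s", cr.rowcount, table)
    return cr.rowcount
```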