Insert multiple rows in one query #55

Closed · wants to merge 1 commit

Conversation

@panczypani

Method 2 for insert: a mass insert in a single query

$data[]=array('col1'=>'1','col2'=>'2');
$data[]=array('col2'=>'4','col1'=>'3');
$data[]=array('col1'=>'5','col2'=>'6');
$row=ORM::for_table('aaa')->create();
$row->save_multi($data);

Result query:
INSERT INTO aaa (col1, col2) VALUES (?, ?), (?, ?), (?, ?)

@treffynnon
Collaborator

I am not really sure how often this will actually be useful. Does anyone else have an opinion on this one (@durhamhale)? I am leaning towards avoiding it to keep complexity down.

@treffynnon
Collaborator

See issue #40 as well, since the two seem related.

@tag
Contributor

tag commented Nov 20, 2012

While I think it should be avoided to preserve simplicity, it might be useful in cases where latency between application server and database server is high. (This should be a rare use case.)

@treffynnon
Collaborator

I am going to put the same time limit on this pull request as the MS SQL TOP pull request #37. If I hear no fresh calls for this, or justifications for its inclusion in the library, by 2013, it will be closed without merge.

@panczypani
Author

This has much better performance for large amounts of data, as you can see in the test below.

$ time php query.php 
Preparing data done

real    0m5.797s
user    0m3.108s
sys 0m0.048s
$ time php query2.php 

real    0m36.481s
user    0m0.304s
sys 0m0.104s
$ time php query.php 
Preparing data done

real    0m7.793s
user    0m2.884s
sys 0m0.064s
$ time php query2.php 

real    0m39.563s
user    0m0.648s
sys 0m0.116s

https://gist.github.com/6db54b4e6eb9d0e133c6
I wrote this patch because I had 250k records to insert, and I noticed the difference after changing methods.
As you can see in the code, I connected to a MySQL server on localhost. More repetitions are not necessary, because the difference is very large.

@tag
Contributor

tag commented Dec 19, 2012

I'll repeat my comment on the gist here:

The data being submitted is different. On line 19 of query.php [in the gist], perhaps you mean:

    $data[]=array('key'=>$key,'bar'=>$i);    // value for 'key' changed from original

There are probably a handful of ways to speed up the first implementation, if it were important. One that comes to mind first is doing the for_table() call outside the loop and then cloning it, as sketched below. Using Idiorm's pending raw_execute() method might be even faster than both of the methods shown in the gist.
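
A minimal sketch of that first suggestion, assuming Idiorm's create() accepts an array of data to hydrate the new record and that the connection has already been set up with ORM::configure(); the table and column names are only illustrative:

    // Build the base ORM object once and clone it per record,
    // instead of calling ORM::for_table() on every loop iteration.
    $rows = array(
        array('col1' => '1', 'col2' => '2'),
        array('col1' => '3', 'col2' => '4'),
    );

    $base = ORM::for_table('aaa');
    foreach ($rows as $data) {
        $record = clone $base;            // shallow copy keeps the table setup
        $record->create($data)->save();   // create() marks the clone as a new record and hydrates it from $data
    }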

@treffynnon
Collaborator

This isn't really what Idiorm is intended for. If you are handling such volumes of data then you are after the greatest efficiency you can get, and that is not going to come through an ORM. I am still not convinced that this is a worthy feature for a light ORM.

@tag mentions a good alternative in the raw_execute() method, which is effectively a convenience wrapper around the PDO instance. With such a large data set that would suit you, as you are going to want to be as close to native MySQL as possible.
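
As an illustration of that alternative, here is a hedged sketch of the same multi-row insert through ORM::raw_execute() (the method was still pending at the time of this thread; the table, column names, and data are assumptions for the example, and a configured connection is assumed):

    // Bulk insert via raw_execute(), which hands the query and its
    // bound parameters straight to the underlying PDO connection.
    $rows = array(
        array('col1' => '1', 'col2' => '2'),
        array('col1' => '3', 'col2' => '4'),
        array('col1' => '5', 'col2' => '6'),
    );

    // One "(?, ?)" group per row: "(?, ?), (?, ?), (?, ?)"
    $placeholders = implode(', ', array_fill(0, count($rows), '(?, ?)'));

    // Flatten the values in the same column order as the column list below
    $values = array();
    foreach ($rows as $row) {
        $values[] = $row['col1'];
        $values[] = $row['col2'];
    }

    ORM::raw_execute("INSERT INTO aaa (col1, col2) VALUES $placeholders", $values);

This keeps the single round trip the patch is after while leaving Idiorm's query builder untouched.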

@treffynnon
Collaborator

Thank you for the pull request, but unfortunately in this case I am going to be closing it without merge. Please see @tag's comments for raw_execute() usage.

@treffynnon closed this Jan 3, 2013
@treffynnon mentioned this pull request Mar 11, 2014