
I have a table with about 50K rows. I need to multiply this data roughly 100-fold to have at least 5M rows for testing performance. It's already taken me several long minutes to import 50K rows from a CSV file, so I don't want to create a 5M-record file and then import it into SQL.

Is there a way to duplicate the existing rows over and over again to create 5M records? I don't mind if the rows are identical; they just need a different id, which is the primary (auto-increment) column.

I'm currently doing this on XAMPP with phpMyAdmin.

Whip

3 Answers

INSERT INTO my_table (y, z) SELECT y, z FROM my_table;

where x is your auto-incrementing id, deliberately left out of the column list so new values are generated.

Repeat a (remarkably small) number of times: each run doubles the row count.
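Because each run doubles the table, the number of repeats is logarithmic in the target size. A small sketch of the arithmetic and the doubling behaviour, using an in-memory SQLite table as a stand-in for MySQL (table and column names are illustrative only):

```python
import math
import sqlite3

def runs_needed(start, target):
    # Number of doublings to grow `start` rows to at least `target` rows.
    return math.ceil(math.log2(target / start))

print(runs_needed(50_000, 5_000_000))  # 7, since 50K * 2**7 = 6.4M

# Demonstrate the doubling on a tiny in-memory table.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE my_table (x INTEGER PRIMARY KEY AUTOINCREMENT, y TEXT, z TEXT)"
)
con.executemany("INSERT INTO my_table (y, z) VALUES (?, ?)", [("a", "b")] * 5)
for _ in range(4):
    # Same statement as the answer: copies every row back into the table.
    con.execute("INSERT INTO my_table (y, z) SELECT y, z FROM my_table")
rows = con.execute("SELECT COUNT(*) FROM my_table").fetchone()[0]
print(rows)  # 5 * 2**4 = 80
```

So starting from 50K rows, seven runs of the single-line INSERT ... SELECT are enough to pass 5M.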

Strawberry

Option 1: use UNION ALL

    insert into your_table (col1, col2)
    select col1, col2 from your_table
    union all
    select col1, col2 from your_table
    union all
    select col1, col2 from your_table
    union all
    select col1, col2 from your_table
    continued...
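With k SELECTs unioned together, each run inserts k copies of the current rows, multiplying the table by k + 1. A sketch with four SELECTs (5x growth per run), again using SQLite as a stand-in and hypothetical names:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE your_table (id INTEGER PRIMARY KEY AUTOINCREMENT, col1 TEXT, col2 TEXT)"
)
con.executemany("INSERT INTO your_table (col1, col2) VALUES (?, ?)", [("a", "b")] * 4)

# Four unioned SELECTs insert 4x the current rows: the table grows 5x per run.
con.execute(
    """
    INSERT INTO your_table (col1, col2)
    SELECT col1, col2 FROM your_table
    UNION ALL SELECT col1, col2 FROM your_table
    UNION ALL SELECT col1, col2 FROM your_table
    UNION ALL SELECT col1, col2 FROM your_table
    """
)
count = con.execute("SELECT COUNT(*) FROM your_table").fetchone()[0]
print(count)  # 4 * 5 = 20
```

At 5x per run, 50K rows pass 5M in three runs (50K * 5**3 = 6.25M).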

Option 2: use a dummy table with 10 records and do a cross join

Create a dummy table with 10 rows

insert into your_table (col1,col2)
select col1,col2 from your_table, dummy_table
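Every row of your_table pairs with each of the 10 dummy rows, so one run inserts 10x the current row count (11x total). A sketch with SQLite as a stand-in and hypothetical names:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE your_table (id INTEGER PRIMARY KEY AUTOINCREMENT, col1 TEXT, col2 TEXT)"
)
con.execute("CREATE TABLE dummy_table (n INTEGER)")
con.executemany("INSERT INTO your_table (col1, col2) VALUES (?, ?)", [("a", "b")] * 3)
con.executemany("INSERT INTO dummy_table (n) VALUES (?)", [(i,) for i in range(10)])

# Cross join: each of the current rows appears once per dummy row,
# so the insert adds 10x the current count.
con.execute(
    """
    INSERT INTO your_table (col1, col2)
    SELECT t.col1, t.col2 FROM your_table t CROSS JOIN dummy_table d
    """
)
total = con.execute("SELECT COUNT(*) FROM your_table").fetchone()[0]
print(total)  # 3 + 3 * 10 = 33
```

Starting from 50K rows, two runs reach 50K * 11**2 = 6.05M.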
Akhil

If you have ~50K rows, each INSERT ... SELECT copies the whole table into itself, doubling it, so seven iterations are enough to pass 5M rows (50K * 2^7 = 6.4M). To do so, you can create a procedure and use a loop:

DELIMITER $$
CREATE PROCEDURE populate()
BEGIN
  DECLARE counter INT DEFAULT 1;

  -- each pass doubles the table: 7 passes grow 50K rows to 6.4M
  WHILE counter < 8 DO
    insert into mytable(colA, colB) select colA, colB from mytable;
    SET counter = counter + 1;
  END WHILE;
END $$

DELIMITER ;

Then you can call the procedure using

call populate();
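The loop's effect can be checked outside MySQL. A scaled-down simulation (50 starting rows and a 5,000-row target instead of 50K and 5M) using SQLite, which has no stored procedures, so the WHILE loop lives in Python:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE mytable (id INTEGER PRIMARY KEY AUTOINCREMENT, colA TEXT, colB TEXT)"
)
con.executemany("INSERT INTO mytable (colA, colB) VALUES (?, ?)", [("a", "b")] * 50)

counter = 1
while counter < 8:  # 7 doublings, mirroring the procedure's loop bound
    con.execute("INSERT INTO mytable (colA, colB) SELECT colA, colB FROM mytable")
    counter += 1

final = con.execute("SELECT COUNT(*) FROM mytable").fetchone()[0]
print(final)  # 50 * 2**7 = 6400, past the scaled-down 5000-row target
```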
sid-m
  • Note that since you are copying the same table over and over, the row count grows exponentially, so around 7 iterations are enough, not 99. – Hydrocat Oct 15 '19 at 07:50