
I have written a Python script which imports a CSV file into a MySQL database. I need to check the read/write speed to MySQL because I need to optimize the speed and time. I searched and found this command:

time ./royshah.py

which gives results like this:

real    0m2.579s
user    0m0.448s
sys     0m0.192s

My file is 86 KB and consists of 1,200 rows and 9 columns. My question is:

Is there any other command, or a useful profiling tool for Python and MySQL, that measures speed and time more precisely than "time ./file-name"?

I tried my best to find one but was unable to. Thanks for the help.
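For illustration only, here is roughly what timing just the insert part from inside the script might look like; the driver, credentials, file name, table and column names below are all placeholders, not my actual code:

import csv
import time
import MySQLdb  # or pymysql, whichever MySQL driver the script uses

# placeholder connection details
conn = MySQLdb.connect(host="localhost", user="user", passwd="secret", db="testdb")
cur = conn.cursor()

# parse the CSV first, outside the timed section, so the timer covers only MySQL work
with open("data.csv") as f:
    rows = list(csv.reader(f))

t0 = time.time()  # wall-clock timer around the inserts only
for row in rows:
    cur.execute(
        "INSERT INTO mytable (c1, c2, c3, c4, c5, c6, c7, c8, c9) "
        "VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)",
        row,
    )
conn.commit()
print("insert + commit took %.3f seconds for %d rows" % (time.time() - t0, len(rows)))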

roy
  • what you really need is LOAD DATA INFILE. – e4c5 Dec 27 '16 at 12:23
  • @e4c5 I know about LOAD DATA INFILE and also about multiple-row insertions. I looked at this on SO http://stackoverflow.com/questions/22495162/speed-up-inserting-large-datasets-from-txt-file-to-mysql-using-python and at some other sites. I am just asking for a command other than "time ./filename" to measure time accurately. – roy Dec 27 '16 at 12:30
  • @e4c5 Speed improvement is not the issue; I can use multiple-row insertions or LOAD DATA INFILE. I am asking for guidance on time measurement. Thanks. – roy Dec 27 '16 at 12:32
  • asking for external resources happens to be off topic – e4c5 Dec 27 '16 at 12:43
  • Less than a minute with my favorite search engine turned up some MySQL profiling tools. You can probably find the same sort of information. Probably the most important aspects of MySQL bulk-load performance are database server machine load average (cpu) and database IO channel (disk drive) saturation. Your sample dataset is probably a factor of 100 too small to measure those values with any precision. 500 rows per second is pretty good performance, especially if you haven't optimized it at all. – O. Jones Dec 27 '16 at 12:45
  • http://stackoverflow.com/questions/582336/how-can-you-profile-a-python-script – bruno desthuilliers Dec 27 '16 at 12:54
  • @brunodesthuilliers Thanks for such a useful link. – roy Dec 27 '16 at 13:22
  • @e4c5 Thanks for the hint. I will be careful in the future. – roy Dec 27 '16 at 13:22
  • @O.Jones I have a dataset which consists of 3 million rows and is 14 GB in size. Thanks for the information regarding bulk-load performance; I will take it into account when dealing with that large dataset soon. I am really thankful for the help. – roy Dec 27 '16 at 13:25
  • I instrument the program with microsecond timers before and after commands. (I don't know the python specifics.) – Rick James Dec 27 '16 at 23:47
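Putting the last two comments together: the question bruno desthuilliers linked comes down to running the script under cProfile, and Rick James suggests bracketing individual statements with timers. A minimal sketch of both approaches, where the cursor, SQL and labels are placeholders for whatever the script already uses:

# Option 1: profile the whole script and sort by cumulative time
#   python -m cProfile -s cumulative royshah.py
#
# Option 2: wrap individual statements in timers
# (time.time() gives roughly microsecond resolution on most platforms)
import time

def timed(label, func, *args, **kwargs):
    """Call func, print how long it took, and return its result."""
    t0 = time.time()
    result = func(*args, **kwargs)
    print("%-25s %.6f s" % (label, time.time() - t0))
    return result

# example usage inside the existing import loop, where cur and conn
# are the script's own cursor and connection objects:
#   timed("execute INSERT", cur.execute, sql, row)
#   timed("commit", conn.commit)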
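For the later 3-million-row / 14 GB import, the LOAD DATA INFILE route that e4c5 mentions can also be driven from Python. A rough sketch, assuming the driver and the server are configured to allow LOCAL INFILE, and with file, table and credential names as placeholders:

import MySQLdb

# local_infile=1 enables LOAD DATA LOCAL INFILE on the client side;
# the MySQL server must also permit it (local_infile system variable).
conn = MySQLdb.connect(host="localhost", user="user", passwd="secret",
                       db="testdb", local_infile=1)
cur = conn.cursor()

cur.execute("""
    LOAD DATA LOCAL INFILE 'data.csv'
    INTO TABLE mytable
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
""")
conn.commit()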

0 Answers