
I have a multiple-client to single-server bidirectional iperf setup for network monitoring. The iperf server runs well and displays output in CSV format, driven by cron jobs on the client end.

I wish to write a Python script to automate mapping these CSV outputs to a MySQL database, which would then be updated and saved at regular intervals without human intervention.

I am using an Ubuntu 13.10 machine as the iperf server. Below is a sample of the CSV output I get. It is not stored to a file, just displayed on screen.

s1:~$ iperf -s -y C  
20140422105054,172.16.10.76,41065,172.16.10.65,5001,6,0.0-20.0,73138176,29215083
20140422105054,172.16.10.76,5001,172.16.10.65,56254,4,0.0-20.0,46350336,18502933
20140422105100,172.16.10.76,54550,172.16.10.50,5001,8,0.0-20.0,67895296,27129408
20140422105100,172.16.10.76,5001,172.16.10.50,58447,5,0.0-20.1,50937856,20292796
20140422105553,172.16.10.76,5001,172.16.10.65,47382,7,0.0-20.1,51118080,20358083
20140422105553,172.16.10.76,41067,172.16.10.65,5001,5,0.0-20.1,76677120,30524007
20140422105600,172.16.10.76,5001,172.16.10.50,40734,4,0.0-20.0,57606144,23001066
20140422105600,172.16.10.76,54552,172.16.10.50,5001,8,0.0-20.0,70123520,28019115
20140422110053,172.16.10.76,41070,172.16.10.65,5001,5,0.0-20.1,63438848,25284066
20140422110053,172.16.10.76,5001,172.16.10.65,46462,6,0.0-20.1,11321344,4497094

The fields I want to map to are: timestamp, server_ip, server_port, client_ip, client_port, tag_id, interval, transferred, bandwidth

I want to map this CSV output to a MySQL database periodically, which I understand means writing a Python script (run from a cron job) that parses the output and stores it in the database. I am a beginner at Python scripting and database queries.

I went through a related discussion on Server Fault (https://serverfault.com/questions/566737/iperf-csv-output-format) and would like to build my query based on it.

shyene88

3 Answers


If you plan to use Python, then I would recommend using sqlalchemy.

The general approach is:

  • define a class that has all the attributes you want to store
  • map the properties of the class to database columns and types
  • read your data from CSV (using e.g. the csv module), create an object of the prepared class for each row, and let it be stored

The sqlalchemy documentation will give you more details and instructions; your requirement seems rather easy.
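A minimal sketch of those steps, assuming SQLAlchemy (1.4+) is installed; the class name, table name, and column sizes are illustrative, and SQLite stands in for MySQL so the example is self-contained (swap the URL for your MySQL connection string in practice):

```python
import csv
import io

from sqlalchemy import create_engine, Column, Integer, BigInteger, String
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

# Class attributes map one-to-one onto the nine CSV fields from iperf
class IperfResult(Base):
    __tablename__ = "iperf_results"
    id = Column(Integer, primary_key=True)      # surrogate key
    timestamp = Column(String(14))
    server_ip = Column(String(15))
    server_port = Column(Integer)
    client_ip = Column(String(15))
    client_port = Column(Integer)
    tag_id = Column(Integer)
    interval = Column(String(16))
    transferred = Column(BigInteger)
    bandwidth = Column(BigInteger)

# In-memory SQLite for the demo; for MySQL use something like
# create_engine("mysql://user:password@localhost/db_name")
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

FIELDS = ["timestamp", "server_ip", "server_port", "client_ip",
          "client_port", "tag_id", "interval", "transferred", "bandwidth"]

# One sample line of iperf CSV output (in practice, read from a file)
sample = "20140422105054,172.16.10.76,41065,172.16.10.65,5001,6,0.0-20.0,73138176,29215083\n"

with Session(engine) as session:
    for row in csv.DictReader(io.StringIO(sample), fieldnames=FIELDS):
        session.add(IperfResult(**row))  # one object per CSV row
    session.commit()
    count = session.query(IperfResult).count()
```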

Another option is to find an existing CSV import tool; some are already available with MySQL, and there are plenty of others too.

Jan Vlcinsky
  • Yes, MySQL has CSV import tools; but my iperf tool just displays CSV data – I'm not sure where it stores it. So based on that, I want to automate collecting the CSV data from the iperf tool and putting it into a MySQL database at regular intervals. – shyene88 Apr 25 '14 at 19:45
  • Do you know of other tools which may help me in my goal? – shyene88 Apr 25 '14 at 19:47
  • You can redirect the output into a file like `$ iperf -s -y C > data.csv`. If you manage to prepare the required table structure in the database and use the CSV import tool provided with MySQL, you do it in a very direct way – this sounds good. – Jan Vlcinsky Apr 25 '14 at 19:48

This is probably not the kind of answer you are looking for, but if you learn a little sqlite3 (a native Python module – `import sqlite3`) through a basic online tutorial, you will realize that your problem is not at all difficult to solve. Then just use a standard timer, such as `time.sleep()`, to repeat the procedure.
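A minimal sketch of that approach, using only the standard library; the table name is illustrative, and an in-memory database stands in for a file (pass a path to `sqlite3.connect()` for persistent storage):

```python
import csv
import io
import sqlite3

# Column list matching the nine iperf CSV fields
conn = sqlite3.connect(":memory:")  # use a file path for persistent storage
conn.execute("""CREATE TABLE IF NOT EXISTS iperf_results (
    timestamp TEXT, server_ip TEXT, server_port INTEGER,
    client_ip TEXT, client_port INTEGER, tag_id INTEGER,
    interval TEXT, transferred INTEGER, bandwidth INTEGER)""")

# One sample line of iperf CSV output (in practice, read from a file;
# to repeat periodically, wrap this in a loop with time.sleep())
sample = ("20140422105054,172.16.10.76,41065,172.16.10.65,5001,6,"
          "0.0-20.0,73138176,29215083\n")

for row in csv.reader(io.StringIO(sample)):
    # parameterized insert: one "?" per field, values taken from the CSV row
    conn.execute("INSERT INTO iperf_results VALUES (?,?,?,?,?,?,?,?,?)", row)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM iperf_results").fetchone()[0]
```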

user3439187

Generate SQL script, then run it

If you do not want to use a complex solution like sqlalchemy, the following approach is possible.

  1. Having your CSV data, convert it into an SQL script.
  2. Use the mysql command line tool to run this script.

Before you do this for the first time, be sure to create the needed database structure in the database (this I leave to you).
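One possible schema for that structure, matching the nine fields in order; the table name, types, and sizes are illustrative. Note that `interval` is a reserved word in MySQL, hence the backticks:

```sql
-- illustrative schema for the iperf CSV fields; adjust types to taste
CREATE TABLE mytable (
    timestamp   BIGINT,
    server_ip   VARCHAR(15),
    server_port INT,
    client_ip   VARCHAR(15),
    client_port INT,
    tag_id      INT,
    `interval`  VARCHAR(16),   -- reserved word in MySQL, so quoted
    transferred BIGINT,
    bandwidth   BIGINT
);
```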

My sample below uses (just for my convenience) the docopt package, so you need to install it:

$ pip install docopt

CSV to SQL script conversion utility

csv2sql.py:

"""
Usage:
    csv2sql.py [--table <tablename>] <csvfile>

Options:
    --table <tablename>  Name of table in database to import into [default: mytable]

Convert csv file with iperf data into sql script for importing
those data into MySQL database.
"""
from csv import DictReader
from docopt import docopt

if __name__ == "__main__":
    args = docopt(__doc__)
    fname = args["<csvfile>"]
    tablename = args["--table"]
    headers = ["timestamp",
               "server_ip",
               "server_port",
               "client_ip",
               "client_port",
               "tag_id",
               "interval",
               "transferred",
               "bandwidth"
              ]
    sql = """insert into {tablename}
    values ({timestamp},"{server_ip}",{server_port},"{client_ip}",{client_port},{tag_id},"{interval}",{transferred},{bandwidth});"""
    with open(fname) as f:
        reader = DictReader(f, headers, delimiter=",")
        for rec in reader:
            # values are interpolated directly into the SQL text; acceptable
            # here because the input is trusted iperf output, not user input
            print(sql.format(tablename=tablename, **rec))  # print() works in Python 2.7 and 3.x

Convert CSV to SQL script

First, let the conversion utility introduce itself:

$ python csv2sql.py -h
Usage:
    csv2sql.py [--table <tablename>] <csvfile>

Options:
    --table <tablename>  Name of table in database to import into [default: mytable]

Convert csv file with iperf data into sql script for importing
those data into MySQL database.

With your data in the file data.csv:

$ python csv2sql.py data.csv 
insert into mytable
    values (20140422105054,"172.16.10.76",41065,"172.16.10.65",5001,6,"0.0-20.0",73138176,29215083);
insert into mytable
    values (20140422105054,"172.16.10.76",5001,"172.16.10.65",56254,4,"0.0-20.0",46350336,18502933);
insert into mytable
    values (20140422105100,"172.16.10.76",54550,"172.16.10.50",5001,8,"0.0-20.0",67895296,27129408);
insert into mytable
    values (20140422105100,"172.16.10.76",5001,"172.16.10.50",58447,5,"0.0-20.1",50937856,20292796);
insert into mytable
    values (20140422105553,"172.16.10.76",5001,"172.16.10.65",47382,7,"0.0-20.1",51118080,20358083);
insert into mytable
    values (20140422105553,"172.16.10.76",41067,"172.16.10.65",5001,5,"0.0-20.1",76677120,30524007);
insert into mytable
    values (20140422105600,"172.16.10.76",5001,"172.16.10.50",40734,4,"0.0-20.0",57606144,23001066);
insert into mytable
    values (20140422105600,"172.16.10.76",54552,"172.16.10.50",5001,8,"0.0-20.0",70123520,28019115);
insert into mytable
    values (20140422110053,"172.16.10.76",41070,"172.16.10.65",5001,5,"0.0-20.1",63438848,25284066);
insert into mytable
    values (20140422110053,"172.16.10.76",5001,"172.16.10.65",46462,6,"0.0-20.1",11321344,4497094);

Put it all into the file data.sql:

$ python csv2sql.py data.csv > data.sql

Apply data.sql to your MySQL database

And finally use the mysql command (provided by MySQL) to do the import into the database:

$ mysql --user=username --password=password db_name < data.sql
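To run the conversion and import unattended, as the original cron-based setup suggests, a crontab entry on the server could chain the two commands; the paths, credentials, and 10-minute interval here are all illustrative:

```
# m h dom mon dow  command -- every 10 minutes: convert the CSV, then import it
*/10 * * * * python /home/user/csv2sql.py /home/user/data.csv > /tmp/data.sql && mysql --user=username --password=password db_name < /tmp/data.sql
```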
Jan Vlcinsky
  • Jan, I tried the Python script you provided, but I keep getting the following syntax error. Could you please help troubleshoot it? I tried changing the syntax to see if it has other results. Also, out of curiosity, does a Python script need an "end module" type statement? `File "csv2sql.py", line 22 print sql.format(tablename=tablename, **rec) ^ SyntaxError: invalid syntax` – shyene88 Apr 28 '14 at 17:04
  • My code runs with Python 2.7. In case you are on Python 3.x, the print statement must be changed to a function call like `print(sql.format(tablename=tablename, **rec))`. I will rewrite the code to run in both versions. – Jan Vlcinsky Apr 28 '14 at 17:11
  • Weird error: s1:~$ python csv2sql.py iperf2.csv > iperf2.sql File "csv2sql.py", line 22 print(sql.format(tablename=tablename, **rec)) ^ TabError: inconsistent use of tabs and spaces in indentation. I used the same syntax as guided by you. – shyene88 Apr 28 '14 at 17:31
  • The last error relates to your editor. Turn off using tabs and replace all existing tabs by 4 characters. – Jan Vlcinsky Apr 28 '14 at 17:33
  • I'm still getting the indentation / tab errors. I'm using gedit. Could you please suggest another editor. I'm not quite comfortable using vim. – shyene88 Apr 28 '14 at 17:49
  • I use `vim` but have to admit it takes time to master (though it is worth it). gedit shall be fine. Use `$ cat -T csv2sql.py` to show tabs. Or simply retype all leading blanks on every line with the spacebar (just up to the first non-blank character). But we are getting off-track now. – Jan Vlcinsky Apr 28 '14 at 17:57
  • I tried troubleshooting; but these type of error messages are persistent: Traceback (most recent call last): File "csv2.py", line 5, in args = docopt(__doc__) File "/usr/local/lib/python3.3/dist-packages/docopt.py", line 558, in docopt DocoptExit.usage = printable_usage(doc) File "/usr/local/lib/python3.3/dist-packages/docopt.py", line 466, in printable_usage usage_split = re.split(r'([Uu][Ss][Aa][Gg][Ee]:)', doc) File "/usr/lib/python3.3/re.py", line 191, in split return _compile(pattern, flags).split(string, maxsplit) TypeError: expected string or buffer – shyene88 Apr 29 '14 at 04:48
  • I also tried the steps mentioned in this similar question; but didn't work out. [Link: http://stackoverflow.com/questions/10154633/load-csv-data-into-mysql-in-python] – shyene88 Apr 29 '14 at 05:44
  • Guys, you are amazing people. I admire all your help. – Victor Pudeyev May 06 '14 at 02:47