I want to plot running time against input size for the longest common subsequence (LCS) problem, using both the recursive and the dynamic programming approaches. So far I've written programs that compute the LCS both ways, a simple random string generator (with help from here), and a program that plots the graph. Now I need to connect all of these: the two LCS programs should each be run about 10 times, with the output of the random string generator passed to them as command-line arguments.
The execution time of each program is measured and, together with the length of the strings used, is written to a file with lines like
l=15, r=0.003, c=0.001
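Roughly, the driver I have in mind looks like the sketch below. This is only a sketch: the binary names ./randstr, ./lcs_rec and ./lcs_dp are placeholders for my own programs, I'm assuming the generator prints two strings of the requested length (one per line), and the timing is whole-process wall-clock time taken around system(), which I suspect is not ideal (see question 2 below).

    /* driver.c -- rough sketch of the glue, not a finished solution.
     * Assumptions: ./randstr N prints two random strings of length N,
     * one per line; ./lcs_rec and ./lcs_dp take the two strings as
     * argv[1] and argv[2].  All three names are placeholders. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <time.h>

    /* wall-clock time of one command, in seconds (includes process start-up) */
    static double timed_system(const char *cmd)
    {
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        system(cmd);
        clock_gettime(CLOCK_MONOTONIC, &t1);
        return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    }

    int main(void)
    {
        FILE *results = fopen("results.txt", "w");
        if (!results) return 1;

        for (int len = 5; len <= 50; len += 5) {      /* ~10 runs, growing input */
            char gen_cmd[64], s1[256], s2[256], cmd[1024];

            /* capture the generator's output with popen() */
            snprintf(gen_cmd, sizeof gen_cmd, "./randstr %d", len);
            FILE *gen = popen(gen_cmd, "r");
            if (!gen || !fgets(s1, sizeof s1, gen) || !fgets(s2, sizeof s2, gen)) {
                fprintf(stderr, "generator failed\n");
                return 1;
            }
            pclose(gen);
            s1[strcspn(s1, "\n")] = '\0';
            s2[strcspn(s2, "\n")] = '\0';

            /* pass the strings on as command line arguments and time each run */
            snprintf(cmd, sizeof cmd, "./lcs_rec \"%s\" \"%s\"", s1, s2);
            double r = timed_system(cmd);
            snprintf(cmd, sizeof cmd, "./lcs_dp \"%s\" \"%s\"", s1, s2);
            double c = timed_system(cmd);

            /* one line per run, in the format the Python script will parse */
            fprintf(results, "l=%d, r=%f, c=%f\n", len, r, c);
        }
        fclose(results);
        return 0;
    }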
Each line in this file is parsed by the Python program to populate the following lists:
sequence_lengths = []
recursive_times = []
dynamic_times = []
and then the graph is plotted.

I have the following questions about the above:
1) How do I pass the output of one C program to another C program as command-line arguments? (The first sketch below shows roughly what I mean.)
2) Is there a function I can use to measure the execution time of a single function, in microseconds? At the moment the only option I know of is the Unix time utility, and since it is a command-line tool it is awkward to fit into this setup. (The second sketch below shows the kind of measurement I am after.)
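To make question 1 concrete, this is roughly the hand-off I mean, as a minimal sketch (again, ./randstr and ./lcs_rec are placeholder names, and I'm assuming the generator prints one string per line):

    /* Sketch only: capture ./randstr's output with popen(), then exec ./lcs_rec
     * with that output as a command line argument (no shell quoting involved).
     * Program names are placeholders for my real binaries. */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/wait.h>

    int main(void)
    {
        char line[256];

        FILE *gen = popen("./randstr 15", "r");   /* read the generator's stdout */
        if (!gen || !fgets(line, sizeof line, gen)) {
            perror("popen");
            return 1;
        }
        pclose(gen);
        line[strcspn(line, "\n")] = '\0';         /* strip the trailing newline */

        pid_t pid = fork();
        if (pid == 0) {                           /* child: becomes ./lcs_rec */
            execl("./lcs_rec", "lcs_rec", line, (char *)NULL);
            perror("execl");                      /* reached only if exec fails */
            _exit(127);
        }
        waitpid(pid, NULL, 0);                    /* parent waits for the child */
        return 0;
    }

And for question 2, this is the kind of measurement I'm after, sketched with gettimeofday(2) around the call; the lcs() here is only a toy stand-in so the example is self-contained, and I don't know whether this is the right approach:

    #include <stdio.h>
    #include <sys/time.h>

    /* Toy recursive LCS, only so the sketch compiles;
     * my real implementation would go here. */
    static long lcs(const char *a, const char *b)
    {
        if (*a == '\0' || *b == '\0') return 0;
        if (*a == *b) return 1 + lcs(a + 1, b + 1);
        long x = lcs(a + 1, b), y = lcs(a, b + 1);
        return x > y ? x : y;
    }

    int main(int argc, char **argv)
    {
        if (argc < 3) {
            fprintf(stderr, "usage: %s string1 string2\n", argv[0]);
            return 1;
        }

        struct timeval t0, t1;
        gettimeofday(&t0, NULL);                  /* time only the lcs() call */
        long result = lcs(argv[1], argv[2]);
        gettimeofday(&t1, NULL);

        long usec = (t1.tv_sec - t0.tv_sec) * 1000000L + (t1.tv_usec - t0.tv_usec);
        printf("lcs=%ld time=%ld microseconds\n", result, usec);
        return 0;
    }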
Any help would be much appreciated.