
I have a somewhat large R project and would like to identify performance bottlenecks by measuring how long the individual functions take to execute. I know of tools like `microbenchmark` for timing individual expressions.

Is it possible to have a tool track the execution times of the individually called functions and then report the results at the end?

In practice, the idea would be that I have a script, `script.R`:

plot(cars)
plot(cars + 1)

and after running the "benchmark tool" over it, get a result that looks something like this:

plot(cars): 0.1s
plot(cars + 1): 0.1s

How would I go about achieving something like this?

r2evans
karpfen
  • There's a whole chapter on profiling in Advanced R. Check it out: https://adv-r.hadley.nz/perf-measure.html – MrFlick Jan 06 '23 at 15:07
  • What you're asking about is called *profiling*. There is at least one package for it in R, `profvis`. The link MrFlick provided is likely one of the better you'll find. – r2evans Jan 06 '23 at 15:35
  • 1
    I think Hadley's book doesn't mention the base function for reporting timing, `summaryRprof()`. For the first pass to look at a big project I prefer it to `profvis` because it quickly shows me which functions (or which lines, if you enable that) are the heavy users. I find that harder to work out with the `profvis` displays, though they are great for looking at the details. – user2554330 Jan 06 '23 at 18:09
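Putting the suggestions from the comments together, a minimal sketch of what this could look like with base R's sampling profiler, `Rprof()`/`summaryRprof()` (the file names `script.R` and `profile.out` are assumptions, not from the question):

```r
# Start the sampling profiler; line.profiling = TRUE records source line
# numbers so per-line timings can be reported later.
Rprof("profile.out", line.profiling = TRUE)

# Run the script to be profiled; keep.source is needed for line-level data.
source("script.R", keep.source = TRUE)

# Stop profiling.
Rprof(NULL)

# Aggregate the samples: $by.self shows time spent inside each function
# itself, $by.total shows cumulative time including callees.
print(summaryRprof("profile.out")$by.self)

# With line.profiling enabled, per-line timings are also available:
print(summaryRprof("profile.out", lines = "show"))

# Alternative mentioned in the comments (assumes the profvis package is
# installed): an interactive flame-graph view of the same samples.
# library(profvis)
# profvis(source("script.R", keep.source = TRUE))
```

Note that `Rprof()` is a sampling profiler: very short scripts may yield few or no samples, so results are most meaningful for code that runs for at least a fraction of a second.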

0 Answers