
Is there a way to instrument the code and find out how much time each namespace takes, or is using fixtures the only way forward?

What is a good approach for solving such a problem?

Amogh Talpallikar
  • Why do you need this? For benchmarking? Usually, that requires [some preparation steps](https://github.com/hugoduncan/criterium#criterium) rather than a bare-bones run. – OlegTheCat Oct 19 '16 at 08:42
  • @OlegTheCat I guess it is not to run performance tests against your production code but to get insight into which tests are slow so you can improve them. – Piotrek Bzdyl Oct 19 '16 at 08:55
  • OK, what's the problem with fixtures, then? AFAIK, a `:once` fixture wraps all tests in a particular namespace, so one can obtain timestamps before and after the run and calculate the difference (see the sketch below). – OlegTheCat Oct 19 '16 at 09:56
  • @OlegTheCat I have 145 namespaces and I really don't want to sit and put code in fixtures. Was wondering if using run-tests over a list of namespaces would be a good solution? – Amogh Talpallikar Oct 19 '16 at 11:21
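
For reference, the fixture-based approach mentioned in the comments could look like the following minimal sketch, using clojure.test's `use-fixtures` (the namespace name is illustrative):

    (ns tst.myproj.myfeat
      (:require [clojure.test :refer [use-fixtures]]))

    ;; Captured at load time, so it names this namespace even though the
    ;; fixture itself runs under whatever *ns* is current at test time.
    (def ^:private this-ns (ns-name *ns*))

    (defn- timing-fixture [f]
      (let [start (System/nanoTime)]
        (f)  ; run every test in the namespace
        (println (format "%s took %.3f s" this-ns
                         (/ (- (System/nanoTime) start) 1e9)))))

    (use-fixtures :once timing-fixture)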

2 Answers


Your suggestion of using `run-tests` over a list of namespaces will work.

Here is a function I wrote to loop over namespaces in the REPL, reload them all, then run the tests for each ns. You could modify it for your purpose: you would need to add code that captures the start/end time and prints the elapsed time for each ns (a sketch of that follows the function below).

As the docstring says, this code assumes the namespaces are set up to look like:

        myproj.myfeat      ; main namespace
    tst.myproj.myfeat      ; testing namespace

so just strip out or modify the part with `(symbol (str "tst." curr-ns))` as required for your setup. You'll probably want to use `all-ns` to get a listing of all namespaces, then either remove or ignore the non-testing ones. Good luck!
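
For the `all-ns` step, here is a minimal REPL sketch of that filtering, assuming the `tst.` prefix from above:

    (require '[clojure.string :as str])

    ;; Keep only the test namespaces, i.e. those named with the "tst." prefix.
    (->> (all-ns)
         (map ns-name)
         (filter #(str/starts-with? (str %) "tst."))
         sort)

Here is the function: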

    (defn ^:deprecated ^:no-doc test-all
      "Convenience fn to reload a namespace & the corresponding test namespace from disk and
      execute tests in the REPL.  Assumes canonical project test file organization with
      parallel src/... & test/tst/... directories, where a 'tst.' prefix is added to all src
      namespaces to generate the corresponding test namespace.  Example:

        (test-all 'tupelo.core 'tupelo.csv)

      This will reload tupelo.core, tst.tupelo.core, tupelo.csv, tst.tupelo.csv and
      then execute clojure.test/run-tests on both of the test namespaces."
      [& ns-list]
      (let [test-ns-list (for [curr-ns ns-list]
                           (let [curr-ns-test (symbol (str "tst." curr-ns))]
                             (println (str "testing " curr-ns " & " curr-ns-test))
                             (require curr-ns curr-ns-test :reload)
                             curr-ns-test))]
        (println "-----------------------------------------------------------------------------")
        (apply clojure.test/run-tests test-ns-list)
        (println "-----------------------------------------------------------------------------")
        (newline)))
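
And for the timing itself, a minimal sketch of the modification described above: run each namespace separately so the elapsed wall-clock time can be attributed per ns (`timed-run-tests` is an illustrative name):

    (require 'clojure.test)

    ;; Run each test namespace on its own and print its elapsed time.
    ;; Assumes the namespaces are already loaded (e.g. by the
    ;; require/:reload step in test-all above).
    (defn timed-run-tests [& ns-list]
      (doseq [curr-ns ns-list]
        (let [start (System/nanoTime)]
          (clojure.test/run-tests curr-ns)
          (println (format "%s elapsed: %.3f s" curr-ns
                           (/ (- (System/nanoTime) start) 1e9))))))

    ;; e.g. (timed-run-tests 'tst.tupelo.core 'tst.tupelo.csv)
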
Alan Thompson

You can use eftest's `:test-warn-time` option, which flags any test (or fixture) that takes longer than a given number of milliseconds, to see which namespaces are slow.
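
For example, a minimal sketch with the eftest runner (the 500 ms threshold here is just an illustration):

    (require '[eftest.runner :as ef])

    ;; Warn about any test or fixture that takes longer than 500 ms.
    (ef/run-tests (ef/find-tests "test") {:test-warn-time 500})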

Once added, you'll see colored output (color not shown here) like this to indicate a slow test:

    LONG TEST in foo-api.handlers.foo-test during :clojure.test/once-fixtures
    Test took 12.300 seconds to run
Micah Elliott