I need to create data-driven unit tests for different APIs in the Karate framework. The various elements to be passed in the JSON payload should be taken as input from an Excel file.
A few points:
- I recommend you look at Karate's built-in data-table capabilities: they are far more readable, integrate into your test script, and you won't need to depend on other software. Refer to these examples: call-table.feature and dynamic-params.feature.
- Next, I would recommend using JSON instead of an Excel or CSV file, since JSON is natively supported by Karate: call-json-array.feature.
- Finally, if you really want to, you can call any Java code, and if you return data in `Map`/`List` form, it will be ready for Karate to use. This example shows how to read a database via JDBC: dogs.feature. So although this is not built into Karate, just write a simple utility to read a CSV or Excel file and you can do pretty much anything Java can do.
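To illustrate the third option, here is a minimal sketch of such a utility. The `CsvReader` class and its method names are my own invention, not part of Karate, and it handles only simple comma-separated rows (no quoted fields or escapes):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical helper: converts simple CSV data into the
// List-of-Map shape that Karate can consume directly.
public class CsvReader {

    // First line is treated as the header row; every following
    // line becomes one Map keyed by the header names.
    public static List<Map<String, Object>> parse(List<String> lines) {
        String[] headers = lines.get(0).split(",");
        List<Map<String, Object>> rows = new ArrayList<>();
        for (int i = 1; i < lines.size(); i++) {
            String[] values = lines.get(i).split(",");
            Map<String, Object> row = new LinkedHashMap<>();
            for (int j = 0; j < headers.length; j++) {
                row.put(headers[j].trim(), values[j].trim());
            }
            rows.add(row);
        }
        return rows;
    }

    // Convenience wrapper for reading the CSV from disk.
    public static List<Map<String, Object>> read(String path) throws IOException {
        return parse(Files.readAllLines(Paths.get(path)));
    }
}
```

From a feature file you could then wire this in with Karate's standard Java interop, e.g. `* def CsvReader = Java.type('CsvReader')` followed by `* def data = CsvReader.read('testdata.csv')`, and loop over `data` as you would any JSON array.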
EDIT: Karate now supports CSV files, which can even be used for data-driven testing: https://github.com/intuit/karate#csv-files

Peter Thomas
- Actually, my problem is to execute, for example, 500 functional test cases on an API. The inputs to these 500 test cases vary, and I want to read them from a spreadsheet, store them in a POJO, convert that to a JSON payload, and pass it into the request body. Can you please suggest whether there is a better way to achieve this, i.e., perform data-driven testing on a large volume of data? Also, what should my project structure look like if I follow my approach? Where should the code that reads the data from Excel, stores it in a POJO, and then generates the payload live? – Vimal Raj Dec 24 '17 at 07:41
- The moment you said "POJO" it is clear you haven't understood Karate enough :) Why don't you take the time to read the documentation? Yes, there is a better way: don't use Excel. Use Karate's syntax such as `table` with 500 rows, or use a JSON array with 500 elements. If you still want to go down the path you describe, then it is up to you – the third point in my answer above. And from experience I know that if you try to fit 500 validations in a "generic" way into one flow (if they are different test scenarios), you are just asking for trouble. ALL THE BEST :) – Peter Thomas Dec 24 '17 at 08:46
- For anyone else who stumbles across this post, a couple of best practices for test automation: it's NEVER going to make sense to automate every scenario you can think of. The benefit-to-time cost was likely exceeded by at least 400 tests in this case. Data-driven testing should almost ALWAYS use live data, not hard-coded values. These can come from API requests or DB calls. The exception is data that rarely changes, but even in that case it makes more sense to use one of the aforementioned methods to generate a file that is updated weekly. – anutter May 25 '21 at 17:59