I have multiple test files in the test folder. The structure looks something like this:
/test
----test_abc.py
----test_bcd.py
----test_cde.py
----conftest.py
The conftest.py contains all the Spark context initialization which is necessary for running the unit tests. My problem is that I would like to have a test.py file which internally triggers all of test_abc.py, test_bcd.py and test_cde.py. This is very easy to do with Python's unittest module, but I am not sure how to achieve it with the pytest module. Do let me know if any more clarification is required on the question.
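For example, with unittest I can discover and run the whole folder from a single script, roughly like this (just a sketch of what I mean, using the test directory from the layout above):

import unittest

# Discover every test_*.py under ./test and run the collected suite
# from a single entry point.
loader = unittest.TestLoader()
suite = loader.discover("test", pattern="test_*.py")
unittest.TextTestRunner(verbosity=2).run(suite)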
The conftest.py looks something like this:
import pytest
from pyspark import SparkConf
from pyspark import SparkContext
from pyspark.sql import SQLContext
from pyspark.streaming import StreamingContext


@pytest.fixture(scope="session")
def spark_context(request):
    # One local Spark context shared by the whole test session.
    conf = (SparkConf()
            .setMaster("local[2]")
            .setAppName("pytest-pyspark-local-testing"))
    sc = SparkContext.getOrCreate(conf=conf)
    # Stop the context once the test session is finished.
    request.addfinalizer(lambda: sc.stop())
    return sc
And one of the test files, test_abc.py, looks something like this:
import pytest
import os
from pyspark.sql import SQLContext

pytestmark = pytest.mark.usefixtures("spark_context")


def test_get_pc_browser_sql(spark_context):
    "assert something"