I have a Python module that gets executed as python -m my_package.my_module --arg1 a1 --arg2 a2. It creates an artifact on the local disk and uploads that artifact to a cloud bucket.
My module essentially looks like this:
import argparse

def func():
    # parse args
    # create artifact
    # upload to bucket

func()
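For concreteness, here is roughly what the module body looks like (cloud_client, upload, and the bucket name are stand-ins for the real cloud SDK, not the actual code):

# my_package/my_module.py (simplified)
import argparse
import cloud_client  # stand-in for the real cloud SDK client

def func():
    parser = argparse.ArgumentParser()
    parser.add_argument("--arg1")
    parser.add_argument("--arg2")
    args = parser.parse_args()

    # create the artifact on local disk
    artifact_path = f"artifact_{args.arg1}_{args.arg2}.txt"
    with open(artifact_path, "w") as f:
        f.write("artifact contents")

    # upload it to the bucket
    cloud_client.upload(artifact_path, "my-bucket")

func()  # runs as soon as the module is imported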
I'd like to write a test for this module that I can run from a different script, say test_module.py. I would like to mock the cloud API so nothing actually gets uploaded, but also provide a set of test command line args to the module.
So far I can do something like this:
#test_module.py
import sys
from unittest import mock

def test_my_module():
    test_args = # [ test args ]
    with mock.patch.object(sys, 'argv', test_args):
        import my_package.my_module
        # test file is written to local dir
The problem here is that I can't mock the cloud API without first importing my_package.my_module, which runs the code in the module and uploads the artifact. And if I do something like this:
#test_module.py
from unittest import mock
import my_package.my_module

@mock.patch('my_package.my_module.cloud_client')
def test_my_module(mock_cloud_client):
    # run test
Then the code will execute at the first import statement (import my_package.my_module). Is there a solution where I can mock an object before importing the module it's located in?