I am running a Dask server on an HPC system where a module with the basic Dask requirements is available, and I load that module in a Jupyter notebook. I would like to run some processing tasks that need Dask plus modules that are not available in Dask's base environment. For that I have a custom environment created with conda. Is there an easy way to link this new conda environment to the Dask client before running my tasks?
I have tried using
from dask.distributed import Client

# schedule_json is the path to the scheduler file written on the HPC system
client = Client(scheduler_file=schedule_json)
print(client)
client.upload_file('condaenvfile.tar')
I have also tried
import os
client.run(os.system, 'conda install -c conda-forge package -y')
but I still get a ModuleNotFoundError on the workers.
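To see where things go wrong, I check which interpreter and packages the workers actually use (a minimal sketch; schedule_json is the same scheduler-file path as above):

import sys
from dask.distributed import Client

client = Client(scheduler_file=schedule_json)

# Each worker reports the interpreter it runs; for me this is the
# HPC module's Python, not my conda environment's Python.
print(client.run(lambda: sys.executable))

def try_import():
    # Check whether the extra module is importable on the worker
    try:
        import skimage
        return skimage.__version__
    except ImportError as exc:
        return str(exc)

print(client.run(try_import))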
To make my problem clearer, here is a stripped-down version of what I am doing, in case there are other alternatives for handling such issues.
import dask
from dask.distributed import Client
import skimage

client = Client(scheduler_file=schedule_json)

def myfunc(param):
    # process param using skimage
    ...

r = []
for param in param_list:
    myres = dask.delayed(myfunc)(param)
    r.append(myres)
allres = dask.compute(*r)
In the above example, the Dask module runs in an HPC environment that I have no control over; I can only load the module. I have my own conda environment inside my user profile, and I have to run some processing with scikit-learn (and other modules) on the Dask workers. What would be an alternative way to work around this issue?
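One workaround I am considering is to start the workers themselves from my conda environment's interpreter, so that tasks run where my packages are importable (a sketch, assuming the environment also has dask and distributed installed and that the scheduler file sits on a shared filesystem; the environment path is hypothetical):

import subprocess

# Hypothetical path to my conda environment; it must contain
# dask/distributed alongside skimage, scikit-learn, etc.
env_bin = "/home/user/miniconda3/envs/myenv/bin"
scheduler_file = "scheduler.json"  # same scheduler file the client uses

# Launch a worker with my environment's dask-worker; the scheduler and
# client can keep running from the HPC module, since only the workers
# execute the tasks and therefore need the extra packages.
subprocess.Popen([
    f"{env_bin}/dask-worker",
    "--scheduler-file", scheduler_file,
])

Would that be the right direction, or is there a cleaner way to point already-running workers at my conda environment?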