I have a Spark-Scala application, and I want to use a function that can only be written in Python because it relies on the NLTK package. My problem is how to provide the NLTK package to the project: should I declare it as a dependency, and if so, how?
When I write Python code that uses the nltk package inside the same project, I get an error saying that the nltk package is not found.
I know that we can use RDD.pipe to call the Python function from Scala Spark (a rough sketch of what I mean is below), but how would I make the nltk package available to that same application?
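Here is a minimal sketch of the pipe approach I have in mind. The script name `tokenize.py` and the sample data are just placeholders, and this assumes Python and nltk are already installed on every worker node, since pipe() only streams records to an external process and does not ship any packages:

```scala
import org.apache.spark.sql.SparkSession

object NltkPipeSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("nltk-pipe-sketch").getOrCreate()
    val sc = spark.sparkContext

    // Distribute the (hypothetical) Python script to the executors.
    // tokenize.py would read one record per line from stdin, apply NLTK,
    // and print one result per line to stdout.
    sc.addFile("tokenize.py")

    val sentences = sc.parallelize(Seq("This is a sentence.", "Here is another one."))

    // Each partition is streamed through the external process:
    // records go to the script's stdin, its stdout lines become the new RDD.
    // The relative path assumes the script ends up in the executor's working
    // directory (e.g. when running on YARN).
    val tokenized = sentences.pipe("python tokenize.py")

    tokenized.collect().foreach(println)
    spark.stop()
  }
}
```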
Any help is appreciated!