
I am using Anaconda for some libraries such as pandas, and I would like to parallelize some operations using PySpark. I am using the PyCharm IDE, and I have been looking for a way to get PySpark (or simply Spark) features incorporated into PyCharm without losing the Anaconda environment, but I haven't found one. I would also like to mention that I have to use the PyCharm IDE. Does anyone have an idea?
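One approach I have seen suggested is to point the existing conda interpreter at a local Spark installation via `SPARK_HOME` and `sys.path`, so `pyspark` becomes importable without switching environments. A rough sketch of that idea is below; the `/opt/spark` path and the py4j zip name are only placeholders for whatever the local Spark install actually uses:

```python
import os
import sys

# Placeholder: point this at the local Spark installation directory.
SPARK_HOME = "/opt/spark"
os.environ.setdefault("SPARK_HOME", SPARK_HOME)
# Use the current (conda) interpreter for the PySpark workers.
os.environ.setdefault("PYSPARK_PYTHON", sys.executable)

# Make the PySpark sources shipped with Spark importable from this env.
# The py4j zip version differs between Spark releases.
sys.path.insert(0, os.path.join(SPARK_HOME, "python"))
sys.path.insert(0, os.path.join(SPARK_HOME, "python", "lib", "py4j-0.10.4-src.zip"))

from pyspark import SparkContext

# Quick smoke test that the context starts and can run a local job.
sc = SparkContext(master="local[*]", appName="pycharm-conda-test")
print(sc.parallelize(range(10)).sum())
sc.stop()
```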

user4237435
    [How to link PyCharm with PySpark?](http://stackoverflow.com/q/34685905/1560062) – zero323 Mar 12 '17 at 21:14
  • @zero323 Thanks for answering. The problem with the link you provided is that if I change the environment to pyspark like they suggest, I will automatically lose the anaconda environment, and I need it too – user4237435 Mar 12 '17 at 21:25
  • You won't lose anything. You can use the existing conda env and edit the path. – zero323 Mar 12 '17 at 21:28
  • @zero323 I tried it and it worked. However, I have to give it up due to the Python 3.6 bugs and compatibility problems. Thanks for your answer though – user4237435 Mar 16 '17 at 12:58
  • Yup, the unreleased 2.1.1 / 2.2.0 are the earliest releases with Python 3.6 support: https://issues.apache.org/jira/browse/SPARK-19019 – zero323 Mar 16 '17 at 18:03

0 Answers