
I have been using my university's Linux servers to run Python for the last few years and have been setting up my personal Windows machine. I used to have Anaconda installed on this Windows machine but uninstalled it and switched to Miniconda. Perhaps I did not uninstall it correctly and that is the source of my issues, but my troubleshooting has repeatedly failed.

On my personal Windows machine, I made two separate environments from .yml files: scraper and climate. In Jupyter Notebook, I am running into issues where some packages work, some don't, and some packages load in an environment they were never even installed into.

Both scraper and climate have packages like pandas and numpy, but only scraper has the additional requests and beautifulsoup4 packages. Inspecting each environment in the terminal, everything seems to have been installed properly in its respective environment.

PS C:\Users\name\Documents\energy> cat .\climate_environment.yml
name: climate
channels:
  - conda-forge
  - defaults
dependencies:
  - python=3.7
  - pip
  - numpy
  - scipy
  - matplotlib
  - pandas
  - netcdf4
  - xarray
  - dask
  - bottleneck
  - cartopy
  - seaborn
  - cmocean
  - metpy
  - ipykernel
  - jupyter
  - nb_conda_kernels
  - ffmpeg
  - pip:
    - boto3
PS C:\Users\name\Documents\energy\data> cat .\scraper_environment.yml
name: scraper
channels:
  - conda-forge
  - defaults
dependencies:
  - python=3.7
  - pip
  - numpy
  - scipy
  - matplotlib
  - pandas
  - xarray
  - dask
  - bottleneck
  - ipykernel
  - jupyter
  - nb_conda_kernels
  - requests
  - beautifulsoup4
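
For what it's worth, this is roughly how I inspected the environments from the terminal (conda list -n <env> takes a regex filter, so this is just a quick sanity check, not exhaustive):

PS C:\Users\name> conda list -n scraper "requests|beautifulsoup4"
PS C:\Users\name> conda list -n climate "requests|beautifulsoup4"
PS C:\Users\name> conda list -n scraper "numpy|pandas"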

However! In Jupyter Notebook, with either the scraper or the climate kernel selected, pandas and numpy fail to import but requests and beautifulsoup4 import fine. This should be impossible for climate, since it does not even have requests and beautifulsoup4. And scraper has all four packages, yet only requests and beautifulsoup4 are working.
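
To double-check which interpreter a kernel is really using, my understanding is you can run something like this in a notebook cell under each kernel (sys.executable and sys.prefix report the python.exe and the environment the kernel actually launched with):

import sys
print(sys.executable)  # full path of the python.exe this kernel is running
print(sys.prefix)      # root of the environment that python.exe belongs to
import numpy           # the import that fails under both kernels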

Upon inspecting my base environment, requests and beautifulsoup4 appear to be installed there. Perhaps Jupyter is actually looking at base, which for some reason has these two packages from my scraper_environment.yml file?
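
(I inspected base with something along these lines; conda list -n base filters base's package list:)

PS C:\Users\name> conda list -n base "requests|beautifulsoup4"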

Looking at the kernelspec list and kernel.json files (per this Stack Overflow answer) for my scraper and climate environments, they seem to be pointing at the correct python executable, or are they?

PS C:\Users\name> jupyter kernelspec list
Available kernels:
  scraper          C:\Users\name\AppData\Roaming\jupyter\kernels\scraper
  climate          C:\Users\name\AppData\Roaming\jupyter\kernels\climate
  python3          C:\Users\name\miniconda3\share\jupyter\kernels\python3
PS C:\Users\name\AppData\Roaming\jupyter\kernels\scraper> cat .\kernel.json
{
 "argv": [
  "C:\\Users\\name\\miniconda3\\python.exe",
  "-m",
  "ipykernel_launcher",
  "-f",
  "{connection_file}"
 ],
 "display_name": "Python (scraper)",
 "language": "python",
 "metadata": {
  "debugger": true
 }
}

I am wondering what the python3 kernel at C:\Users\name\miniconda3\share\jupyter\kernels\python3 is, and whether this is my base environment or an artifact from the previous Anaconda install that is interfering here.

Additional troubleshooting: I've added nb_conda_kernels to my environments per this Stack Overflow answer. I've also tried uninstalling and reinstalling Miniconda twice, and tried looking into my Path environment variables, but I feel very lost there.
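
For reference, my understanding (from the answers linked above) is that a kernel is supposed to be registered from inside its own activated environment, e.g.:

PS C:\Users\name> conda activate scraper
PS C:\Users\name> python -m ipykernel install --user --name scraper --display-name "Python (scraper)"

so if I accidentally registered mine from base, that might explain the kernel.json contents, but I am not certain of this.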

Thank you in advance for your help!! I have spent many hours trying to solve this problem and improve my understanding, but I do indeed need some help. THANK YOU!!!

EDIT

After my third uninstall and reinstall, I noticed that both of the working packages get installed during setup: requests comes in when Miniconda itself is installed, and beautifulsoup4 comes in when Jupyter Notebook is set up. LOL.

I have narrowed down my problem to the path C:\\Users\\name\\miniconda3\\python.exe in my kernel.json file. However, when I tried changing it to point at the (presumably) correct environment, C:\\Users\\name\\miniconda3\\envs\\scraper\\python.exe, that did not resolve the issue and the kernel was unable to load.
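
For reference, the edited kernel.json I tried looked like this (identical to the one pasted above except for the first argv entry):

{
 "argv": [
  "C:\\Users\\name\\miniconda3\\envs\\scraper\\python.exe",
  "-m",
  "ipykernel_launcher",
  "-f",
  "{connection_file}"
 ],
 "display_name": "Python (scraper)",
 "language": "python",
 "metadata": {
  "debugger": true
 }
}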
