
I am trying to use a TPU with pytorch_xla, but I get an ImportError from _XLAC.

!curl https://raw.githubusercontent.com/pytorch/xla/master/contrib/scripts/env-setup.py -o pytorch-xla-env-setup.py
!python pytorch-xla-env-setup.py --version $VERSION

import torch_xla
import torch_xla.core.xla_model as xm

ImportError                               Traceback (most recent call last)
<ipython-input-60-6a19e980152f> in <module>()
----> 1 import torch_xla
      2 import torch_xla.core.xla_model as xm

/usr/local/lib/python3.6/dist-packages/torch_xla/__init__.py in <module>()
     39 import torch
     40 from .version import __version__
---> 41 import _XLAC
     42 
     43 _XLAC._initialize_aten_bindings()

ImportError: /usr/local/lib/python3.6/dist-packages/_XLAC.cpython-36m-x86_64-linux-gnu.so: undefined symbol: _ZN2at6native6einsumENSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEN3c108ArrayRefINS_6TensorEEE

2 Answers

  1. Make sure you are using compatible versions of pytorch-xla and Python (3.6.9 works well):
curl https://raw.githubusercontent.com/pytorch/xla/master/contrib/scripts/env-setup.py -o pytorch-xla-env-setup.py
python pytorch-xla-env-setup.py --version 20200325
  2. Check that you have specified how to access the TPU. You may need to set "XRT_TPU_CONFIG" or "COLAB_TPU_ADDR", depending on your environment (see the sketch after the examples below).

Something like:

export XRT_TPU_CONFIG="tpu_worker;0;$TPU_IP_ADDRESS:8470"

Or:

export COLAB_TPU_ADDR="10.16.26.36:8676"
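
If you are setting this from inside a notebook rather than a shell, a minimal sketch (the TPU worker address below is a placeholder, not a real one):

import os

# Placeholder TPU worker address; use the one assigned to your VM or Colab runtime.
os.environ["XRT_TPU_CONFIG"] = "tpu_worker;0;10.0.0.2:8470"

import torch_xla
import torch_xla.core.xla_model as xm

print(xm.xla_device())  # prints an XLA device such as xla:1 once the TPU is reachable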

The detailed description is here: https://github.com/pytorch/xla/blob/master/README.md, and there is a worked example at https://cloud.google.com/tpu/docs/tutorials/transformer-pytorch

Also, here is the Google Colab notebook created by the PyTorch team; I just tested it and it works without any changes: https://colab.research.google.com/github/pytorch/xla/blob/master/contrib/colab/getting-started.ipynb

This notebook will show you how to:

  • Install PyTorch/XLA on Colab, which lets you use PyTorch with TPUs.
  • Run basic PyTorch functions on TPUs.
  • Run PyTorch modules and autograd on TPUs.
  • Run PyTorch networks on TPUs.
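
As a rough idea of the smoke test that notebook starts with (a sketch, assuming torch_xla now imports correctly):

import torch
import torch_xla.core.xla_model as xm

dev = xm.xla_device()               # the TPU core exposed to this process
a = torch.randn(2, 2, device=dev)
b = torch.randn(2, 2, device=dev)
print(a + b)                        # the addition runs on the TPU via XLA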

You may want to follow one of those examples and try to reproduce the problem. Good luck!

– AI Mechanic

Please try this:

!pip uninstall -y torch
# Reinstall a torch build (1.8 LTS, CPU) that matches the torch_xla 1.8 wheel installed next.
!pip install torch==1.8.2+cpu -f https://download.pytorch.org/whl/lts/1.8/torch_lts.html
!pip install -q cloud-tpu-client==0.10 https://storage.googleapis.com/tpu-pytorch/wheels/torch_xla-1.8-cp37-cp37m-linux_x86_64.whl
import torch_xla

It worked for me.
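
To confirm that the reinstall lines up, a quick sanity check (the exact version strings depend on the wheels you installed):

import torch
import torch_xla

# Both should report the same release line (e.g. 1.8.x); a mismatch is the usual
# cause of the undefined-symbol ImportError shown in the question.
print("torch:", torch.__version__)
print("torch_xla:", torch_xla.__version__)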

Source: googlecolab/colabtools#2237

– Buoy Rina
  • Correct! Make sure the package versions between PyTorch and PyTorch/XLA are consistent and aligned with what's recommended on Cloud TPU Colabs. – Milad M Oct 20 '21 at 16:48