I am trying to use Haystack to run a model in Google Colab. I am following this tutorial and adapting it to my project.
My problem is in the initialization of the reader:
from haystack.nodes import FARMReader
reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2", use_gpu=True)
I get this error:
/usr/local/lib/python3.9/dist-packages/transformers/models/auto/auto_factory.py in <listcomp>(.0)
618 return default
619
--> 620 def __bool__(self):
621 return bool(self.keys())
622
/usr/local/lib/python3.9/dist-packages/transformers/models/auto/auto_factory.py in _load_attr_from_module(self, model_type, attr)
614 def get(self, key, default):
615 try:
--> 616 return self.__getitem__(key)
617 except KeyError:
618 return default
/usr/local/lib/python3.9/dist-packages/transformers/models/auto/auto_factory.py in getattribute_from_module(module, attr)
559 if module != transformers_module:
560 try:
--> 561 return getattribute_from_module(transformers_module, attr)
562 except ValueError:
563 raise ValueError(f"Could not find {attr} neither in {module} nor in {transformers_module}!")
and, at the end of the traceback, this:
RuntimeError: Failed to import transformers.models.ernie_m.configuration_ernie_m because of the following error (look up to see its traceback):
No module named 'transformers.models.ernie_m.configuration_ernie_m'
I don't get this error every time, only sometimes, and I cannot see a pattern. I also do not understand how ernie_m is related to roberta-base-squad2, since according to this it is a separate model.
For reference, earlier in the notebook I already run:
!pip install transformers
import transformers
from transformers import pipeline
and:
from haystack.nodes import FARMReader
from transformers.models.bert.tokenization_bert import BasicTokenizer
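In case the exact versions matter, here is a small check I can run in a fresh cell right after the imports above. It just prints the installed transformers and haystack versions and whether the module named in the error is actually importable (the module path is copied verbatim from the error message; I am assuming both packages expose __version__, which they normally do):

import importlib.util

import haystack
import transformers

# Print the versions that are actually installed in the Colab runtime.
print("transformers:", transformers.__version__)
print("haystack:", haystack.__version__)

# Check whether the module from the error message exists in this installation.
# find_spec() returns None (or raises ModuleNotFoundError for a missing parent
# package) if the module cannot be located.
try:
    spec = importlib.util.find_spec(
        "transformers.models.ernie_m.configuration_ernie_m"
    )
    print("ernie_m config module:", spec)
except ModuleNotFoundError as e:
    print("ernie_m config module missing:", e)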
The only other similar problem I have found is this, and I do not think it applies to my case, right? Thank you in advance for your time and effort.