
I am trying to run a "t5-base" text summarization model. The code worked when I first ran it, but after installing/reinstalling some packages it no longer works. Can anyone please tell me how to resolve this issue?

Here is my code:

import torch
from transformers import AutoModelWithLMHead, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('t5-base')
model = AutoModelWithLMHead.from_pretrained('t5-base', return_dict=True)

# `text` holds the article to be summarized.
inputs = tokenizer.encode("summarize: " + text,
                          return_tensors='pt',
                          max_length=512,
                          truncation=True)
summary_ids = model.generate(inputs, max_length=150, min_length=80, length_penalty=5., num_beams=2)
text = tokenizer.decode(summary_ids[0])
text = text.replace("<pad>", "").replace("</s>", "")
text
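
As an aside, `tokenizer.decode` accepts a `skip_special_tokens` flag, so the manual `<pad>`/`</s>` replacements can probably be dropped; a minimal variant of the decoding step, assuming the rest of the snippet stays the same:

# Decode and strip special tokens such as <pad> and </s> in one step.
summary_text = tokenizer.decode(summary_ids[0], skip_special_tokens=True)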

Below is the error message I get:

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-46-2c9eeafa599f> in <module>
      1 import torch
----> 2 from transformers import AutoModel, AutoTokenizer

~/opt/anaconda3/lib/python3.7/site-packages/transformers/__init__.py in <module>
     29 # Check the dependencies satisfy the minimal versions required.
     30 from . import dependency_versions_check
---> 31 from .utils import (
     32     _LazyModule,
     33     is_flax_available,

ImportError: cannot import name '_LazyModule' from 'transformers.utils' (/Users/sangjinlee/opt/anaconda3/lib/python3.7/site-packages/transformers/utils/__init__.py)
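
For reference, a minimal way to check which transformers version is actually installed in the failing environment (a sketch assuming setuptools' pkg_resources is available there; it does not need to import transformers itself):

# Report the installed transformers version without importing the broken package.
import pkg_resources
print(pkg_resources.get_distribution("transformers").version)

An ImportError on `_LazyModule` like this often means files from two different transformers versions are mixed in site-packages after a partial install/uninstall, so uninstalling and reinstalling transformers in that same environment is a common way to clear it.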
  • Starting from a fresh environment can be a good last resort, but I suggest first running `pip install 'lightning-flash[text]' --upgrade`. – meti Apr 17 '22 at 15:06

0 Answers