
I'm trying to load and run the LLAMA2 13B model on my local machine; however, I'm not able to test any prompts due to a KeyError / AttributeError (see image attached).


My machine has the following specs:

  • CPU: AMD Ryzen Threadripper 3960X, 24 cores / 48 threads
  • Memory: 128GB
  • GPU: Nvidia Titan RTX

Any ideas?

Thanks in advance! Cheers

DJM

1 Answer


I found the error myself. Source: https://github.com/huggingface/transformers/issues/12503

The input to generate() has to be a tensor, not a BatchEncoding (which is what the tokenizer returns).

You can solve this either by passing the token-id tensor directly:

inputs.input_ids

or by unpacking the BatchEncoding as keyword arguments with **inputs.

So the solution looks like this:

outputs = model.generate(**inputs, max_new_tokens=20)
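For context, here is a minimal end-to-end sketch showing both variants. It assumes the model is loaded with AutoModelForCausalLM and a checkpoint such as "meta-llama/Llama-2-13b-hf" (substitute your own local path or hub id):

from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: replace with your own checkpoint path or hub id
model_name = "meta-llama/Llama-2-13b-hf"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

# The tokenizer returns a BatchEncoding (a dict-like wrapper), not a tensor
inputs = tokenizer("Tell me a joke.", return_tensors="pt").to(model.device)

# Option 1: pass only the token-id tensor
outputs = model.generate(inputs.input_ids, max_new_tokens=20)

# Option 2: unpack the whole BatchEncoding as keyword arguments
outputs = model.generate(**inputs, max_new_tokens=20)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Option 2 is generally preferable because it also forwards the attention mask, which matters when the input batch is padded.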
Tzane