
I want to use a large language model from Hugging Face, for example Falcon. I know you can use

from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "tiiuae/falcon-7b",
    trust_remote_code=True,
)

to load the model, but I want to clone the repository or download the weights so that I can use them in other virtual environments too (see the sketch at the end of this question for how I fetched them). But currently, when I try to load the model from the cloned repository with

model = AutoModelForCausalLM.from_pretrained('./falcon-7b/')

I get this error:

    945 has_remote_code = "auto_map" in config_dict and "AutoConfig" in config_dict["auto_map"]
    946 has_local_code = "model_type" in config_dict and config_dict["model_type"] in CONFIG_MAPPING
--> 947 trust_remote_code = resolve_trust_remote_code(
    948     trust_remote_code, pretrained_model_name_or_path, has_local_code, has_remote_code
    949 )
    951 if has_remote_code and trust_remote_code:
...
--> 535     signal.signal(signal.SIGALRM, _raise_timeout_error)
    536     signal.alarm(TIME_OUT_REMOTE_CODE)
    537     while trust_remote_code is None:

AttributeError: module 'signal' has no attribute 'SIGALRM'

Do I have to do anything with the weights first?
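
For completeness, this is roughly how I produced the local ./falcon-7b copy (a sketch; I used snapshot_download from huggingface_hub, and the local_dir path is just what I picked):

from huggingface_hub import snapshot_download

# Download the full model repository (weights, config, and the custom
# modelling code the Falcon repo ships) into a local folder.
snapshot_download(repo_id="tiiuae/falcon-7b", local_dir="./falcon-7b")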

1 Answer


It might be that you are running this on Windows. The signal module has no SIGALRM attribute on Windows; that signal only exists on Unix-like systems (Linux/macOS). See also this question.
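
As your traceback shows, transformers only reaches the signal.alarm call while it is interactively asking whether to trust the remote code (the while trust_remote_code is None: loop). A likely workaround is to answer that question up front by passing trust_remote_code=True for the local path as well, for example:

from transformers import AutoModelForCausalLM

# Passing trust_remote_code=True explicitly skips the interactive prompt,
# so the SIGALRM-based timeout is never set up.
model = AutoModelForCausalLM.from_pretrained(
    "./falcon-7b/",
    trust_remote_code=True,
)

Newer transformers releases also support Falcon natively, so upgrading might let you avoid the remote-code path entirely.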

Kevin Spaghetti