
I'm trying to load the Hugging Face Transformers sentiment-analysis model in IPython:

from transformers import pipeline
...
sp = pipeline('sentiment-analysis')

Loading the model fails and produces the following output:

No model was supplied, defaulted to distilbert-base-uncased-finetuned-sst-2-english (https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english)
HTTPSConnectionPool(host='cdn-lfs.huggingface.co', port=443): Max retries exceeded with url: /distilbert-base-uncased-finetuned-sst-2-english/60554cbd7781b09d87f1ececbea8c064b94e49a7f03fd88e8775bfe6cc3d9f88 (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1129)')))
HTTPSConnectionPool(host='cdn-lfs.huggingface.co', port=443): Max retries exceeded with url: /distilbert-base-uncased-finetuned-sst-2-english/60554cbd7781b09d87f1ececbea8c064b94e49a7f03fd88e8775bfe6cc3d9f88 (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1129)')))
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Input In [2], in <cell line: 1>()
----> 1 sp=pipeline('sentiment-analysis')

File ~/anaconda3/envs/condaenv/lib/python3.9/site-packages/transformers/pipelines/__init__.py:543, in pipeline(task, model, config, tokenizer, feature_extractor, framework, revision, use_fast, use_auth_token, model_kwargs, pipeline_class, **kwargs)
    539 # Infer the framework from the model
    540 # Forced if framework already defined, inferred if it's None
    541 # Will load the correct model if possible
    542 model_classes = {"tf": targeted_task["tf"], "pt": targeted_task["pt"]}
--> 543 framework, model = infer_framework_load_model(
    544     model,
    545     model_classes=model_classes,
    546     config=config,
    547     framework=framework,
    548     revision=revision,
    549     task=task,
    550     **model_kwargs,
    551 )
    553 model_config = model.config
    555 load_tokenizer = type(model_config) in TOKENIZER_MAPPING or model_config.tokenizer_class is not None

File ~/anaconda3/envs/condaenv/lib/python3.9/site-packages/transformers/pipelines/base.py:231, in infer_framework_load_model(model, config, model_classes, task, framework, **model_kwargs)
    228             continue
    230     if isinstance(model, str):
--> 231         raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
    233 framework = "tf" if model.__class__.__name__.startswith("TF") else "pt"
    234 return framework, model

ValueError: Could not load model distilbert-base-uncased-finetuned-sst-2-english with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForSequenceClassification'>, <class 'transformers.models.distilbert.modeling_distilbert.DistilBertForSequenceClassification'>).

I tried removing the ~/.cache/huggingface folder, but I'm still getting the same error. I'd appreciate suggestions on what the cause of the issue could be and how to resolve it.
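
In case it helps narrow things down, here is a minimal check I can run in the same environment to see whether the failure reproduces outside of transformers (the URL follows the hub's resolve/main pattern for the same model):

import requests

# Fetch one of the model files directly. If this raises the same SSLError,
# the problem is in the network path (proxy/VPN/TLS interception), not in
# transformers itself.
url = ("https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english"
       "/resolve/main/config.json")
resp = requests.get(url, timeout=30)
print(resp.status_code)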

TheMonarch
  • For me it looks more like a network problem, because of the `SSLEOFError`. Are you using a proxy or VPN? Maybe this discussion offers some help: https://stackoverflow.com/q/69490861/7924573 – tschomacker Apr 28 '22 at 12:15
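
Following up on the proxy angle: requests, which transformers uses for downloads, honors the standard proxy environment variables, so if a proxy is in the picture it can be set for the Python process. A minimal sketch; the proxy URL is a placeholder:

import os

# Placeholder proxy URL -- replace with the actual proxy, if any.
# requests picks these variables up automatically for https:// downloads.
os.environ["HTTPS_PROXY"] = "http://proxy.example.com:3128"
os.environ["HTTP_PROXY"] = "http://proxy.example.com:3128"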
