runtime error
Exit code: 1. Reason:
/usr/local/lib/python3.10/site-packages/huggingface_hub/file_download.py:795: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
  warnings.warn(
config.json: 100%|██████████| 762/762 [00:00<00:00, 4.67MB/s]
model.safetensors:  52%|█████▏    | 185M/353M [00:05<00:04, 36.3MB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 6, in <module>
    generator = pipeline('text-generation', model='distilgpt2', max_length=25)  # Reduced max_length for faster inference
  File "/usr/local/lib/python3.10/site-packages/transformers/pipelines/__init__.py", line 788, in pipeline
    framework, model = infer_framework_load_model(
  File "/usr/local/lib/python3.10/site-packages/transformers/pipelines/base.py", line 278, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model distilgpt2 with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>, <class 'transformers.models.gpt2.modeling_gpt2.GPT2LMHeadModel'>).
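The progress output shows `model.safetensors` stalling at ~52% (185M of 353M) before the `ValueError`, which suggests the cached weights file may be incomplete or corrupt. A minimal recovery sketch, assuming a partial download is the cause: delete the model's cache directory so the next `pipeline()` call re-downloads the weights from scratch. The `~/.cache/huggingface/hub` default and the `clear_model_cache` helper name are assumptions for illustration, not part of the original log.

```python
# Hedged sketch: clear a possibly-corrupt Hugging Face model cache so the
# next load re-downloads the weights. Path layout assumes the default
# hub cache (~/.cache/huggingface/hub, directories named "models--<name>").
import shutil
from pathlib import Path


def clear_model_cache(
    model_name: str,
    cache_root: Path = Path.home() / ".cache" / "huggingface" / "hub",
) -> bool:
    """Remove the cached snapshot for `model_name`, if present.

    Returns True when a cache directory was found and deleted,
    False when there was nothing to remove.
    """
    # Hub cache dirs replace "/" in repo ids with "--", e.g. models--distilgpt2.
    cache_dir = cache_root / f"models--{model_name.replace('/', '--')}"
    if cache_dir.is_dir():
        shutil.rmtree(cache_dir)
        return True
    return False


if __name__ == "__main__":
    clear_model_cache("distilgpt2")
```

Alternatively, as the `FutureWarning` in the log itself hints, forcing a fresh download should have the same effect; with a pipeline this could be passed through `model_kwargs`, e.g. `pipeline('text-generation', model='distilgpt2', model_kwargs={'force_download': True})`.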