Runtime error

Exit code: 1. Reason:

model-00006-of-00006.safetensors:  16%|█▋        | 811M/4.99G [00:06<00:29, 140MB/s]
model-00006-of-00006.safetensors:  26%|██▌       | 1.28G/4.99G [00:07<00:18, 199MB/s]
model-00006-of-00006.safetensors:  42%|████▏     | 2.08G/4.99G [00:08<00:08, 342MB/s]
model-00006-of-00006.safetensors:  54%|█████▍    | 2.71G/4.99G [00:09<00:05, 393MB/s]
model-00006-of-00006.safetensors:  73%|███████▎  | 3.65G/4.99G [00:10<00:02, 522MB/s]
model-00006-of-00006.safetensors:  85%|████████▌ | 4.25G/4.99G [00:12<00:01, 527MB/s]
model-00006-of-00006.safetensors:  97%|█████████▋| 4.85G/4.99G [00:13<00:00, 511MB/s]
model-00006-of-00006.safetensors: 100%|██████████| 4.99G/4.99G [00:13<00:00, 368MB/s]
Loading checkpoint shards:   0%|          | 0/6 [00:00<?, ?it/s]
Loading checkpoint shards: 100%|██████████| 6/6 [00:00<00:00, 96791.63it/s]
generation_config.json:   0%|          | 0.00/222 [00:00<?, ?B/s]
generation_config.json: 100%|██████████| 222/222 [00:00<00:00, 1.65MB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 11, in <module>
    phi4_model = AutoModelForCausalLM.from_pretrained(phi4_model_path, device_map="auto", torch_dtype="auto")
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 604, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 288, in _wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 5279, in from_pretrained
    dispatch_model(model, **device_map_kwargs)
  File "/usr/local/lib/python3.10/site-packages/accelerate/big_modeling.py", line 504, in dispatch_model
    raise ValueError(
ValueError: You are trying to offload the whole model to the disk. Please use the `disk_offload` function instead.
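The traceback shows `accelerate.dispatch_model` rejecting the computed device map: with `device_map="auto"` on this hardware there is no GPU or CPU memory budget for the weights, so every module would end up mapped to disk, which `dispatch_model` refuses to do. A minimal sketch of the workaround the error message points at, assuming a CPU-only machine with enough RAM to materialize the checkpoint; the model id and offload directory below are placeholders, not values taken from app.py:

```python
# Sketch only: load the checkpoint without a device map, then hand the whole
# model to accelerate's disk_offload, as the error message suggests.
from accelerate import disk_offload
from transformers import AutoModelForCausalLM

phi4_model_path = "microsoft/phi-4"  # hypothetical; substitute the path used in app.py

# Load on CPU (no device_map), then offload the weights to a local folder on disk.
phi4_model = AutoModelForCausalLM.from_pretrained(phi4_model_path, torch_dtype="auto")
disk_offload(phi4_model, offload_dir="offload")  # "offload" is an assumed directory name
```

Note that inference with the whole model offloaded to disk is very slow, since weights are streamed from disk on every forward pass; giving the Space enough GPU or CPU memory (or a smaller/quantized checkpoint) avoids the offload entirely.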
