Does not work in LM Studio
#6
opened by mailxp
Good model, I'm eager to try it! But I got an error:
Failed to load model
error loading model: llama_model_loader: failed to load model from ...../lm_studio/models/microsoft/bitnet-b1.58-2B-4T-gguf/ggml-model-i2_s.gguf
mac M4 Sequoia
Metal llama.cpp v1.26.0
LM Studio MLX v0.13.2
Same here. Also Mac M4 32GB Sequoia. No problems with loading any other model into LM Studio.
Same. Even if the GPU Offload option is set to 0, the model load fails.
MacBook Pro M3 MAX 64GB Sequoia.
Metal llama.cpp v1.27.1(Beta)
LM Studio MLX v0.13.2
Same. Windows 11, 16 GB RAM. LM Studio 0.3.14.
🥲 Failed to load the model
Failed to load model
error loading model: llama_model_loader: failed to load model from D:\AI\Modules\publisher\model\bitnet\bitnet.gguf
Same in Ollama
ollama run hf.co/microsoft/bitnet-b1.58-2B-4T-gguf
Error: unable to load model: C:\Users\User\.ollama\models\blobs\sha256-4221b252fdd5fd25e15847adfeb5ee88886506ba50b8a34548374492884c2162
Same problem on my Mac with a Max chip.
I also have this problem in LM Studio.
Same in LM Studio 0.3.15.
Same in Ollama 0.5.4.
Hello, with LM Studio 0.3.15 I have the same error on my MacBook Pro M1 / 16GB / Sequoia.
Does a bitnet runtime exist?
🥲 Failed to load the model
error loading model: llama_model_loader: failed to load model from /Users/********/.cache/lm-studio/models/microsoft/bitnet-b1.58-2B-4T-gguf/ggml-model-i2_s.gguf
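A possible explanation for all of the reports above: the `i2_s` quantization used by this GGUF is BitNet-specific and is not supported by the stock llama.cpp builds that LM Studio and Ollama embed, so the loader rejects the file on every platform. Microsoft publishes a dedicated runtime, bitnet.cpp (the `microsoft/BitNet` repository on GitHub). The steps below are a sketch based on that repo's documented workflow; the exact script names, flags, and model paths (`setup_env.py`, `run_inference.py`, `-q i2_s`) should be double-checked against the repo's current README before relying on them:

```shell
# Clone Microsoft's bitnet.cpp runtime (includes its patched llama.cpp as a submodule)
git clone --recursive https://github.com/microsoft/BitNet.git
cd BitNet

# Install the Python dependencies listed by the repo
pip install -r requirements.txt

# Download the i2_s GGUF that fails in LM Studio/Ollama
huggingface-cli download microsoft/bitnet-b1.58-2B-4T-gguf \
    --local-dir models/BitNet-b1.58-2B-4T

# Build the runtime for the i2_s kernel and point it at the model directory
python setup_env.py -md models/BitNet-b1.58-2B-4T -q i2_s

# Run an interactive chat session against the model
python run_inference.py \
    -m models/BitNet-b1.58-2B-4T/ggml-model-i2_s.gguf \
    -p "You are a helpful assistant" -cnv
```

Until LM Studio's and Ollama's bundled llama.cpp versions gain `i2_s` support, this dedicated runtime appears to be the intended way to run the model locally.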