Hi, when I run it on my Xeon E5, I get an error (see screenshot).
On another PC (with an Intel i5), it works.
Is it 32-bit or 64-bit? It's probably an issue with llama.cpp. I intend to rework the LLM loader to follow the OpenAI API standard so people can use offloaded servers if they want.
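For context, "OpenAI standard" means the loader would talk to any server exposing the OpenAI-style `/v1/chat/completions` endpoint (llama.cpp's server does this). A minimal sketch of what such a request looks like — the base URL and model name here are placeholders, not this project's actual config:

```python
import json

def build_chat_request(prompt, model="local-model", base_url="http://localhost:8080"):
    """Build an OpenAI-style chat completion request (URL + JSON body).

    Any OpenAI-compatible server (llama.cpp server, vLLM, the real OpenAI API)
    accepts this same shape, which is why targeting it lets users swap in an
    offloaded/remote server without changing the client code.
    """
    url = f"{base_url}/v1/chat/completions"
    payload = {
        "model": model,  # many local servers ignore this, but the schema requires it
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return url, json.dumps(payload)

url, body = build_chat_request("Hello")
print(url)
```

Posting that body with `Content-Type: application/json` (plus an `Authorization` header for hosted services) is all a compatible client needs.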
They're both Windows 10 x64.