
Trying to use Ollama to self-host locally, and I keep getting "Failed to complete action. Please try again."

(Things of note: I have no idea what I'm doing, and I'm cheap XD )

I don’t think you need to include .Q4_K_M — just use the original model name. If that doesn’t work, I recommend switching to LM Studio (latest beta) or maybe koboldcpp. I’ve heard many complaints that the Ollama API doesn’t work.
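For what it’s worth, a minimal sketch of why the suffix trips things up (the file name below is just an example, not from your setup): GGUF downloads bake the quantization into the file name, but Ollama’s model tags don’t carry it, so the name you type into the tool should match whatever `ollama list` prints, not the file name.

```shell
#!/bin/sh
# GGUF releases bake the quantization into the file name, but Ollama
# model tags drop it -- e.g. "llama3:8b" rather than "llama3-8b.Q4_K_M".
file="llama-3-8b-instruct.Q4_K_M.gguf"   # hypothetical example file

name="${file%.gguf}"     # drop the .gguf extension
name="${name%.Q4_K_M}"   # drop the quantization suffix
echo "$name"             # prints: llama-3-8b-instruct

# To see the exact tags your local Ollama actually knows about, run:
#   ollama list
```

If the name checks out and it still fails, make sure Ollama is actually running and listening on its default port (11434) before blaming the tool.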