Pro@programming.dev to Technology@lemmy.world · English · 19 days ago
Google quietly released an app that lets you download and run AI models locally (github.com)
cross-posted to: localllama@sh.itjust.works
AmbiguousProps@lemmy.today · 19 days ago:
Why would I use this over Ollama?

Greg Clarke@lemmy.ca · 19 days ago:
Ollama can't run on Android.

(reply) · 19 days ago:
You can use it in Termux.

Greg Clarke@lemmy.ca · 19 days ago:
Has this actually been done? If so, I assume it would only be able to use the CPU.

Euphoma@lemmy.ml · 19 days ago:
Yeah, I have it in Termux. Ollama is in the package repos for Termux. The speed it generates does feel like CPU speed, but idk.
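For anyone wanting to try this: with Ollama installed from the Termux repos and `ollama serve` running, you can talk to it from any local script through Ollama's HTTP API, which listens on port 11434 by default. Here is a minimal Python sketch; the model name `llama3.2` and the prompt are placeholders, so substitute whatever model you actually pulled:

```python
import json
import urllib.request

# Ollama's default local endpoint; assumes `ollama serve` is already
# running (e.g. inside a Termux session) on the default port 11434.
URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3.2",             # placeholder; use a model you pulled with `ollama pull`
    "prompt": "Why is the sky blue?",
    "stream": False,                 # return one JSON object instead of a token stream
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

print(result["response"])  # the generated text
```

On phone hardware this will run on the CPU, which matches the speed Euphoma describes.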