Pro to Technology@lemmy.world • English • 2 months ago
Google quietly released an app that lets you download and run AI models locally (github.com)
Greg Clarke • English • 2 months ago
Has this actually been done? If so, I assume it would only be able to use the CPU.
@Euphoma@lemmy.ml • English • 2 months ago
Yeah, I have it in Termux. Ollama is in the package repos for Termux. The generation speed does feel like CPU speed, but I don't know for sure.
You can use it in Termux.
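A minimal sketch of the Termux route described above. The `ollama` package name comes from the thread ("Ollama is in the package repos for termux"); the specific model tag is an illustrative assumption, not something the thread mentions:

```shell
# Inside a Termux session on Android (no root required).
# Ollama is available from the Termux package repos, as noted above.
pkg update
pkg install ollama

# Start the Ollama server in the background.
ollama serve &

# Pull and chat with a model. The tag below is just an example --
# pick any model small enough for your device's RAM.
ollama run gemma3:1b
```

On a phone this runs inference on the CPU (no GPU offload), which would match the CPU-like generation speed observed above.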