Pro@programming.dev to Technology@lemmy.world · English · 7 days ago
Google quietly released an app that lets you download and run AI models locally (github.com) — 47 comments
AmbiguousProps@lemmy.today · 7 days ago
That's fair, but I think I'd rather self-host an Ollama server and connect to it with an Android client in that case. Much better performance.
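For context, a self-hosted Ollama server exposes a plain HTTP API (by default on port 11434) that any client on the network can call. A minimal sketch of such a request, assuming a LAN address and model name that are placeholders here:

```python
import json
import urllib.request

# Placeholder address of the self-hosted Ollama server on the LAN
OLLAMA_HOST = "http://192.168.1.50:11434"

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming POST request for Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        f"{OLLAMA_HOST}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_generate_request("deepseek-r1", "Why is the sky blue?")
# response = urllib.request.urlopen(req)  # only works with the server running
```

An Android client would issue the same POST over the local network, which is why a phone with a reliable connection to the home server gets much better performance than running the model on-device.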
Greg Clarke@lemmy.ca · 7 days ago
Yes, that's my setup. But this will be useful for cases where the internet connection is not reliable.
OhVenus_Baby@lemmy.ml · 6 days ago
How does Ollama compare to GPT models? I used the paid tier for work and I'm curious how this stacks up.
AmbiguousProps@lemmy.today · 6 days ago
It's decent, with the deepseek model anyway. It's not as fast and has a lower parameter count, though. You might just need to try it and see if it fits your needs or not.