SaltyIceteaMaker@lemmy.ml to Linux@lemmy.ml · 10 months ago
any cool ideas what i could do with termux?
acec@lemmy.world · 10 months ago
Compile llama.cpp, download a small GGML LLM model, and you will have a quite intelligent assistant running on your phone.
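The steps in this comment can be sketched as the following Termux session. Note that the model URL is a placeholder (pick any small quantized model yourself), and that current llama.cpp builds use the newer GGUF format rather than the older GGML format the comment mentions:

```shell
# Inside Termux: install the build toolchain
pkg update && pkg install -y git cmake clang wget

# Clone and build llama.cpp
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build && cmake --build build --config Release

# Download a small quantized model (placeholder URL -- substitute a real one)
wget -O model.gguf https://example.com/some-small-model.gguf

# Run the model interactively on the phone
./build/bin/llama-cli -m model.gguf -p "Hello, who are you?"
```

Older llama.cpp revisions built with plain `make` and shipped a `./main` binary instead of `llama-cli`; if the commands above don't match your checkout, consult the README of the revision you cloned.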
bassomitron@lemmy.world · 10 months ago
Would that actually be decent? Even 6B models feel way too rudimentary after experiencing 33+B models and/or ChatGPT. I haven't tried those really scaled-down and optimized models, though!