Tea@programming.dev to Technology@lemmy.world · English · 1 month ago
Nvidia creates gaming-centric AI chatbot that runs on your GPU, locally. (www.nvidia.com)
WolfLink@sh.itjust.works · English · 1 month ago
You can run your own LLM chatbot with https://ollama.com/
They have some really small ones that only require like 1GB of VRAM, but you’ll generally get better results if you pick the biggest model that fits on your GPU.
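As a minimal sketch of what that looks like once Ollama is installed and serving locally: the snippet below uses the official `ollama` Python client (`pip install ollama`) to chat with a small local model. The model name `llama3.2:1b` is just an example of a ~1GB-class model and is assumed to have been downloaded first with `ollama pull llama3.2:1b`; swap in whatever is the largest model that fits on your GPU.

```python
# Minimal sketch: chat with a locally running Ollama model via the Python client.
# Assumes the Ollama server is running and that the example model "llama3.2:1b"
# (a small ~1GB-class model) has already been pulled with `ollama pull llama3.2:1b`.
import ollama

response = ollama.chat(
    model="llama3.2:1b",  # example small model; prefer the biggest one your VRAM allows
    messages=[
        {"role": "user", "content": "Suggest a stealth archer build for Skyrim."},
    ],
)

# Print the assistant's reply text.
print(response["message"]["content"])
```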