Lee Duna@lemmy.nz to Technology@lemmy.world · English · 8 months ago
Want a more private ChatGPT alternative that runs offline? Check out Jan (bgr.com) · 75 comments
Local LLMs can beat GPT 3.5 now.
🇸🇵🇪🇨🇺🇱🇦🇹🇪🇷@lemmy.world · 8 months ago:
I think a good 13B model running on 12 GB of VRAM can do pretty well. But I'd be hard-pressed to believe anything under 33B would beat 3.5.
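(Editor's aside: the "13B on 12 GB of VRAM" figure is plausible if you assume the weights are quantized, as is common for locally run models. A back-of-envelope sketch, my own illustration rather than anything stated in the thread:)

```python
# Rough estimate of the VRAM needed just to hold a model's weights.
# Excludes KV cache and activations, so real usage is somewhat higher.
def weight_vram_gib(params_billion: float, bits_per_param: float) -> float:
    """Approximate GiB of weight memory for a model of the given size."""
    total_bytes = params_billion * 1e9 * bits_per_param / 8
    return total_bytes / 2**30

# A 13B model in fp16 (~24 GiB) won't fit in 12 GB of VRAM,
# but 4-bit quantization (~6 GiB) leaves room to spare.
print(f"13B @ fp16:  {weight_vram_gib(13, 16):.1f} GiB")
print(f"13B @ 4-bit: {weight_vram_gib(13, 4):.1f} GiB")
```

This is why quantized 13B builds are the usual recommendation for 12 GB cards, while 33B-class models generally need either heavier quantization or more memory.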
miss_brainfart@lemmy.ml · 8 months ago:
Asking as someone who doesn't know anything about any of this: does more B mean better?
alphafalcon@feddit.de · 8 months ago:
B stands for billion (parameters), IIRC.
june@lemmy.world · 8 months ago:
3.5 fuckin sucks though. That's a pretty low bar to set, imo.