corbin@infosec.pub to Lemmy Shitpost@lemmy.world · English · 6 days ago: "can't beat the classics" (image post)
lmuel@sopuli.xyz · English · 6 days ago:
Well, in some ways they are. It also depends a lot on the hardware you have, of course. A normal 16GB GPU won't fit huge LLMs.
The smaller ones are getting impressively good at some things, but a lot of them still struggle with non-English languages, for example.
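For a sense of scale, here's a rough back-of-the-envelope sketch (Python, with a few common model sizes picked as illustrative examples) of how much VRAM just the weights take. It ignores KV cache, activations, and runtime overhead, so real usage is higher, but it shows why a 16GB card fits small quantized models and not huge ones.

```python
# Rough estimate of VRAM needed to hold LLM weights only.
# Ignores KV cache, activations, and framework overhead, so actual usage is higher.

def weight_vram_gib(params_billion: float, bytes_per_param: float) -> float:
    """GiB needed just to store the weights."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

for name, params in [("7B", 7), ("13B", 13), ("70B", 70)]:
    fp16 = weight_vram_gib(params, 2)    # 16-bit weights
    q4 = weight_vram_gib(params, 0.5)    # ~4-bit quantized
    print(f"{name}: ~{fp16:.0f} GiB at fp16, ~{q4:.0f} GiB at 4-bit")
```

By this rough math a 7B model quantized to ~4 bits is only a few GiB, while a 70B model at fp16 is well over 100 GiB, which is why it can't fit on a single 16GB GPU.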