cyrano@lemmy.dbzer0.com to Lemmy Shitpost@lemmy.world · 8 days ago
AGI achieved 🤖 (lemmy.dbzer0.com)
253 comments · cross-posted to: [email protected]
Zacryon@feddit.org · 7 days ago
I know that words are tokenized in the vanilla transformer. But do GPT and similar LLMs still do that as well? I assumed they also tokenize at the character/symbol level, possibly mixed with additional abstraction further down the chain.
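For what it's worth, GPT-family models don't tokenize whole words or pure characters: they use byte-level BPE (byte-pair encoding), which sits between the two — frequent words end up as single tokens, rare words get split into subword pieces. A toy sketch of the greedy merge idea (the merge table here is made up for illustration, not any model's real vocabulary):

```python
# Toy sketch of BPE-style subword tokenization, as used by GPT-family models.
# The merge table below is hypothetical; real tokenizers learn ~50k+ merges
# from a corpus and operate on bytes rather than characters.

def bpe_tokenize(word, merges):
    """Greedily apply the best-ranked adjacent merge until none apply."""
    tokens = list(word)  # start at the character level
    while True:
        # rank every adjacent pair; unknown pairs get infinite rank
        pairs = [(merges.get((a, b), float("inf")), i)
                 for i, (a, b) in enumerate(zip(tokens, tokens[1:]))]
        rank, i = min(pairs, default=(float("inf"), -1))
        if rank == float("inf"):
            return tokens  # no learnable merge left
        tokens[i:i + 2] = [tokens[i] + tokens[i + 1]]  # merge the pair

# hypothetical merge table: pairs "learned" from a corpus, lower rank = merged first
merges = {("t", "h"): 0, ("th", "e"): 1, ("i", "n"): 2,
          ("k", "in"): 3, ("kin", "g"): 4}

print(bpe_tokenize("the", merges))       # frequent word collapses to one token
print(bpe_tokenize("thinking", merges))  # rarer word splits into subword pieces
```

With this table, `"the"` becomes the single token `['the']`, while `"thinking"` splits into `['th', 'in', 'king']` — which is why LLMs often can't "see" individual letters inside a token.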