• BlameTheAntifa@lemmy.world · 3 hours ago

    The difference is that we’ll just be running small, specialized, on-demand models instead of huge, resource-heavy, all-purpose models. It’s already being done. Just look at how Google and Apple are approaching AI on mobile devices. You don’t need a lot of power for that, just plenty of storage.
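
For anyone wondering what that pattern looks like in code, here's a minimal sketch of the "small, specialized, on-demand" approach, assuming the Hugging Face `transformers` library. The model IDs are made-up placeholders, not real checkpoints:

```python
# Rough sketch: one small task-specific model per job, loaded from
# local storage only when a request actually needs it, instead of
# keeping one huge all-purpose model resident in memory.
from transformers import pipeline

# Placeholder model IDs -- swap in whatever compact on-device
# checkpoints actually fit your hardware.
TASK_MODELS = {
    "summarization": "my-org/tiny-summarizer",        # placeholder
    "text-classification": "my-org/tiny-classifier",  # placeholder
}

_loaded = {}  # cache: pay the load cost once per task, not per call

def run_task(task: str, text: str):
    """Load the specialized model for `task` on first use, then run it."""
    if task not in _loaded:
        _loaded[task] = pipeline(task, model=TASK_MODELS[task])
    return _loaded[task](text)

# e.g. run_task("summarization", long_article) pulls in only the tiny
# summarizer; nothing else ever has to be in memory at the same time.
```

The trade-off is exactly the one described above: you spend storage keeping several small models around, and in exchange each individual request needs very little compute.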