• brucethemoose@lemmy.world

LLM inference requires very specific servers that aren’t good for much else (in terms of what companies usually do), though. And they go ‘obsolete’ even more quickly.

I guess what I’m saying is that the premise would be a pretty flimsy justification for a more general upgrade.