• ricdeh@lemmy.world · 5 days ago

    Untrue. There are small models today that produce better output than previous “flagships” like GPT-2. You can also achieve far more than we currently do, with far less energy, by developing novel, specialised hardware (neuromorphic computing).