• FiniteBanjo@feddit.online
    3 days ago

    Is this why they’re trying to obtain so damn many memory components? So they can ship the entire dataset the statistical models rely on to consumers?

    I really didn’t think they could get more wasteful, but clearly I was wrong: they want to eat up bandwidth like a fish drinks water. Or maybe they actually want to SHIP the DATABASE?

    Here is what will change if your AI could remember better: nothing. LLMs are statistical models. The next most likely token, derived from the training data, would rarely if ever change because a new prompt got memorized. Normally the models need to be retrained whenever the data changes; if you could bypass that, you would probably lose accuracy, and even if you didn’t, you still wouldn’t gain any tangible accuracy.
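    The point above can be sketched with a toy next-token model (this is a deliberately tiny illustration, not any real LLM; the counts and names below are made up). The "trained weights" are a frozen table, and "memorizing" a prompt only extends the context, never the table, so the prediction is unchanged:

    ```python
    from collections import Counter

    # Frozen "weights": next-token counts fixed at training time
    # (hypothetical numbers for illustration).
    TRAINED_COUNTS = {
        ("the",): Counter({"cat": 5, "dog": 3}),
    }

    def next_token(context):
        """Greedily pick the most likely next token from the frozen counts."""
        counts = TRAINED_COUNTS.get(tuple(context))
        if not counts:
            return None
        return counts.most_common(1)[0][0]

    # "Remembering" a new prompt adds nothing to TRAINED_COUNTS,
    # so the distribution the model samples from never moves.
    memorized_prompt = ["please", "say", "dog"]
    print(next_token(["the"]))  # still "cat" — memory changed nothing
    ```

    To actually shift that output you would have to update the counts themselves, which is retraining, which is exactly the expensive step the memory features don’t do.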