• mindbleach@sh.itjust.works

    All neural networks are vastly approximate. The irrelevance of individual values is why quantization and sparsity work.
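
    A toy sketch of that point (my own illustration, not from the comment): round a random "layer's" weights to 8-bit integers and compare outputs. Because no single weight matters much, the output barely moves.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.normal(size=(256, 256)).astype(np.float32)  # dense weight matrix
    x = rng.normal(size=256).astype(np.float32)         # input vector

    # Symmetric int8 quantization: scale so the max |weight| maps to 127.
    scale = np.abs(W).max() / 127.0
    W_q = np.round(W / scale).astype(np.int8)
    W_dq = W_q.astype(np.float32) * scale  # dequantized weights

    y = W @ x
    y_q = W_dq @ x
    rel_err = np.linalg.norm(y - y_q) / np.linalg.norm(y)
    print(f"relative output error after int8 quantization: {rel_err:.4f}")
    ```

    The per-weight rounding error is tiny relative to the weight distribution, and it largely averages out across the dot product, so the relative output error lands around a percent rather than anywhere near 1/256.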