• mindbleach@sh.itjust.works · 21 days ago

    All neural networks are vastly approximate. The irrelevance of individual values is why quantization and sparsity work.
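    A toy sketch of the point (my illustration, not from the comment): quantize a random linear layer's weights to 8 bits and the outputs barely move, because no single value carries much of the result.

    ```python
    import numpy as np

    # Hypothetical toy layer: random weights and input, just for illustration.
    rng = np.random.default_rng(0)
    W = rng.normal(0, 0.05, size=(256, 256))  # "trained" weights
    x = rng.normal(size=256)                  # input activations

    # Symmetric int8 quantization: snap every weight to one of 255 levels.
    scale = np.abs(W).max() / 127
    W_q = np.round(W / scale).astype(np.int8)       # 8-bit storage
    W_deq = W_q.astype(np.float64) * scale          # dequantized for compute

    y = W @ x          # full-precision output
    y_q = W_deq @ x    # output after 8-bit quantization

    # Relative output error stays small despite throwing away most precision.
    rel_err = np.linalg.norm(y - y_q) / np.linalg.norm(y)
    print(f"relative error: {rel_err:.4f}")
    ```

    The same redundancy is what pruning exploits: zeroing the smallest weights (sparsity) typically perturbs the output about as little as rounding them does.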