• njordomir@lemmy.world · 8 hours ago

    Yep, that seems like the ideal decentralized solution. If all the info can be distributed via torrent, anyone with spare disk space can help back up the data and anyone with spare bandwidth can help serve it.

    • Shdwdrgn@mander.xyz · 2 hours ago

      Most of us can’t afford the sort of disk capacity they use, but it would be really cool if there were a project to give volunteers pieces of the archive so the information was spread out. Volunteers could specify whether they want to contribute a few gigabytes or multiple terabytes of drive space, and the software could send out packets any time the content changes. Hmm, this description sounds familiar, but I can’t think of what else might be doing something similar. Does anyone know of anything like that which could be applied to the archive?
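
      To make it concrete, here’s a rough sketch of what the coordinator side could look like. Everything here is made up for illustration: the 25 GB chunk size, the function names, the pledge amounts, and the least-loaded-first assignment strategy are all assumptions, not any real project’s design.

      ```python
      # Hypothetical sketch: hand out archive chunks to volunteers based on
      # how much space they pledged. Chunk size and strategy are assumptions.
      CHUNK_GB = 25

      def assign_chunks(total_gb: int, pledges: dict[str, int]) -> dict[str, list[int]]:
          """Spread chunk indices across volunteers, most free pledge first."""
          num_chunks = -(-total_gb // CHUNK_GB)  # ceiling division
          assignments: dict[str, list[int]] = {name: [] for name in pledges}
          remaining = dict(pledges)  # unused pledged space per volunteer
          for chunk in range(num_chunks):
              # pick whoever has the most unused pledged space left
              name = max(remaining, key=remaining.get)
              if remaining[name] < CHUNK_GB:
                  break  # nobody left with room for a full chunk
              assignments[name].append(chunk)
              remaining[name] -= CHUNK_GB
          return assignments

      # 1000 GB archive, three volunteers with different pledges
      print(assign_chunks(1000, {"alice": 500, "bob": 300, "carol": 100}))
      ```

      If total pledges fall short of the archive size, the leftover chunks simply stay unassigned until more volunteers sign up, which is also how you’d know which pieces need promoting.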

        • njordomir@lemmy.world · 1 hour ago

        Yeah, the projects I’ve heard of that have done something like this broke the data into multiple smaller torrents.

        For example, 1000 GB could be broken into forty 25 GB torrents, and within each torrent you can tell the client to download only some of the files.
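
        The splitting itself is just greedy bucketing. Here’s a minimal sketch; the file names, sizes, and the simple first-fit approach are assumptions for illustration only:

        ```python
        # Hypothetical sketch: greedily pack files into ~25 GB torrent buckets.
        BUCKET_BYTES = 25 * 10**9

        def pack_into_torrents(files: list[tuple[str, int]]) -> list[list[str]]:
            """Group (name, size) pairs into buckets of at most BUCKET_BYTES."""
            buckets: list[list[str]] = [[]]
            used = 0
            for name, size in files:
                if used + size > BUCKET_BYTES and buckets[-1]:
                    buckets.append([])  # start the next torrent
                    used = 0
                buckets[-1].append(name)
                used += size
            return buckets

        # 200 files of 5 GB each = 1000 GB total
        files = [(f"item_{i:04}.warc.gz", 5 * 10**9) for i in range(200)]
        print(len(pack_into_torrents(files)))  # -> 40 torrents of ~25 GB each
        ```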

        At scale, a webpage can show the seed/leech numbers and averages for each torrent over a time period, giving an idea of what is well mirrored and what people can shore up. You could also rotate which torrent is shown as the top download when people go to the contributor page and say they want to help host, ensuring a better distribution.
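
        The rotation logic could be as simple as promoting whichever torrent currently has the fewest seeds. A sketch of that idea below; the stats dictionary and the "seeds"/"leeches" field names are assumptions about what a tracker might report, not a real API:

        ```python
        # Hypothetical sketch: surface the worst-seeded torrent as the
        # suggested download for new volunteers.
        def pick_torrent_to_promote(stats: dict[str, dict[str, int]]) -> str:
            """Return the torrent with the fewest seeds so volunteers shore it up."""
            return min(stats, key=lambda name: stats[name]["seeds"])

        stats = {
            "archive-part-01": {"seeds": 120, "leeches": 4},
            "archive-part-02": {"seeds": 3,   "leeches": 9},  # under-mirrored
            "archive-part-03": {"seeds": 45,  "leeches": 1},
        }
        print(pick_torrent_to_promote(stats))  # -> "archive-part-02"
        ```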