• Olap@lemmy.world · 2 days ago

    Patching means rebuilding. And packagers don’t really publish diffs. So it’s “use all your bandwidth” instead!

    • [object Object]@lemmy.world · 12 hours ago

      With stuff like rsync, diffs can be calculated on the fly. But that takes way more server CPU than just chucking whole files onto the network.
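
      A toy sketch of that idea (nothing like the real rsync protocol; the block size, the function names, and the simplified Adler-style checksum are all made up for illustration): the server hashes fixed-size blocks of the old file the client already has, then slides a window over the new file looking for blocks it can tell the client to reuse instead of download.

      ```python
      BLOCK = 4096  # illustrative block size

      def weak_sum(data: bytes) -> int:
          """Adler-style weak checksum (simplified)."""
          a = sum(data) % 65521
          b = sum((len(data) - i) * byte for i, byte in enumerate(data)) % 65521
          return (b << 16) | a

      def delta(old: bytes, new: bytes) -> list:
          """Return ('copy', offset) ops for blocks the client already has
          and ('data', bytes) ops for bytes it must actually download."""
          have = {weak_sum(old[i:i + BLOCK]): i
                  for i in range(0, len(old), BLOCK)}
          ops, literal, i = [], bytearray(), 0
          while i < len(new):
              window = new[i:i + BLOCK]
              j = have.get(weak_sum(window))
              if j is not None and old[j:j + BLOCK] == window:  # strong check
                  if literal:
                      ops.append(("data", bytes(literal)))
                      literal.clear()
                  ops.append(("copy", j))
                  i += len(window)
              else:
                  literal.append(new[i])  # no match: this byte gets sent
                  i += 1
          if literal:
              ops.append(("data", bytes(literal)))
          return ops
      ```

      Recomputing the weak checksum at every byte offset, as this naive version does, is exactly where the CPU goes (real rsync updates it incrementally as the window slides). A plain file server never does any of that work.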

    • definitemaybe@lemmy.ca · 2 days ago

      Which is WAY more economical.

      Rebuilding packages takes a lot of compute. Downloading mostly requires just flashing some very small lights very quickly.

      • cmnybo@discuss.tchncs.de · 2 days ago

        If you have multiple computers, you can always set up a caching proxy so you only have to download the packages once.
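
        A minimal sketch of that, assuming an Arch-style mirror (the upstream URL, port, and cache directory are all placeholders; a real setup would more likely put a caching reverse proxy like nginx in front of the mirror, but the idea is the same):

        ```python
        import pathlib
        import urllib.request
        from http.server import BaseHTTPRequestHandler, HTTPServer

        UPSTREAM = "https://geo.mirror.pkgbuild.com"  # placeholder mirror
        CACHE = pathlib.Path("/var/cache/pkgproxy")   # placeholder cache dir

        class CachingHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                local = CACHE / self.path.lstrip("/").replace("/", "_")
                if not local.exists():
                    # First request for this package: fetch it once upstream.
                    with urllib.request.urlopen(UPSTREAM + self.path) as resp:
                        local.write_bytes(resp.read())
                body = local.read_bytes()  # later machines get the cached copy
                self.send_response(200)
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)

        if __name__ == "__main__":
            CACHE.mkdir(parents=True, exist_ok=True)
            # Point each machine's mirror config at http://<this-host>:8080
            HTTPServer(("", 8080), CachingHandler).serve_forever()
        ```

        The first machine to ask for a package pulls it from the internet; every machine after that stays on the LAN.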

        • SmoochyPit@lemmy.ca · 2 days ago

          That reminds me of Chaotic AUR, though it’s an online public repo. It automatically builds popular AUR packages and lets you download the binaries.

          It sometimes builds against outdated libraries/dependencies, though, so for pre-release software I’ve sometimes still had to download and compile locally. Also, you can’t apply your own patches or pin an old commit the way you can with normal AUR packages.

          I’ve found it’s better to use Arch Linux’s official packages when I can, though, since their binaries are always built against the same latest-release dependencies. I haven’t had dependency version issues with those, as long as I’ve avoided partial upgrades.