This may be a stupid question, but I just got back into pirating some shows and movies and realized that many of the QxR files are much smaller than what I downloaded in the past. Is it likely that I am sacrificing a noticeable amount of quality if I replace my files with the smaller QxR ones?

For example, I have Spirited Away from 2017 at 9.83 GB, but I see the QxR is only 6.1 GB. I also have The Office from 2019, and the entire show (no bonus content) is about 442 GB, while the QxR version is only 165.7 GB. The dates are just what’s on my hard drive, so I can’t speak to their actual origin, but they would’ve been from RARBG. (Edit to add: I also can’t really speak to the quality of the downloads. Back then I was just grabbing whatever was available at a reasonable size, so I wasn’t deliberately seeking out high-quality movies and shows - a simple 1080p in the listing was enough for me.)

I did some side-by-side comparisons on episodes of The Office (on my PC with headphones, nothing substantial), and I didn’t notice any differences between the two.

Thoughts on this? Have people gotten better at ripping/compressing/whatever, so that they can now hit smaller sizes without sacrificing noticeable quality?

  • cmnybo@discuss.tchncs.de · 1 year ago

    Newer codecs are more efficient. H.265 and AV1 are often 2/3 to 1/2 the size of an H.264 file for the same quality.
    Of course, there are also people uploading lower-quality files.
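
    For what it’s worth, this is roughly how those smaller encodes get made. A minimal sketch, assuming ffmpeg with libx265 is installed; the file names and CRF value are placeholders, not anything from a real release:

      # Re-encode an H.264 file to H.265/HEVC and compare sizes.
      # Assumes ffmpeg with libx265 is on the PATH; names are examples.
      import os
      import subprocess

      src = "episode_h264.mkv"   # hypothetical H.264 source
      dst = "episode_h265.mkv"

      subprocess.run([
          "ffmpeg", "-i", src,
          "-c:v", "libx265",     # re-encode video with HEVC
          "-crf", "22",          # quality target; lower = better/bigger
          "-preset", "slow",     # slower presets compress better
          "-c:a", "copy",        # leave the audio track untouched
          dst,
      ], check=True)

      print(f"{src}: {os.path.getsize(src) / 1e9:.2f} GB")
      print(f"{dst}: {os.path.getsize(dst) / 1e9:.2f} GB")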

    • BolexForSoup@kbin.social · 1 year ago

      As an editor I loved/hated H.265 until like… a year ago. Some NLEs dragged their feet on support for some odd reason.

      • empireOfLove@lemmy.one · 1 year ago

        Licensing, probably. H.265 is very much not open, and you have to pay the MPEG piper to actually use it.

          • Laser@feddit.de · 1 year ago

            Nothing; the licenses are for content providers and equipment manufacturers. Obviously you pay for the license in the end when purchasing the goods, but the amount is small.

        • BolexForSoup@kbin.social · 1 year ago

          They have licensed countless other codecs, and tons of cameras adopted it before they supported it. These aren’t some FOSS hobbyist projects; these are professional NLEs for Hollywood-level work. There’s no excuse if you ask me. Hell, Resolve had it like 2-3 years prior, I believe.

        • BolexForSoup@kbin.social · 1 year ago

          I really didn’t get it! When I got my GH5 I was pumped to do 10-bit 4:2:2 H.265. I really wanted to see the latitude we could get at that compression. Premiere and FCPX in particular went “lol no” for like 5 years.

          • 0x4E4F@infosec.pub · 1 year ago

            Yeah, beats me, I was surprised as well. Like, why do we still have to work with AVC clips, why can’t I just import an HEVC clip… Premiere: nope, that ain’t happenin’.

            • BolexForSoup@kbin.social · 1 year ago

              While I have you here: I recently had a project where they shipped me three separate FedEx packages of loose, unlabeled SD cards, all .mts files. What on God’s earth happened on that shoot?

              Oh, and the footage was all interlaced, I shit you not.

              • 0x4E4F@infosec.pub · 1 year ago

                Yeah, I can relate to that 😔. I work at a TV station in a country that used the PAL standard, and… well, every fucking shot is interlaced 🤦. Not only that, but the output from the station is as well 🤦. Why? Backwards compatibility… what in the actual fuck 🤦… it’s a stream, the decoder doesn’t care about that, it can decode any frame rate, any frame type, it’s all digital now 🤦. Tried explaining this, nope, we’re still doing interlaced.

                And, of course, after the cable companies have their way with the signal, the output is shit… not to mention they also archive the material as interlaced 🤦… and then people from outside the station complain about the material being garbage… they still don’t budge.

                Just goes to show you what management is all about these days. They have the power, so they’re gonna use it any way they see fit. Why? Cuz they’re THE BOSS GOD DAMN IT 😠.

      • Nawor3565@lemmy.blahaj.zone · 1 year ago

        It completely depends on the specific video file. HEVC and AV1 are more efficient in general, but most of their benefits become apparent with 4K video, which they were specifically designed to handle better than AVC. It also depends on your phone’s software and hardware, as it might not be fast enough to encode in real time with higher compression settings (and you don’t get to use things like 2-pass encoding, which can drastically lower the bitrate without sacrificing visual quality).

      • BolexForSoup@kbin.social · 1 year ago

        It can be. It heavily depends on the bitrate, which as a shooter I can often control (depending on the camera), and with a wide array of options at that.

      • roofuskit@lemmy.world · 1 year ago

        I assume you mean H.265 recorded on your phone? That is live-encoded in a single pass, so it doesn’t compress as much in that scenario. When you give a system more time than the real-time playback of the video, it can encode things more efficiently.
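
        To illustrate, here’s a minimal sketch of the offline two-pass approach this is being contrasted with (a phone only gets one live pass), assuming ffmpeg with libx265; the file names and the 3000k target are made up:

          # Two-pass HEVC encode: pass 1 analyzes the video, pass 2 spends
          # the bit budget where it is needed. Assumes ffmpeg with libx265.
          import subprocess

          src, dst, bitrate = "clip.mp4", "clip_2pass.mp4", "3000k"

          # Pass 1: analysis only, the output is thrown away.
          subprocess.run([
              "ffmpeg", "-y", "-i", src,
              "-c:v", "libx265", "-b:v", bitrate,
              "-x265-params", "pass=1",
              "-an", "-f", "null", "/dev/null",
          ], check=True)

          # Pass 2: the real encode, using the stats gathered in pass 1.
          subprocess.run([
              "ffmpeg", "-i", src,
              "-c:v", "libx265", "-b:v", bitrate,
              "-x265-params", "pass=2",
              "-c:a", "copy", dst,
          ], check=True)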

  • BrownianMotion@lemmy.dbzer0.com · 1 year ago

    In 2017, most content was H.264 and 1080p. That typically made a movie about 10 GB with just 5.1 sound. The same movie with DTS 7.1 and possibly 5.1 etc. would be 12-16 GB. Today, that same 16 GB movie in H.265 would be 6-8 GB.

    The thing is that movies are now typically 4K with Atmos etc., which would have been 30+ GB in 2017. For the same given “quality” and bitrate settings, that movie would be ~15 GB today.

    What we are seeing now in Usenet/scene releases is that those quality settings are being pushed up, since unmetered monthly internet and H.265 allow for better quality.

    So with that in mind, the answer to your question is yes and no. I can give you an example: Fast X. I can see a UHD 4K HDR10 TrueHD release at 61 GB, and others all the way down to 2.5 GB!!

    So now you get to have a choice! :D (Oh, and you can also see the traditional H.264 1080p still sitting at around 10 GB, while the basic 4K version using H.265 is only 13 GB.)

  • meseek #2982@lemmy.ca · 1 year ago

    Same movie. 1080p. 2h. 6000 Bitrate. AAC 5.1 audio.

    • H264: 8 GB
    • H265: 5 GB
    • AV1: 3 GB

    • koper@feddit.nl · 1 year ago

      You can’t just compare the file sizes without looking at the quality. Each will have different quality loss depending on the exact encodings used.

    • tias@discuss.tchncs.de · 1 year ago

      That makes no sense. The bitrate is how many actual bits per second the data uses after compression, so at the same bitrate all codecs would be the same size.

      • eluvatar@programming.dev · 1 year ago

        The bitrate is the rate of the video, not the size of the file. Think of different codecs as different types of compression, like RAR vs ZIP vs 7z.

        • tias@discuss.tchncs.de · 1 year ago

          I’m not saying it is the size of the file; I’m saying the bitrate multiplied by the number of seconds determines the size in bits of the file. So for a given video duration and a given bitrate, the total size (modulo headers, container format overhead, etc.) is the same regardless of compression method. Some codecs can achieve better perceived quality for the same number of bits per second. See e.g. https://veed.netlify.app/learn/bitrate#TOC1 or https://toolstud.io/video/bitrate.php

          If it’s compressed to 6,000 kilobits per second, then ten seconds of video will be 60,000 kilobits, or 7.5 megabytes, regardless of whether it’s compressed with H.264, H.265, or AV1.
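
          A quick back-of-the-envelope sketch of that arithmetic (pure math, nothing codec-specific; the numbers are the ones from this comment):

            # Size in bits = bitrate (bits/s) * duration (s); the codec only
            # decides how much quality fits into those bits, not how many
            # bits there are.
            def file_size_mb(bitrate_kbps: float, duration_s: float) -> float:
                bits = bitrate_kbps * 1000 * duration_s
                return bits / 8 / 1e6  # bits -> bytes -> megabytes

            print(file_size_mb(6000, 10))        # 10 s at 6,000 kbps -> 7.5 MB
            print(file_size_mb(6000, 2 * 3600))  # 2 h at 6,000 kbps  -> 5400 MB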

        • meseek #2982@lemmy.ca · 1 year ago

          Yeah, my data is definitely an oversimplification. A raw bitrate number doesn’t mean the same thing across codecs because they compress differently. I tried to control for that as best I could, so that it wasn’t the bitrate saving file size but the efficiency of the codec.

          It’s like a fuzzy start line 🤷‍♂️

          • tias@discuss.tchncs.de · 1 year ago

            As I’ve said elsewhere, raw bitrate means exactly the same thing between them, because the bitrate is the number of bits per second of video after compression. What you mean is that you set a target bitrate and the different codecs have varying success in meeting that target. You can use two-pass encoding to improve the codec’s accuracy.

            But what matters is the average bitrate each codec requires to achieve the desired level of video quality, as perceived by you. The lower the bitrate you need for the quality you want, the better the codec.

      • DaGeek247@kbin.social · 1 year ago

        OP is describing the source video file bitrate, not the target codec bitrate. 6000 kbps compresses to different amounts depending on the codec and quality used. OP doesn’t mention the quality factor for the codecs, so this is less than helpful.

  • thisNotMyName@lemmy.world · 1 year ago

    Your old stuff is most likely in the x264 video codec, while today, especially at higher resolutions, x265/HEVC and in rare cases AV1 are the standard. It also depends on the specific release and how many streams (audio tracks, subtitles) are included.

    • Overspark@feddit.nl · 1 year ago

      Be warned though, some x265 stuff out there, particularly at 1080p and lower, is a reencode of an x264 source file. So a lower file size, but also slightly lower quality. Scene regulations say only higher resolutions should be x265.

      • thisNotMyName@lemmy.world · 1 year ago

        I still prefer it; HDDs aren’t free, and I personally really can’t tell the difference (my TV kinda sucks anyway).

  • lemonlemonlemon@lemm.ee (OP) · 1 year ago

    Thanks for all of the replies, they were very insightful!

    The switch from H.264 to H.265 appears to account for the majority of the differences I was seeing in file sizes.

  • stifle867@programming.dev · 1 year ago

    Also, it’s common for anime to be encoded in 10-bit color rather than 8-bit, which also allows files to be encoded more efficiently.
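
    As an aside, a minimal sketch of what a 10-bit HEVC encode might look like, assuming ffmpeg with libx265; the file names and CRF value are placeholders:

      # 10-bit HEVC encode: the higher bit depth reduces banding and can
      # compress a bit more efficiently. Assumes ffmpeg with libx265.
      import subprocess

      subprocess.run([
          "ffmpeg", "-i", "episode.mkv",
          "-c:v", "libx265",
          "-pix_fmt", "yuv420p10le",  # 10-bit 4:2:0 pixel format
          "-crf", "20",
          "-c:a", "copy",
          "episode_10bit.mkv",
      ], check=True)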

  • antlion@lemmy.dbzer0.com · 1 year ago

    It’s quite remarkable, really. A single-layer DVD stores 4.7 GB, for a movie at 576p (MPEG-2, i.e. H.262). A while later those videos could be compressed using DivX or Xvid (MPEG-4 ASP) down to 700 MB to fit on a standard CD, though full quality was more like 2 GB.

    The Blu-ray standard came along with 25 GB per layer and 1080p video, stored in H.262 (MPEG-2) or H.264.

    Discs encoded in MPEG-2 video typically limit content producers to around two hours of high-definition content on a single-layer (25 GB) BD-ROM. The more-advanced video formats (VC-1 and MPEG-4 AVC) typically achieve a video run time twice that of MPEG-2, with comparable quality. MPEG-2, however, does have the advantage that it is available without licensing costs, as all MPEG-2 patents have expired.

    H.265 is even more efficient than H.264, so now you could fit a full 1080p movie onto a 4.7 GB DVD. Ultra HD Blu-ray discs are only slightly larger (33 GB per layer), but they store 4K video by supporting the H.265 codec. I guess by now a 720p video encoded in H.265 could make a decent copy on a 700 MB CD.
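
    As a rough sanity check on those capacities, here’s a small sketch of the average bitrate a given disc size allows for a given runtime (pure arithmetic, ignoring audio and container overhead):

      # What average video bitrate fits a movie into a given disc capacity?
      # Capacity in GB (decimal), runtime in minutes.
      def avg_bitrate_mbps(capacity_gb: float, runtime_min: float) -> float:
          bits = capacity_gb * 1e9 * 8
          return bits / (runtime_min * 60) / 1e6

      print(avg_bitrate_mbps(4.7, 120))  # DVD, 2 h movie      -> ~5.2 Mbps
      print(avg_bitrate_mbps(0.7, 120))  # 700 MB CD, 2 h      -> ~0.78 Mbps
      print(avg_bitrate_mbps(25, 120))   # Blu-ray layer, 2 h  -> ~27.8 Mbps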

    • roofuskit@lemmy.world · 1 year ago

      You’re right except for that last part. The newer, smaller-file-size video codecs are really only effective on higher-resolution video. So a 720p movie encoded with H.265 to fit on a 700 MB CD isn’t going to look much better, if at all, than older codecs (maybe better than DivX). H.265 really shines at 4K and up, but does offer some benefit at 1080p.

      • antlion@lemmy.dbzer0.com · 1 year ago

        That is interesting. Of course there aren’t any HDMI CD video players, so it doesn’t much matter. But it would be interesting to see how a 4.7 GB DVD in H.262 compares to a 1080p copy of the same movie in H.265.

        I wonder if there’s a lot of room for encoders to improve the quality per byte without changing the format, like jpeg vs. mozjpeg.

        • WarmApplePieShrek@lemmy.dbzer0.com · 1 year ago

          There’s so much room that format specifications don’t tell you how to encode, only how to decode. Designing the best encoder is a huge research project.