DaGeek247,
@DaGeek247@kbin.social

I consider the 'good enough' level to be the point where, if I didn't pixel peep, I couldn't tell the difference. The 'visually lossless' levels were the first CRF levels where I couldn't tell a quality difference even when pixel peeping with imgsli. I also included VMAF results, which show that the quality loss at those levels is effectively the same at a pixel level.
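If anyone wants to sanity-check VMAF numbers on their own clips, something like this is all it takes. It's a minimal sketch, assuming an ffmpeg build with libvmaf; the filenames are placeholders, not my actual test files:

```python
import subprocess

def vmaf_score(distorted: str, reference: str) -> None:
    """Have ffmpeg's libvmaf filter compare an encode against its source.

    The pooled VMAF score is printed at the end of ffmpeg's log output.
    Assumes an ffmpeg build with libvmaf; the filenames are placeholders.
    """
    subprocess.run(
        [
            "ffmpeg", "-hide_banner",
            "-i", distorted,       # first input: the encode being scored
            "-i", reference,       # second input: the untouched source
            "-lavfi", "libvmaf",   # frame-by-frame comparison filter
            "-f", "null", "-",     # discard the output video, keep the log
        ],
        check=True,
    )

vmaf_score("x265_crf18.mkv", "source.mkv")
```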

I know that AV1, x264, and x265 all compress video differently; the whole point of this was to get a better idea of what those differences actually look like. Everything in the 'visually lossless' section is completely indistinguishable to my eyes, and everything in the 'good enough' section has only very minor compression artifacts that I notice only when I'm looking for them in a still image. Comparing and contrasting that way doesn't require sticking to a single codec.
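For anyone curious what a run like that looks like in practice, here's a rough sketch of per-codec CRF encodes. It assumes a reasonably recent ffmpeg with libx264, libx265, and libsvtav1 (where libsvtav1 accepts -crf), and the CRF values are placeholders rather than the ones from my comparison:

```python
import subprocess

# Placeholder CRF values, one per encoder; the real comparison swept many levels.
ENCODERS = {
    "libx264":   "20",
    "libx265":   "22",
    "libsvtav1": "30",
}

def encode_all(src: str) -> None:
    """Encode the same source once per codec so the stills can be compared later."""
    for codec, crf in ENCODERS.items():
        out = f"{codec}_crf{crf}.mkv"
        subprocess.run(
            [
                "ffmpeg", "-hide_banner", "-i", src,
                "-c:v", codec, "-crf", crf,  # constant-quality encode
                "-c:a", "copy",              # leave the audio track alone
                out,
            ],
            check=True,
        )

encode_all("source.mkv")
```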

Frankly, for anything other than real-time encoding, I don't consider encoding time to be a huge deal. None of my encodes ran slower than 3 fps on my 5800X3D, which is plenty for an overnight job on my media server. For real-time encoding I would just grab an Intel Arc card and redo the whole comparison, since the bitrates will be different anyway.
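The overnight job itself is nothing fancy, just a loop over a folder. A rough sketch, assuming a watch directory and libx265 at a fixed CRF (all of these paths and settings are placeholders, not my actual setup):

```python
import subprocess
from pathlib import Path

# Hypothetical paths and settings; swap in whatever your media server actually uses.
WATCH_DIR = Path("/srv/media/to_encode")
DONE_DIR = Path("/srv/media/encoded")

def overnight_pass() -> None:
    """Re-encode everything in the watch folder, one file at a time.

    At ~3 fps a typical movie takes most of a night, which is fine for a cron job.
    """
    DONE_DIR.mkdir(parents=True, exist_ok=True)
    for src in sorted(WATCH_DIR.glob("*.mkv")):
        dst = DONE_DIR / src.name
        if dst.exists():
            continue  # already handled on a previous night
        subprocess.run(
            [
                "ffmpeg", "-hide_banner", "-i", str(src),
                "-c:v", "libx265", "-crf", "22", "-preset", "slow",
                "-c:a", "copy",
                str(dst),
            ],
            check=True,
        )

if __name__ == "__main__":
    overnight_pass()
```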
