[Tech Talk] Understanding Video Bitrates

A friend of mine who goes by NawkCire online (or Shooter-90) asked me yesterday on Twitter for a quick explanation of video bitrates and to what degree bitrate actually matters. It’s a good question if you want to put out higher-quality videos on YouTube without them looking lossy or over-compressed, and while YouTube does publish bitrate guides that can point you in the right direction, those are constantly changing and it’s hard to figure out what a good baseline might be.

For starters, it’s important to understand that bitrate determines how much data is used per second of video, not per frame. So, a 4000 kbps video carries 4000 kilobits of data for every second of footage.

Due to the relation of bits to bytes, you’ll want to divide that number by 8 to get a value in KB/s, though. Yeah, I know, that’s confusing. What this means is that the higher the bitrate, the higher the quality of the video. But there are practical limits beyond which extra bitrate stops being worth worrying about.
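
To make that concrete, here’s a quick back-of-the-envelope sketch in Python (the 4000 kbps figure is the example from above; I’m using decimal kilobits, which is how bitrates are normally quoted):

```python
# Convert a bitrate in kbps to KB/s, and estimate the size of a clip.
def kbps_to_kb_per_sec(bitrate_kbps: float) -> float:
    return bitrate_kbps / 8  # 8 bits per byte

def clip_size_mb(bitrate_kbps: float, seconds: float) -> float:
    # kilobits/s -> kilobytes/s -> total kilobytes -> megabytes
    return bitrate_kbps / 8 * seconds / 1000

print(kbps_to_kb_per_sec(4000))   # 500.0 KB/s
print(clip_size_mb(4000, 600))    # 300.0 MB for a 10-minute video
```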

YouTube recommends that a standard 720p video not exceed a bitrate of 5000 kbps. If you upload a 1280×720 video well above that number, it will most likely get re-encoded and compressed back into that general ballpark. For 720p at 60 frames per second, though, they recommend around 7500 kbps.

The problem with this recommendation is that at 60 frames per second you have twice as many frames in each second of video, so if you only increase the bitrate by 50%, the output, while smoother, will be lossier per frame than if you had used that data rate on a 30 FPS video. Because of this, as a general rule I recommend that when you double the frame rate of your videos, you also double the bitrate to match.
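
To see why, here’s a quick per-frame calculation in Python using YouTube’s own 720p numbers from above (5000 kbps at 30 FPS vs. 7500 kbps at 60 FPS):

```python
# Doubling the frame rate without doubling the bitrate shrinks
# the data available to each individual frame.
def bits_per_frame(bitrate_kbps: float, fps: int) -> float:
    return bitrate_kbps * 1000 / fps

print(bits_per_frame(5000, 30))    # ~166,667 bits per frame (720p30)
print(bits_per_frame(7500, 60))    # 125,000 bits per frame (720p60, lossier)
print(bits_per_frame(10000, 60))   # ~166,667 bits per frame (doubled bitrate)
```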

In the past, I tested the bitrates that YouTube would compress my videos to, and the results seem like a decent starting place for minimum bitrates to use (higher is, of course, just fine; there’s also a small code sketch of this table after the list):

  • 1280×720 @ 30 FPS (720p30): 2,000 kbps minimum
  • 1280×720 @ 60 FPS (720p60): 4,000 kbps minimum
  • 1920×1080 @ 30 FPS (1080p30): 4,000 kbps minimum
  • 1920×1080 @ 60 FPS (1080p60): 8,000 kbps minimum
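
If you’d rather have those minimums in code form, here’s a tiny Python lookup of the table above (the values are just my rule-of-thumb numbers, not official YouTube figures):

```python
# Rule-of-thumb minimum bitrates from the list above (kbps).
MIN_BITRATE_KBPS = {
    ("720p", 30): 2000,
    ("720p", 60): 4000,
    ("1080p", 30): 4000,
    ("1080p", 60): 8000,
}

def min_bitrate(resolution: str, fps: int) -> int:
    return MIN_BITRATE_KBPS[(resolution, fps)]

print(min_bitrate("1080p", 60))  # 8000
```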

Using those bitrates as a general guideline, you should be able to produce nice, smooth videos with crisp visuals. They may not be 100% free of artifacts, but they should still be very high quality. Of course, if your videos look too lossy at those settings, feel free to raise them a bit.

Keep in mind that the codec you use has a significant impact on both the overall compression ratio and the visual quality of the output file. Strictly speaking, MP4 is a container format rather than a codec, but the older MPEG-4 Part 2 codec that many encoders simply label “MP4” is meant mainly for lightweight web compression, and at the same bitrate it will look quite a bit worse than H.264.
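
If you want to see that difference for yourself, here’s a sketch of how you might compare the two with ffmpeg driven from Python. This assumes ffmpeg is installed and that `input.mp4` is a placeholder for your source clip; both encodes use the same bitrate, so the only variable is the codec:

```python
import subprocess

# Encode the same clip twice at the same bitrate; only the codec differs.
# "mpeg4" is ffmpeg's MPEG-4 Part 2 encoder, "libx264" is its H.264 encoder.
for codec, outfile in [("mpeg4", "old_codec.mp4"), ("libx264", "h264.mp4")]:
    subprocess.run([
        "ffmpeg", "-y", "-i", "input.mp4",
        "-c:v", codec,      # video codec under test
        "-b:v", "4000k",    # same target bitrate for both encodes
        "-an",              # drop audio so only video quality varies
        outfile,
    ], check=True)
```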

Here’s a wildcard factor to consider, though: most video editing and encoding software will automatically select a good bitrate for you during the encoding process. This works much like the Variable Bitrate (VBR) settings in various recording tools, because it allows the software to decide how much data is necessary (and how much is excessive) for each frame of video it encodes.

In one of the programs I use, which allows H.264 to be selected as your encoding codec (a great standard for now), I usually select a quality factor of 10.0, which is the lowest that program will let me go. In x264-based encoders this setting is commonly called the Constant Rate Factor, or CRF (yep, these people love their acronyms!). Some apps let you go all the way down to 0, but that means huge, lossless files!

This generally produces a high-quality file that is still compressed compared to the source, resulting in a medium-size file with good quality (around 1/5 or less the size of the source).
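
For reference, here’s a rough sketch of that kind of quality-based encode using ffmpeg from Python (the file names are placeholders, and this assumes ffmpeg with libx264 is installed; ffmpeg’s CRF scale runs from 0, lossless, to 51):

```python
import subprocess

# Quality-based H.264 encode: the encoder varies the bitrate per frame,
# you just pick the quality level.
subprocess.run([
    "ffmpeg", "-y", "-i", "input.mp4",
    "-c:v", "libx264",
    "-crf", "10",         # very high quality; 0 = lossless, 51 = worst
    "-preset", "slow",    # slower presets compress better at the same quality
    "output.mp4",
], check=True)
```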

For instance, a semi-lossless source file for a 70-minute video comes to approximately 32GB of data. After editing and re-encoding, that same video ends up at around 6GB, which is much more manageable for uploading to YouTube.

Keep in mind, though, that’s for a video YouTube will treat as 1440p60, so it’s much larger than what you’d get at 720p or 1080p (whether 30 or 60 FPS). Don’t be put off if that still sounds like a big file.

So, here’s the short version: yes, bitrate matters, but the returns diminish after a certain point. A 720p video at 8000 kbps will probably look a fair bit better than a 1080p video at just 4000 kbps, but higher is only better up to a point.

You can encode to fit YouTube’s recommendations, or you can go bigger or smaller depending on your needs (be that higher quality or better compression to save on network usage).

One final tip: YouTube’s playback bitrate (i.e. quality) is determined by the resolution it encodes the video to. So, if you have a very high-quality 1080p video that YouTube appears to over-compress, one “trick” is to upscale that video to a higher resolution, such as 1440p (about 78% more pixels than 1080p).

While this barely improves the quality itself when played locally (unless you’re watching on a 1440p display, in which case it may look a tad crisper), it forces YouTube to encode the upload at 1440p, which means it can play back at a higher bitrate.

So, you’d basically have your 1080p video played back under YouTube’s 1440p specifications, and the quality would be much higher, and therefore closer to what the 1080p source looked like on your own computer.
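
As a sketch, the upscale itself could look something like this with ffmpeg (again driven from Python; the file names are placeholders, and Lanczos is just one common choice of scaling filter):

```python
import subprocess

# Upscale a 1080p source to 1440p before uploading, so YouTube encodes
# it at its higher 1440p playback bitrate.
subprocess.run([
    "ffmpeg", "-y", "-i", "source_1080p.mp4",
    "-vf", "scale=2560:1440:flags=lanczos",  # upscale to 1440p
    "-c:v", "libx264",
    "-crf", "18",    # keep the upload high quality; YouTube re-encodes it anyway
    "upload_1440p.mp4",
], check=True)
```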
