For all those who haven't heard of it already, here's a quick rundown about the newest trend in making our encodes unplayable on even more systems: so-called high-bit-depth H.264. So, why another format, and what makes this stuff different from what you know already?

First off: What is bit depth? In short, bit depth is the level of precision that's available for storing color information. The encodes you're used to have a precision of 8 bits (256 levels) per color channel. There are usually three color channels, so that makes a bit depth of 24 bits per pixel, which is also the most commonly used bit depth on modern desktop PCs. Now, you can use a higher bit depth for video encoding, and x264 currently allows up to 10 bits per channel (1024 levels and 30 bits per pixel), which of course allows for much higher precision.

But: Most graphics cards and display devices don't allow more than 24 bits per pixel. Here's a bit of side info: most LCD displays (TN panels, to be precise) can only represent a bit depth of 6 bits per channel (a mere 64 levels). This would look pretty awful under normal circumstances, so these displays use a little trick called "dithering" to simulate a bit depth of 8 bits per channel. This makes higher bit depth sound pretty pointless, so why are we doing this?
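To make the numbers above concrete, here is a minimal sketch (not from the original post) of what bit depth and dithering mean in practice: it computes the levels per channel for 6, 8 and 10 bits, then truncates an 8-bit ramp down to 6 bits with and without a simple ordered dither. The `BAYER_2X2` matrix and the function names are illustrative assumptions, not anything a real display or x264 exposes.

```python
# Illustrative sketch only: bit depth = number of levels per channel,
# and a toy ordered dither like the one a 6-bit TN panel uses to fake 8 bits.

# 2x2 Bayer threshold matrix; its values 0..3 cover the two bits we drop.
BAYER_2X2 = [[0, 2],
             [3, 1]]

def levels(bits: int) -> int:
    """Distinct values per color channel at a given bit depth."""
    return 2 ** bits                      # 6 -> 64, 8 -> 256, 10 -> 1024

def quantize(value_8bit: int) -> int:
    """Plain truncation from 8 bits to 6 bits (drops the low two bits)."""
    return value_8bit >> 2

def quantize_dithered(value_8bit: int, x: int, y: int) -> int:
    """Add a position-dependent threshold before truncating, so neighboring
    pixels round differently and the average brightness stays correct."""
    threshold = BAYER_2X2[y % 2][x % 2]   # 0..3, same range as the lost bits
    return min((value_8bit + threshold) >> 2, 63)

if __name__ == "__main__":
    print(levels(6), levels(8), levels(10))   # 64 256 1024
    print(3 * 8, 3 * 10)                      # 24 and 30 bits per pixel
    ramp = list(range(100, 108))
    # Plain truncation collapses a smooth 8-bit ramp into flat bands...
    print([quantize(v) for v in ramp])
    # ...while dithering varies the rounding per pixel and hides the banding.
    print([quantize_dithered(v, x, 0) for x, v in enumerate(ramp)])
```

Run as-is, the last two lines show the same ramp turning into `[25, 25, 25, 25, 26, 26, 26, 26]` without dithering and a more gradual `[25, 25, 25, 26, 26, 26, 26, 27]` with it, which is the whole trick those 6-bit panels rely on.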