9/28/2023

366x768 runescape image

720p is just under 1 megapixel of data per screen. That makes for cost-effective configurations in which the input/output electronics are built from already-available OEM devices, so the manufacturer is mainly in the business of flat-panel glass making and bezel/speaker integration on a large display. It is also a memory size of standard importance to chip makers: easily available chipsets for VRAM (video memory) and display driver processing had a boundary at roughly 1 megapixel.

I had the same question back in 2007, because my computer didn't support my TV's default resolution of 1366x768, and this is what I found.

At the time the first wide computer screens became popular, the usual resolution on 4:3 panels was 1024x768 (the XGA display standard). The standard aspect ratio for wide displays was 16:9, which cannot be hit exactly with a height of 768 lines, so the nearest value was chosen: 1366x768. Extending the width while keeping the same height was also technically simpler, because only the horizontal refresh rate timing had to be tweaked. So, for simplicity and backward compatibility, the XGA resolution was kept as the basis for the WXGA resolution, and XGA graphics could be displayed easily on WXGA screens.

WXGA can also refer to a 1360x768 resolution (and some other, less common ones), which was introduced to reduce costs in integrated circuits. Why 1360? Because it is divisible by 8 (and even by 16), which is much simpler to handle when processing graphics and allows for optimized algorithms; that is why a value slightly lower than 1366 was taken. There is also a memory argument: a 1366x768 frame at 8 bits per pixel takes just over 1 MiB to store (1024.5 KiB), so it would not fit in an 8 Mbit memory chip; you would have to use a 16 Mbit chip just to store a few extra pixels, whereas 1360x768 (1020 KiB) fits.
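The arithmetic behind those two arguments can be checked with a short sketch (my own illustration; the 8 bits per pixel and 8 Mbit chip figures come from the text above):

```python
# Illustrative check of the framebuffer arithmetic: at 8 bits per pixel,
# does one frame fit in a single 8 Mbit (1 MiB) memory chip, and is the
# width divisible by 8 for easy graphics processing?

MIB = 1024 * 1024  # 1 MiB = 8 Mbit, the chip size mentioned in the text

for width, height in [(1366, 768), (1360, 768)]:
    size = width * height  # bytes, since 8 bpp = 1 byte per pixel
    print(f"{width}x{height}: {size / 1024:.1f} KiB, "
          f"fits in 8 Mbit: {size <= MIB}, "
          f"width % 8 == 0: {width % 8 == 0}")

# The 16:9 motivation for 1366: the exact 16:9 width for 768 lines
print(f"768 * 16/9 = {768 * 16 / 9:.2f}")  # not an integer, so 1366 was used
```

Running this shows 1366x768 needing 1024.5 KiB (just over the chip) with a width not divisible by 8, while 1360x768 needs 1020.0 KiB and divides cleanly, matching the cost argument above.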