I’ve just upgraded my Dell PowerEdge T105 from Debian/Lenny to Debian/Squeeze. Unfortunately the result of the upgrade was that everything in an X display looked very green, while the console display looked the way it usually did.

I asked for advice on the LUV mailing list and got a lot of good responses; Daniel Pittman in particular offered a lot of great advice. The first suggestion was to check the gamma levels: the program xgamma displays the relative levels of Red, Green, and Blue (the primary colors for monitors), where it is usually expected that all of them will have the value 1.0. This turned out not to be the problem, but it’s worth noting for future instances of such problems. It’s also worth noting the potential use of this to correct faults in display hardware; I’ve had two Thinkpads turn red towards the end of their lives due to display hardware problems, and I now realise I could have worked around that with xgamma.

He also suggested that it might be ICC; the command “xprop -root | grep -i icc” might display something if that were the case. I’m still not sure what ICC is about, but I know it’s not set on my system. The next suggestion was to use the VESA display driver to try to discover whether it was a bug in the ATI driver. It turned out that the VESA driver solved the problem, and I was tempted to continue using it until I realised that the VESA driver has a maximum resolution of 1280*1024, which isn’t suitable for a 1680*1050 display.

After reviewing my Xorg configuration file, Daniel noted that my frame buffer depth of 16 bits per pixel is regarded as unusual by today’s standards and probably isn’t well tested. As 24bpp is generally implemented with 32 bits for each pixel, it takes twice the frame-buffer storage (both in the X server and in some applications) as well as twice the memory bandwidth to send data around. So I generally use 16bpp on my systems to make them run a little faster.

I tried using a depth of 24bpp and then saw messages such as the following in /var/log/Xorg.0.log:

(II) RADEON(0): Not using mode “1680×1050” (mode requires too much memory bandwidth)

It seems that the display hardware in my ATI ES1000 (the on-motherboard video card in the Dell server) doesn’t have the memory bandwidth to support 1680*1050*24bpp. I tried using gtf to generate new mode lines, but it seems there is no 24bpp mode with a vertical refresh rate low enough to avoid exhausting memory bandwidth yet still high enough for the monitor to get a signal lock. My current solution is to use 15bpp mode, which gives almost the same quality as 16bpp and uses the same small amount of memory bandwidth. It seems that 15bpp doesn’t trigger the display driver bug either.

One down-side to this is that the default KDE4 desktop background in Debian seems perfectly optimised to make 15bpp modes look ugly: it has a range of shades of blue that look chunky. What I really want to do is get a better video card.
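For reference, the 15bpp workaround goes in the Xorg configuration file. A minimal sketch of the relevant Screen section might look like the following; the Identifier name is a placeholder and the rest of the file (Device, Monitor sections, etc.) is omitted:

```
Section "Screen"
    Identifier   "Default Screen"
    DefaultDepth 15
    SubSection "Display"
        Depth    15
        Modes    "1680x1050"
    EndSubSection
EndSection
```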
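The gamma check described above can be illustrated with a short sketch. xgamma reads and sets the per-channel gamma of the X display, and the effect of a gamma value on a channel is roughly the power curve below; this is a simplified model for illustration, not the exact lookup ramp the X server builds:

```python
def apply_gamma(value, gamma):
    """Map a normalised (0.0-1.0) channel value through a gamma curve.

    A simplified model of what adjusting one channel's gamma with
    xgamma does: gamma 1.0 leaves the channel unchanged, while
    gamma > 1.0 brightens that channel's mid-tones.
    """
    return value ** (1.0 / gamma)

# At the expected gamma of 1.0 a mid-grey value passes through unchanged:
print(apply_gamma(0.5, 1.0))          # 0.5
# Raising the red and blue gamma would brighten those channels, which is
# one way a green cast could be compensated for:
print(apply_gamma(0.5, 1.5) > 0.5)    # True
```

In practice the equivalent adjustment would be something like “xgamma -rgamma 1.5 -bgamma 1.5”, leaving green at 1.0.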
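The memory-bandwidth limit described above can be estimated with simple arithmetic. This rough sketch only counts scan-out of the frame buffer and ignores blanking intervals, drawing traffic, and other overheads; the 60Hz refresh rate is an assumption for illustration:

```python
def scanout_bandwidth(width, height, bpp, refresh_hz):
    """Approximate bytes per second needed just to scan out the frame
    buffer.  24bpp is generally stored as 32 bits per pixel, and 15bpp
    as 16, which is why 15bpp and 16bpp cost the same bandwidth."""
    stored_bits = {15: 16, 24: 32}.get(bpp, bpp)
    return width * height * (stored_bits // 8) * refresh_hz

MB = 1000 * 1000
print(scanout_bandwidth(1680, 1050, 24, 60) / MB)  # 423.36 MB/s at 24bpp
print(scanout_bandwidth(1680, 1050, 16, 60) / MB)  # 211.68 MB/s at 16bpp
print(scanout_bandwidth(1680, 1050, 15, 60) / MB)  # 211.68 MB/s, same as 16bpp
```

The factor of two between 24bpp (stored as 32 bits) and 16bpp matches the doubling of frame-buffer storage and bandwidth mentioned above.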