Why won’t my (Windows XP) display settings allow 32bit colour at my native resolution?
August 12, 2005 6:02 AM
I am trying to run dual Dell UltraSharp 2405FPW 24" LCD monitors from an ATI Radeon 9600 Pro 128MB under Windows XP (Media Center). I have the latest (Media Center-specific) ATI drivers for my card, plus the monitor drivers that Windows Update downloaded automatically (although I understand these are just colour profile (.icm) files). The only thing I’m conscious of not having done is updating my video card’s firmware, as I don’t know where to obtain generic updates and the card’s OEM manufacturer does not seem to provide them.
The trouble is that the display settings (and PowerStrip, for that matter) will not let me run the monitors at their native 1920x1200 resolution; I can only raise one monitor’s resolution to a certain point before the second begins to shrink automatically. This is the kind of behaviour I’d expect if I lacked sufficient video card memory for such settings, but I’d have thought 128MB would be plenty.
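For what it’s worth, a quick back-of-the-envelope sum suggests memory shouldn’t be the bottleneck (a rough sketch only; real drivers reserve extra for 3D and overlays, but nowhere near 128MB for a plain desktop):

```python
# Rough framebuffer memory estimate for a dual-head 32-bit desktop.
# Assumes 4 bytes per pixel and double buffering; actual driver
# overhead will be somewhat higher.
width, height = 1920, 1200
bytes_per_pixel = 4   # 32-bit colour
monitors = 2
buffers = 2           # front + back buffer

total_bytes = width * height * bytes_per_pixel * monitors * buffers
print(f"{total_bytes / 2**20:.1f} MB")  # ~35.2 MB, comfortably under 128MB
```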
Secondly, even with one of the monitors disabled and using solely the primary unit through a DVI connection, I cannot run at 1920x1200 in 32-bit colour: it will only allow 32-bit up to 1360x768, and if I drag the slider any further, the colour depth snaps back down to 16-bit. I’ve heard talk on forums that the DVI standard is not supposed to be capable of that higher resolution, which could explain this issue, but it would seem strange if these monitors simply cannot be run in 32-bit colour at their native resolution.
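From what I’ve read, single-link DVI tops out at a 165 MHz pixel clock, which makes 1920x1200 at 60Hz borderline. A quick sanity check (the total-with-blanking figures are the standard CVT and reduced-blanking timings as I understand them, so treat them as assumptions):

```python
# Does 1920x1200 @ 60 Hz fit within single-link DVI's 165 MHz pixel clock?
SINGLE_LINK_MAX_MHZ = 165.0

def pixel_clock_mhz(h_total, v_total, refresh_hz=60):
    # Pixel clock implied by the total (active + blanking) dimensions.
    return h_total * v_total * refresh_hz / 1e6

print(pixel_clock_mhz(2592, 1245))  # standard CVT: ~193.6 MHz, too fast
print(pixel_clock_mhz(2080, 1235))  # CVT reduced blanking: ~154.1 MHz, fits
```

Note that DVI always carries 24 bits per pixel on the wire regardless of the desktop depth setting, so even if bandwidth were the limit, dropping to 16-bit colour wouldn’t actually help, which makes me suspect a driver quirk rather than the link itself.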
Any help would be greatly appreciated.
Cheers!
Hmm. I'm pretty sure that card can run max resolution on dual monitors.
Are you using HydraVision? That seems to be ATI's special program for handling dual-monitor setups.
Sorry I can't be more helpful... I know very little about Media Center, and I suspect that may be where the problem lies.
posted by selfnoise at 6:32 AM on August 12, 2005
Response by poster: OK – sorted.
It turns out that fairly deep in the ATI section of the display settings is a screen in which you can specify what the device on the end of each cable is capable of.
I’ve set both the DVI and D-SUB outputs to 1920x1200, and I can now set the displays to their native resolution and colour depth.
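In case it helps anyone who finds this later: the driver normally learns these limits from the monitor’s EDID block, so the override essentially tells it what the EDID should have said. A rough sketch of reading the native mode out of a raw 128-byte EDID dump (offsets per the standard EDID 1.3 layout; the file name is made up):

```python
# Read the preferred (native) mode from the first detailed timing
# descriptor of a 128-byte EDID 1.3 block. "monitor.edid" is hypothetical.
with open("monitor.edid", "rb") as f:
    edid = f.read(128)

dtd = edid[54:72]  # first detailed timing descriptor
clock_mhz = int.from_bytes(dtd[0:2], "little") / 100  # stored in 10 kHz units
h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
print(f"native mode: {h_active}x{v_active}, {clock_mhz} MHz pixel clock")
```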
This is the second time I’ve spent ages trying to sort something out, posted it on Ask MetaFilter, and then solved it myself 20 seconds later.
Thanks for your help anyway guys!
posted by ed\26h at 7:05 AM on August 12, 2005
Hey, whatever works. :) Sorry we couldn't be more helpful.
posted by selfnoise at 7:08 AM on August 12, 2005
I see this often on the help desk: by the time the user has finished explaining the problem, they've also come up with the answer. I think it's just the act of slowing your brain down and getting all your ducks in a logical order that does it.
posted by Mitheral at 8:04 AM on August 12, 2005
One quick comment: most LCDs are limited to 24-bit colour, i.e. about 16.7 million colours. Even if you can set Windows to 32-bit, you aren't necessarily going to see a difference, since 32-bit mode still carries only 24 bits of actual colour; the extra byte is alignment or alpha padding that the display never uses.
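To illustrate what the extra byte is doing, here's a minimal sketch of the common XRGB8888 layout (individual cards may order the channels differently):

```python
# Pack and unpack a 32-bit XRGB8888 pixel: 24 bits of colour plus one
# padding/alpha byte that the panel never actually displays.
def pack_xrgb(r, g, b, x=0):
    return (x << 24) | (r << 16) | (g << 8) | b

pixel = pack_xrgb(200, 100, 50)
r, g, b = (pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF, pixel & 0xFF
print(hex(pixel), (r, g, b))  # the top byte is padding, not extra colour
```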
posted by Vicarious at 2:35 PM on August 15, 2005
This thread is closed to new comments.