Ok, here is my current setup:
I have an NVIDIA GeForce 8600 GT with DVI-D, HDMI, and S-Video ports. I have two Acer X223W LCD monitors, each with a DVI-D port and a VGA port and a native resolution of 1680 x 1050. I also have an Insignia HDTV with an available HDMI port.
I want the two Acers to show independent displays, and the TV to mirror the output of one of them. Currently, I have an HDMI to DVI-D cable going from the graphics card's HDMI port to my primary Acer display. I have a DVI splitter on the card's DVI-D port; from the splitter, a DVI-D to HDMI cable goes to the TV, and a DVI-D cable goes to the second monitor.
This works nicely until Windows Vista Home Premium finishes loading; I think the NVIDIA driver kicks in at that point. If both the Acer and the TV are plugged into the splitter, I can't use either display. If I unplug them both and plug in the Acer, set it up as the second display, then plug in the TV, I can get both to work, but only at 1024 x 768. If I plug in the TV first, set it up, and then plug in the Acer, I can get a higher resolution on both screens, as long as I don't touch the settings or open the NVIDIA Control Panel. If both displays are running and I turn off the Acer, the TV does weird stuff (it turns its display on and off seemingly at random), but as soon as I turn the Acer back on, the TV stays on fine.
I'd love to have the maximum resolution on both displays, without having to crawl under my desk to plug things in and out every time I want to do something different. I think the problem stems from the graphics card trying to figure out what the displays are, and getting confused when two displays report their identification to the same port at the same time. Is there a way to tell it there is a splitter there? Or a way to stop it from auto-detecting the monitors? Thanks in advance!