mosc wrote:is it so bad to run the TV off of the S-video output?
Another brilliant option is to buy another graphics card (PCI would work fine) that has a DVI port.
quizme2000 wrote:I have a similar setup.
Cables: Plug your primary LCD into the DVI-D port and leave it alone. Then run an HDMI cable from the graphics card to a 1x2 powered HDMI splitter (Monoprice, approx. $70). Then connect an HDMI-to-DVI-D cable to the second LCD and an HDMI cable to the HDTV. Your HDTV will replicate the LCD plugged into the splitter.
Config: Open the NVIDIA control panel and go to "Select Task / Display / Setup Multiple Displays". Select "Dualview" and select the primary LCD. Then go to the Windows display control panel to set the resolutions for the primary LCD and the LCD/HDTV.
quizme2000 wrote:Is there a reason that this setup would work whereas mine doesn't? I'm just asking because I've already sunk a fair amount of money in. I think the difference between your setup and mine is that I'm splitting the DVI instead of HDMI, and the splitter isn't powered. Thanks!
It could be that your HDTV can't display 1680 x 1050 @ 60 Hz, even though that may fall within the 1080p/i spec but not 720p. Remember, the Windows Vista boot screen is 800x600 or 1024x768 (in high-res mode). When Windows hands off the display to the NVIDIA driver, the driver then sets your selected resolution, which may not be supported by the HDTV set.
quizme2000 wrote:BTW the video cards we (consumers) use won't be able to detect multiple displays connected to one output, only one display per port.
That's what I've been thinking, too, since I can get 1824x1024 at 60 Hz on the TV if it's the only display hooked up to the splitter. I'm thinking that if DVI-D uses two-way communication (which it must, if it can detect what monitor/TV is plugged in), then it must be getting confused when there are two displays sending back information at the same time. That's why I was hoping it was possible for it not to actually try to detect, but let me tell it what to display. I'd be happy with 1680x1050 @ 60 Hz on all screens. It's not quite 1080p, but c'est la vie.
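For what it's worth, that two-way detection is EDID carried over the DDC lines: each display hands the card a 128-byte block describing its timings, and an unpowered splitter can end up forwarding only one (or a mangled mix) of them. Here's a rough sketch of how the preferred mode gets pulled out of that block, just a toy parser for illustration, not anything from the NVIDIA driver:

```python
def parse_preferred_mode(edid: bytes):
    """Read the preferred resolution from the first Detailed Timing
    Descriptor of a standard 128-byte EDID block (bytes 54-71)."""
    dtd = edid[54:72]
    # Pixel clock is stored little-endian in units of 10 kHz.
    pixel_clock_khz = int.from_bytes(dtd[0:2], "little") * 10
    # Active pixel counts are split: low 8 bits in one byte,
    # high 4 bits in the upper nibble of a shared byte.
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return h_active, v_active, pixel_clock_khz

# Synthetic EDID advertising 1680x1050 @ 146.25 MHz (the mode in question).
edid = bytearray(128)
edid[54:56] = (14625).to_bytes(2, "little")   # 146250 kHz / 10
edid[56], edid[58] = 0x90, 0x60               # 0x690 = 1680 horizontal
edid[59], edid[61] = 0x1A, 0x40               # 0x41A = 1050 vertical
print(parse_preferred_mode(bytes(edid)))      # (1680, 1050, 146250)
```

So when two sets of these bytes collide behind one port, the card has no clean way to pick a mode, which fits the behavior you're seeing.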
mosc wrote:Another brilliant option is to buy another graphics card (PCI would work fine) that has a DVI port.
I tried to use two video cards with XP once. It was a massive pain and I quickly gave up.
poxic wrote:You suck. And simultaneously rock. I think you've invented a new state of being.
mosc wrote:Is this where I bring up the fact that DVD is only 480p, and that the vast majority of Blu-ray movies released today are simple upscans? Older movies were never mastered at that resolution. S-Video works fine for DVD quality, which is more than you'll get from anything pre-2006 (and many after).
Also, we haven't mentioned the native resolution of the TV. Most HDTVs on the market (especially if it's not a brand-new TV) are natively 720 lines tall, not 1080.
Anyway, I still say the best way to handle this is an additional graphics head (another video card).
Shai wrote:...I don't think I have any more PCI slots left, there's one PCI-Express but I don't want to use that up with another graphic card...