Well... this was unexpected for me. I'm hoping one of you knows where I went wrong.
I built my computer about 5 years ago and I don't have any true HDMI outs. I have 2 DVI to HDMI adapters, but I have only ever used the one for my monitor. Tonight I connected my TV to the other output on my computer and immediately lost the signal to my monitor.
The TV recognized that a new HDMI input was connected, but it wasn't showing anything on the screen, and since my monitor was dark as well, I couldn't do anything.
I disconnected the TV, but the monitor stayed dark. I unplugged the monitor and plugged it back in, but it was still dark.
I ended up having to do a hard shut down of my computer. When I turned it back on with only the monitor plugged in, it worked fine. Now I'm afraid to try connecting the TV again.
The TV is in another room and I ran the HDMI cable through the wall. I assumed I could use the Display Settings on my monitor to toggle between the monitor and the TV, but losing the signal to my monitor made that impossible.
Wish you'd listed what graphics card you're using.
I can think of three possibilities.
One, your graphics card (or motherboard, if those ports are built into the mobo) doesn't support dual monitors. This is not very likely.
The second and more likely possibility: the graphics card doesn't have enough power to drive the combined resolution of both displays. When you add a monitor on top of what you're already using, the total pixel count the card has to drive goes up. A weak card can bug out when the second display pushes the total resolution past what it can handle. If possible, try lowering the resolution of both displays so the total pixel count is about the same as your current single-monitor setup. It could work, but things would look bad on both screens.
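As a rough illustration of the pixel math behind that point (the resolutions below are just examples):

```python
# Rough illustration of the "total resolution" point above.
# These resolutions are examples -- substitute your own displays.

def total_pixels(displays):
    """Sum the pixel count the card must drive across all displays."""
    return sum(width * height for width, height in displays)

one_monitor = total_pixels([(1920, 1080)])
monitor_plus_tv = total_pixels([(1920, 1080), (1920, 1080)])

print(one_monitor)      # 2073600
print(monitor_plus_tv)  # 4147200 -- twice the pixels to push
```

Adding a second 1080p display literally doubles the pixel load, which is why a weak card can cope with one screen but choke on two.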
Another possibility: your TV is old and doesn't work well with computer graphics. HDMI on TVs used to be only for AV devices. It's only in the past couple of years that connecting a computer to an HDTV became popular, so older TVs' HDMI ports weren't designed with computer graphics in mind.
My graphics card is listed in my specs under my post. Nvidia GTX285. I believe it's 1GB.
The card is 5 years old. My monitor is a 23" at 1920x1080.
The TV is a Sony Bravia 32" 1080i XBR6.
Attached is a picture of the inputs on the back of the TV. I ran into the described problems when I plugged in HDMI 4. After looking closer at the panel, I'm wondering if it would make any difference if I plugged into "PC in".
Ok, so this forum does not make it obvious how to attach a pic. The icon for "insert/edit image" wants a URL. What??? It's a picture on my desktop that I took with my iPhone, not something I found online.
That is very strange. That card is strong enough to handle two 1920x1080 displays.
And the Nvidia GeForce 200 series supports dual monitors.
I wonder if the adapter, the cable, or the Nvidia graphics port is defective. Have you tested this yet? I mean, what happens if you swap the cables between the monitor and TV? What happens if you swap only the adapter? And what happens if you swap which graphics port on the card you use for the TV and which for the monitor?
The PC input port is probably D-Sub/VGA. If it works, that means your card can support dual monitors. But that port may or may not give you 1920x1080 resolution; usually a TV's D-Sub/VGA port labeled "PC in" supports a lower resolution.
But if your "PC-in" port is an HDMI port, that's the one you should use. Maybe Sony decided to dedicate only one specific HDMI port for hooking up a PC. The difference between connecting an AV device and a PC to an HDMI port is that a PC needs plug-and-play feedback from the display device, whereas AV devices don't need that return signal. If Sony labeled one HDMI port "PC-in", that's the port designed for this use, which in turn means the rest of the HDMI ports are NOT designed for it.
In order to post a picture here, you first have to upload the picture to an image hosting site, such as tinypic.com, imageshack.com, or photobucket.com, etc., get a picture link, and then click on the picture icon above your text box here and paste that URL in. Make sure to resize your picture's width to 700 pixels or lower or it won't fit in the thread.
Never plug in or unplug a monitor with the PC running. It is not USB; it is not hot-swappable. Turn the PC OFF, plug in the TV, then boot the PC. You will then have to configure the second monitor's resolution, refresh rate, and behavior in the Nvidia Control Panel. You may also have to configure your TV's input to accept that resolution (refer to your TV's documentation).
PC in would be a better choice if you don't need sound, but I'm pretty sure the GTX 2xx series has no sound pass-through anyway.
Also remember TVs tend to use "upscaling": the panel displays at 1920x1080, but you have to send it 1024x768 (or something similar) as its input.
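One side effect worth noting if the TV upscales a lower input resolution: the example input (1024x768) and the panel (1920x1080) don't share an aspect ratio, so the picture gets stretched or letterboxed. A quick check:

```python
from fractions import Fraction

# Reduce each resolution to lowest terms to compare aspect ratios.
# 1024x768 is the example input mentioned above; 1920x1080 is the panel.

def aspect(width, height):
    """Reduce a resolution to its simplest width:height ratio."""
    return Fraction(width, height)

print(aspect(1024, 768))   # 4/3  -- classic PC ratio
print(aspect(1920, 1080))  # 16/9 -- HDTV panel
```

If the TV offers a 16:9 input mode like 1280x720, that will map onto the panel without distortion.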
Well, slowly but surely I'm getting this to work.
I connected the TV with an HDMI cable (I don't have the adapter for the "PC in" yet) BEFORE I turned on the computer, and I now have my desktop showing on both my monitor and my TV.
I added the monitor in the Nvidia control panel, configured 1 and 2, and clicked "extend my desktop to this monitor" in the display settings on my desktop.
Now how do I get video (specifically Netflix or any other website) to play on the TV? It only shows the desktop. Not anything else being displayed on my monitor.
I think on the GTX 2xx cards you had to either run the video or game in windowed mode (make it the size of your screen and drag it to the other monitor) or buy or find a program that will do what you want. It's a few generations old.
Hmm... perhaps not. I have an older card, a 9500GT, and it has no problem with my TV; hot-plugging the TV is not a problem at all.
Though you have a point. The older-generation card may not pair as universally with the wide array of TV HDMI ports. Just as likely, his TV's HDMI port alone is the problem, judging from the fact that he has to plug in the TV before booting. That says something about the lack of PC-monitor-style plug-and-play feedback from the TV side. I get the sense that his TV's HDMI ports are not designed to work well with computer graphics, or at least not designed to behave just like a PC monitor.
He hasn't answered whether his PC in port is HDMI or VGA/D-Sub. He said something about not having an adapter. Does that mean the PC in port is VGA/D-Sub?
Yes, the PC-in is VGA/D-Sub. I currently have it connected to the HDMI-2 input on the TV, but only my desktop shows up. Interestingly, it is actually only my wallpaper, now that I think about it. None of the icons/shortcuts that I have on my monitor desktop show up on the TV. Just the wallpaper.
I used to hear about this issue a lot around 4 years ago, when hooking up computer graphics to a TV's HDMI suddenly became popular. Up until then, VGA and DVI had lived in the PC world and HDMI in the TV/AV world. Since graphics cards were increasingly coming with HDMI ports, people began discovering the idea of hooking the computer up to the television's HDMI. As you'd expect, the PC world was ahead of the curve in changing how people use their gadgets, and the TV world lagged behind.
A lot of people used to have issues with older TVs, and occasionally even with new ones. When people bought new 32" TVs intending to play PC games on them, those TVs occasionally wouldn't work well with PCs; people just could not get them to work right even after trying different switch-on sequences between the TV and computer. When I Googled, I would find other people with that particular TV make/model having the same problem. I recall recommending that someone swap the new TV they'd just bought for another make/model known to work, and it worked for that person. Most people already had their TVs, though, so they couldn't do anything about it. I haven't heard about this issue in the past few years. So this might be your issue if you bought the TV 4 or more years ago.
Have you tried this sequence by the way?
Keep the TV turned off, connect the HDMI to your PC, turn on the PC, and then turn on the TV?
Is it possible that there is a firmware update for your television?
Are you sure that your HDMI "cable" is at least HDMI 1.3b?
I am pretty sure the PC in port works anyway, though. But that may or may not give you the full 1920x1080 HD resolution.
Franknj229: Yes, the PC-in is VGA/D-Sub. I currently have it connected to the HDMI-2 input on the TV, but only my desktop shows up. Interestingly, it is actually only my wallpaper, now that I think about it. None of the icons/shortcuts that I have on my monitor desktop show up on the TV. Just the wallpaper.
A good point. Reading it again, I see that he had both the monitor and TV connected when he didn't see the icons.
If he wants everything on the TV, he can designate the TV as his primary monitor (and his PC monitor as secondary) even in the extended desktop mode or run the two devices as duplicates.