Can't Have Monitor And HDMI Plugged In At The Same Time?
Nov 27, 2011
I have a Lenovo H405-7223-1GU that I bought in June 2011. Yesterday I bought a Philips 32" HDTV (720p, 60Hz). My PC won't recognize my TV as a monitor via HDMI cable. The ports on the TV and the cable are fine, because I can connect my DVD player with no issues. Most help says to right-click the desktop, click "Screen resolution", and then just tell the PC it's OK to send info over HDMI. That would be fine, but I can't have my monitor and HDMI plugged in at the same time. As soon as the HDMI is plugged in, my monitor (VGA) goes blank, then comes back on with only the background image showing. No Start button, no icons, no taskbar.
I'm currently having a lot of trouble even getting any sound out of Stereo Mix. I want to record sound from my PC, but I'm not sure I can with my current setup. I'm using my Samsung TV, which is plugged into the HDMI port on my PC, with the speakers built into the TV. I've gone into the properties of Stereo Mix and changed "playback through this device" to my TV, but I don't get any sound from Stereo Mix in any program, and the green bars aren't going up and down either.
I have an Asus K61-IC dual-booting Windows 7 Professional x64 (default) and Fedora 16 x64. In Windows, whenever I plug in an HDMI cable to watch a movie on my LCD TV, audio will only come out of the laptop speakers. When I open the audio properties, it says "HDMI audio not connected" or something to that effect (at the moment I'm not able to connect it to the TV), and it will not let me click on Properties; but when I open the driver properties, it tells me everything is working fine. I cannot figure this out for s#%t.
I loaded Windows 7 on three computers: 2 desktops and 1 laptop. My Dell XPS is the one that keeps flashing a black screen from time to time. It seems like it is thinking for a few seconds, but when this happens it is like my monitor turns off.
Here's my setup: Nvidia video card > HDMI out to Sony home theater system > HDMI out to Samsung HDTV. I use this to get 5.1 LPCM surround sound for video games. It worked fine until I did a firmware update on the Sony unit. Now the HDMI audio is greyed out in the Windows 7 audio playback options. Nvidia Control Panel sees the device as audio-compatible, but Windows does not. Audio DOES work if I connect a different display (through the same system): I connected an Acer computer monitor and everything worked normally. It also works if I connect directly to the Samsung TV. The Sony unit works on its own or with the Acer monitor connected. When the monitor is connected, Nvidia Control Panel sees the home theater system as "Sony AV System" and audio playback reports "SonyAVSYSTEM". When the Samsung TV is plugged in, Nvidia CP sees the home theater system as "Sony Digital Display" and Windows audio just says the cable is unplugged (doesn't report anything). I'm getting a picture with this setup, but I can't even select the audio option.
I'm going crazy at this point; I have Sony working on it from their end, and Samsung insists it's not their fault. Is this a Windows problem? I find it strange that Nvidia CP can recognize the unit and mark it as audio-capable, but Windows says a cable is unplugged. I've tried all recent driver options for both Realtek and Nvidia (betas and stable). All drivers behave the same under the different connections described above. That the system works normally unless it's PC > Theater > TV makes me think this is a pretty specific breakdown in HDMI communication! Update: Sony representatives got back to me with the findings from their engineers. They insist that because my PS3 works on the setup, it must be an issue with my PC. Their best guess is that the devices are failing to handshake properly over the HDMI connection.
When I plug in or unplug the external display on my laptop, my display options no longer reconfigure automatically as they used to. Now if I unplug the monitor, Windows acts as if it's still there, and windows that were on the monitor stay offscreen (as does the cursor, if that's where it was). I can manually move the cursor back, right-click, load the graphics properties and make changes; OR do an F-key combo to move everything back to the laptop screen; OR program a hotkey; but these are all pretty annoying for one reason or another. Unfortunately, this must have started happening sometime in the last two weeks, at the same time as I made a load of major computer changes.
However, this did NOT include a graphics driver change (my graphics driver is current). At this point I feel it could be anything, but I'm wondering if it could be a recent BIOS update I did that HP recommended. By way of troubleshooting, I ran Windows in safe mode with ONLY the "ATI Catalyst" and "AMD External Events" utilities running on top of the basic services, in case a newly installed service was stealing the hot-plug messages, but that didn't do anything. I've also gone exhaustively through my graphics driver settings to make sure everything there looks correct.
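As a possible stopgap for the hotplug problem above (not from the original post): Windows 7 ships displayswitch.exe, the tool behind Win+P, and a small script or hotkey can invoke it to force a display mode after unplugging. A minimal Python sketch, assuming displayswitch.exe is on the PATH (it lives in System32 on Windows 7):

```python
import subprocess  # used for the Windows-only call shown at the bottom

# Switches accepted by Windows 7's built-in displayswitch.exe (Win+P).
MODES = {
    "internal": "/internal",  # laptop screen only
    "clone": "/clone",        # duplicate on both displays
    "extend": "/extend",      # extend desktop across displays
    "external": "/external",  # external monitor only
}

def displayswitch_args(mode):
    """Return the displayswitch.exe command line for the requested mode."""
    if mode not in MODES:
        raise ValueError("unknown display mode: %s" % mode)
    return ["displayswitch.exe", MODES[mode]]

# Usage (Windows only), e.g. to force everything back onto the laptop
# panel after yanking the external monitor:
#   subprocess.run(displayswitch_args("internal"), check=True)
```

Binding that call to a hotkey is roughly the same as the poster's existing workaround, so this only helps if scripting it is more convenient than the OEM F-key combo.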
I am having an issue with my Aurora R4 computer. I am trying to play a game on my monitor, but also to have it show on my HDTV. The monitor is hooked up with a DVI connection, and another output from the same card goes to the HDMI input on the TV.
I just bought a new 23-inch Samsung monitor (S23B370). It has full HD support and also has an HDMI port. I bought an HDMI cable with HDMI connectors on both ends. I have a video card (Nvidia 210) which also has an HDMI port. I connected the monitor's HDMI port to the card's HDMI port with the new cable and booted. Nothing came up. I even set the monitor's input to HDMI, but nothing happened; it says "check display cable". Why is this? How do I fix it? Have I done something wrong?
I bought a Samsung HDMI monitor yesterday (model: S22B370H) for my Z68P-DS3 motherboard, which has only an HDMI output. The monitor was working fine when connected to my laptop, although with some flickering. When I connect it to the PC, it occasionally shows a display up until the Windows startup boot screen, after which it is totally blank. The monitor recognizes an HDMI input but is not displaying it.
I added speakers to my computer and now I have this message on my monitor: "HDMI NO SIGNAL". Everything seems to be working: I can get on the Internet, play videos, check email, and all software is usable. How do I get that message off of my monitor?
I've noticed this anomaly now on two of my Windows 7 rigs, one Ultimate and one Home Premium: monitors that attach via HDMI are not getting the "power off" message from the OS. The same monitors, hooked up by other means (DVI, VGA), do power off just as they should per the power plan settings.
I connected my computer to my monitor via HDMI, but there is no sound coming from the monitor. I updated all drivers and even tried to connect the monitor to a receiver from the headphone jack. I get sound from my computer to the receiver that way, but not through the monitor. The monitor is an LG E2370 and my graphics card is an ATI Radeon HD 5450.
I wanted a break from the HDMI sound through the monitor so I could use headphones, and after trying several other methods, I disabled the HDMI sound. That worked great, but now that playback device option is completely gone! How do I get it back so I can use the (admittedly weak, but still functional) monitor sound?
I've just built a new computer with Windows 7 and a Gigabyte-brand Radeon HD 4890, and I have been having the same issue as some other users with regard to over/underscan.
I managed to correct the inch or so of unfilled space when displaying the desktop by using the under/overscan utility in CCC (thanks to info on this forum ^_^), but found that the problem still existed when I ran my first full-screen applications.
I installed, updated, and ran Spore: Galactic Adventures to see how my system would handle everything, but found that the dreaded inch was back. The really odd thing was that when I switched Spore to windowed mode, the inch was gone and I could maximise to full screen. Of course, the window header and taskbar kind of detract from the game, but this isn't necessarily an issue for Spore (which can hardly be described as immersive).
I am, however, concerned that this issue may pop up in other games where the window header and taskbar will detract from the game (if it operates well in windowed mode at all).
I plan on investigating further later today, possibly in the direction of motherboard and chipset drivers, but also testing with other games.
I have a Toshiba L755 laptop with the Intel Graphics and Media Control Panel. I hooked up the laptop via HDMI cable to a TV monitor. It worked great, but afterwards the screen will not fill up. I can't adjust the overscan and am stuck.
I'm interested in setting up a multi-monitor setup. My main monitor will be for gaming, and the other will be for web browsing/server admin clients. Could I run one monitor off my GPU and the other off my Core i5's integrated GPU via the built-in HDMI connector, or would this confuse Windows? The reason I ask is that I can't see my ATI 6950 handling both BF3 and another monitor.
My old monitor has problems turning on; I can leave it on for 30 hours before it decides to display a picture, or I can coax it by unplugging the DVI cable from the back, but sometimes even that takes a few tries. It's off warranty; for reference, it's a Samsung SyncMaster 226BW. So I bought a new monitor and have it plugged in using HDMI, but the problem I get is that when both monitors are plugged in, the HDMI one has no display until it gets to Windows, so I cannot see the boot process of my PC (because my old monitor, as explained above, has problems turning on). That was a hindrance today when I was tinkering with my SSD and had to go into the BIOS, and frankly I'd like to be able to see in case anything goes wrong, like a BSOD or a Windows update.
I've got a problem with my new ViewSonic VX2753mh-LED monitor. It's got 2x HDMI inputs and one VGA. The VGA works straight off the bat, but whenever I try to plug the HDMI in, it just says "no input signal detected" and goes black. I noticed in Device Manager that it was installed as "Generic Non-PnP Monitor", so after jumping through some hoops I managed to install the driver as "ViewSonic VX2753 SERIES". Anyway, this doesn't seem to have made a difference: still no input through the HDMI. I have tested the cord on my Samsung TV and it worked fine, so it can't be a problem with the cord or the graphics card. Just some general info that might be of use:
Windows 7 x64, dual-monitor setup.
The graphics is integrated on the Intel i5 chip; Display Adapters in Device Manager says Intel(R) HD Graphics Family. It has a DVI out, HDMI out, and VGA out.
I have a Samsung R530 running Windows 7 32-bit. When I hook it up to my Philips Cineos TV via the HDMI cable, the video comes through perfectly, but audio only plays through the laptop speakers. When I went into Sound in Control Panel, this is what I see: when I click on the Intel HDMI device, it is greyed out, so I can neither set it as default nor configure it. I'm not entirely sure what it means by "not plugged in". This is what comes up under Device Manager, and this is what I see with the Realtek audio.
I've recently purchased a Samsung T23A750 monitor, and to my disappointment, when I try to watch a movie the built-in speakers are not functioning. I have a Sony Vaio Z216GX connected via HDMI to the monitor's HDMI-DVI port. I can definitely see the video, but no audio is playing from the speakers. The built-in speakers are not at fault: I ran the monitor's self-diagnosis and the speakers seem to work fine. I also set the monitor as the default audio device, and even though Win 7 says it's functioning properly, the speakers still don't work. Conversely, setting the default to my PC speakers works fine. I'm sure it's not the HDMI cable; I used the same one to hook my PC to my TV with everything working fine.
I re-formatted my laptop today, and now the battery icon says "Plugged in, not charging" most of the time. If I remove the battery/power supply or reinstall the driver, it works for a minute or so, then goes back to this. I'm pretty sure it's software-related, as this happened right after the Windows re-install.
I am looking for a program which can effectively monitor the time a user spends on a computer, perhaps by looking at keystrokes/mouse movements, or something else. Either way, I'd like to be able to compare the usage time on several computers. If I could monitor these times remotely, that would be great, but it is not necessary. It would also be great if, rather than just saying "user x spent y time on the computer today", it would log between what times the user was active.
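A sketch of the logging side of such a tool (not from the original post). Collecting the raw keystroke/mouse events is platform-specific (e.g. polling GetLastInputInfo on Windows) and is assumed to happen elsewhere; these hypothetical helpers just collapse input-event timestamps into active sessions, which gives exactly the "active between these times" log the poster asks for:

```python
from datetime import datetime, timedelta

# Assumption: two input events more than IDLE_GAP apart belong to
# separate sessions of activity.
IDLE_GAP = timedelta(minutes=5)

def sessions(event_times, idle_gap=IDLE_GAP):
    """Collapse input-event timestamps into (start, end) active
    sessions, splitting wherever the gap between consecutive
    events exceeds idle_gap."""
    result = []
    for t in sorted(event_times):
        if result and t - result[-1][1] <= idle_gap:
            # Still the same session: extend its end time.
            result[-1] = (result[-1][0], t)
        else:
            # Too long since the last event: start a new session.
            result.append((t, t))
    return result

def total_active(event_times, idle_gap=IDLE_GAP):
    """Total active time summed over all sessions."""
    return sum((end - start for start, end in sessions(event_times, idle_gap)),
               timedelta())
```

Comparing machines remotely would then just mean shipping each machine's session list to a central host; the aggregation logic stays the same.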
Why does my screen sometimes go blank after a period of inactivity (like 8 hours), and sometimes not? My monitor screen just turns off (yellow light). Then I just move my mouse and the screen goes back to normal. IT'S TOTALLY RANDOM!
I disabled the screen saver and I disabled the option for the OS to turn off the screen. Yet the screen will still turn itself off after a period of inactivity. I'm currently using a 19" Asus monitor.
My PC is: i7-3770K 3.5 GHz, Asus GTX 680, Asus Sabertooth Z77, 2x4 GB Corsair Vengeance LP Arctic White DDR3-1600 RAM, Be Quiet! Dark Power Pro 1000 W PSU, drivers 301 WHQL.