I just bought a new 23-inch Samsung monitor (S23B370). It has full HD support and also has an HDMI port. I bought an HDMI cable with HDMI connectors on both ends. I have a graphics card (Nvidia 210) which also has an HDMI port. I connected the monitor's HDMI port to the card's HDMI port with the new cable and booted. Nothing came up. I even set the monitor's source setting to HDMI, but nothing happened. It says "check display cable". Why is this? How do I fix it? Have I done something wrong?
I bought a Samsung HDMI monitor yesterday (model: S22B370H) for my Z68P-DS3 motherboard, which has only an HDMI output. The monitor was working fine when connected to my laptop, although with some flickering. When I connect it to the PC, it occasionally shows a display up to the Windows startup screen, after which it is totally blank. The monitor recognizes the HDMI input but is not displaying it.
I've recently purchased a Samsung T23A750 monitor, and to my disappointment, when I try to watch a movie the built-in speakers are not functioning. I have a Sony Vaio Z216GX connected via HDMI to the monitor's (HDMI-DVI) port. I can definitely see the video, but no audio plays from the speakers. The built-in speakers are not at fault: I ran the monitor's self-diagnosis and the speakers work fine. I also set the monitor as the default audio device, and even though Windows 7 says it's functioning properly, the speakers still don't work. Conversely, setting the default to my PC speakers works fine. I'm sure it's not the HDMI cable; I used the same one to hook my PC to my TV with everything working fine.
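For audio-over-HDMI problems like this one, a quick sanity check is to list the playback endpoints Windows actually exposes and which one is the default. Below is a minimal Python sketch, assuming the third-party sounddevice package (pip install sounddevice) is available; the endpoint names are driver-dependent, so matching on "HDMI" or the monitor model in the output is only an assumption.

# Minimal sketch: list Windows playback devices and flag the current default.
# Requires the third-party "sounddevice" package (pip install sounddevice).
import sounddevice as sd

default_output = sd.default.device[1]  # index of the default output device

for index, dev in enumerate(sd.query_devices()):
    if dev["max_output_channels"] > 0:          # playback-capable endpoints only
        marker = " (default)" if index == default_output else ""
        print(f"{index}: {dev['name']}{marker}")

# If no entry here mentions the monitor or an HDMI output, Windows is not
# exposing an HDMI audio endpoint at all, which points at the graphics or
# HDMI audio driver rather than the monitor's speakers.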
I am having an issue with my Aurora R4 computer. I am trying to play a game on my monitor but also have it show on my HD TV. The monitor is hooked up with a DVI connection, and an HDMI output from the same card goes to the HDMI input on the TV.
I added speakers to my computer and now I have this message on my monitor: "HDMI NO SIGNAL". Everything seems to be working: I can get on the Internet, play videos, use email, and all software is usable. How do I get that message off of my monitor?
I've noticed this anomaly on two of my Windows 7 rigs now, one running Ultimate and one Home Premium: monitors attached via HDMI are not getting the 'power off' message from the OS. The same monitors hooked up by other means (DVI, VGA) do power off just as they should per the power plan settings.
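A quick way to confirm what the active power plan actually asks for is to read the display-off timeout back from powercfg. This is only a sketch; it assumes Windows' built-in powercfg utility and its SCHEME_CURRENT/SUB_VIDEO/VIDEOIDLE aliases, and it does not explain why HDMI-attached monitors ignore the signal.

# Minimal sketch: read the active power plan's "turn off display" timeout
# by shelling out to Windows' built-in powercfg utility.
import subprocess

result = subprocess.run(
    ["powercfg", "/query", "SCHEME_CURRENT", "SUB_VIDEO", "VIDEOIDLE"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
# The "Current AC Power Setting Index" is the timeout in seconds (hex).
# If this is set but an HDMI-attached monitor never sleeps, something between
# the OS signal and the monitor is keeping the display awake.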
My new graphics card has an HDMI port, and I was wondering if there is any advantage to using that port to connect the card to my monitor instead of the normal DVI port?
I have a Lenovo H405-7223-1GU that I bought in June 2011. Yesterday I bought a Philips 32" HDTV (720p, 60Hz). My PC won't recognize the TV as a monitor via HDMI cable. The ports on the TV and the cable are fine, because I can connect my DVD player with no issues. Most help says to right-click the desktop, click "Screen resolution", and then just tell the PC it's OK to send the picture over HDMI. That would be fine, but I can't have my monitor and the HDMI plugged in at the same time: as soon as the HDMI is plugged in, my monitor (VGA) goes blank, then comes back on with only the background image showing. No Start button, no icons, no taskbar.
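One way to see what Windows itself thinks is happening when the TV is plugged in is to ask how many monitors are part of the desktop and how large the combined desktop is. A minimal Python/ctypes sketch using the Win32 GetSystemMetrics call; run it once with only the VGA monitor connected and once with the HDMI cable attached, and compare the numbers.

# Minimal sketch: count the monitors attached to the desktop and report the
# virtual desktop size (Win32 GetSystemMetrics).
import ctypes

SM_CMONITORS = 80
SM_CXVIRTUALSCREEN, SM_CYVIRTUALSCREEN = 78, 79

user32 = ctypes.windll.user32
print("Monitors on the desktop:", user32.GetSystemMetrics(SM_CMONITORS))
print("Virtual desktop size:",
      user32.GetSystemMetrics(SM_CXVIRTUALSCREEN), "x",
      user32.GetSystemMetrics(SM_CYVIRTUALSCREEN))
# If the monitor count stays at 1 after plugging in the HDMI TV, Windows has
# replaced (or never added) the second display rather than extending to it.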
I connected my computer to my monitor via HDMI, but there is no sound coming from the monitor. I updated all drivers and even tried to connect the monitor to a receiver from the headphone jack. I get sound from my computer to the receiver that way, but not through the monitor. The monitor is an LG E2370 and my graphics card is an ATI Radeon HD 5450.
I wanted a break from the HDMI sound through the monitor so I could use headphones, and after trying several other methods, I tried disabling the HDMI sound. That worked great, but now that option is completely gone from the playback devices! How do I get it back so I can use the (admittedly weak, but still functional) monitor sound?
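In Windows 7, a disabled playback device is normally hidden rather than removed, and the usual way back is the Sound control panel's Playback tab, right-click, "Show Disabled Devices". For completeness, here is a minimal read-only Python sketch that lists the render endpoints and their state straight from the registry; the property value used for the description (PKEY_Device_DeviceDesc) is an assumption and may not be present on every endpoint.

# Minimal, read-only sketch: list Windows audio render endpoints and their
# state from the registry, to confirm a "disabled" HDMI endpoint still exists.
import winreg

RENDER_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\MMDevices\Audio\Render"
STATES = {1: "active", 2: "disabled", 4: "not present", 8: "unplugged"}
# Assumed property value holding the device description (PKEY_Device_DeviceDesc).
DESC_VALUE = "{a45c254e-df1c-4efd-8020-67d146a850e0},2"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, RENDER_KEY) as render:
    for i in range(winreg.QueryInfoKey(render)[0]):
        endpoint_id = winreg.EnumKey(render, i)
        with winreg.OpenKey(render, endpoint_id) as endpoint:
            state, _ = winreg.QueryValueEx(endpoint, "DeviceState")
            try:
                with winreg.OpenKey(endpoint, "Properties") as props:
                    desc, _ = winreg.QueryValueEx(props, DESC_VALUE)
            except OSError:
                desc = endpoint_id
        print(f"{desc}: {STATES.get(state, state)}")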
I've just built a new computer with Windows 7 and a Gigabyte-brand Radeon HD 4890, and I have been having the same issue as some other users with regard to over/underscan.
I managed to correct the inch or so of unfilled space when displaying the desktop by using the under/overscan utility in CCC (thanks to info on this forum ^_^), but found that the problem still existed when I ran my first full-screen applications.
I installed, updated, and ran Spore: Galactic Adventures to take a look at how my system was going to handle everything, but found that the dreaded inch was back. The really odd thing was that when I switched Spore to windowed mode, the inch had gone and I could maximise to full-screen. Of course, the window header and taskbar kind of detract from the game, but this isn't necessarily an issue for Spore (which can hardly be described as immersive).
I am, however, concerned that this issue may pop up in other games where the window header and taskbar will detract from the game (if it operates well in windowed mode at all).
I plan on investigating further later today, possibly in the direction of motherboard and chipset drivers, but also testing with other games.
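A working hypothesis for the behaviour described above is that Catalyst stores the overscan/underscan compensation per display mode, so a full-screen game that switches resolution or refresh rate falls back to the uncorrected scaling, while windowed mode keeps the corrected desktop mode. A minimal Python/ctypes sketch to print the current desktop mode, for comparison with whatever mode the game selects:

# Minimal sketch: print the desktop resolution and refresh rate so it can be
# compared against the mode a full-screen game switches to.
import ctypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32

HORZRES, VERTRES, VREFRESH = 8, 10, 116  # GetDeviceCaps indices

hdc = user32.GetDC(None)                 # device context for the primary display
try:
    width = gdi32.GetDeviceCaps(hdc, HORZRES)
    height = gdi32.GetDeviceCaps(hdc, VERTRES)
    refresh = gdi32.GetDeviceCaps(hdc, VREFRESH)
    print(f"Desktop mode: {width}x{height} @ {refresh} Hz")
finally:
    user32.ReleaseDC(None, hdc)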
I have a Toshiba L755 laptop with Intel graphics and the Intel Graphics and Media Control Panel. I hooked up the laptop via HDMI cable to a TV monitor. It worked great, but since then the picture will not fill the screen. I can't adjust the overscan and am stuck.
My XFX 6970 works great except when I connect the HDMI to my receiver: the monitor just shuts down. Is there a procedure I need to follow to prevent this? When I disconnect the HDMI, the monitor turns back on...
I'm interested in setting up a multi-monitor setup. My main monitor will be for gaming, and the other will be for web browsing/server admin clients. Could I run one monitor off my GPU and the other off my Core i5's integrated GPU via the built-in HDMI connector? Or would this confuse Windows? The reason I ask is that I can't see my ATI 6950 handling both BF3 and another monitor.
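Windows 7 can generally drive displays from two different adapters, as long as both have working WDDM drivers and the integrated GPU is enabled in the BIOS/UEFI. A minimal Python sketch to check which video adapters Windows currently exposes, assuming the third-party wmi package (pip install wmi) is installed:

# Minimal sketch: list the video adapters Windows currently exposes, to check
# whether both the discrete card and the i5's integrated GPU are active.
# The integrated GPU typically only appears after it is enabled in the BIOS.
import wmi

for adapter in wmi.WMI().Win32_VideoController():
    print(adapter.Name, "-", adapter.VideoProcessor, "-", adapter.Status)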
He has a Dell Inspiron laptop with Windows 7 and a Samsung LED TV. I'm on port HDMI2 on the TV from the PC. I have tried everything, but it says "source is not connected...". I tested the cable with the Blu-ray player and it works fine. I called Dell (still under warranty) and they were useless. I've checked the settings on the PC and I have it set to "extended" so I can see the display on both... no luck. Is it a setting on the TV that I am missing?
My old monitor has problems turning on: I can leave it on for 30 hours before it decides to display a picture, or I can coax it by unplugging the DVI cable from the back, but sometimes even that takes a few tries. It's out of warranty; for reference it's a Samsung SyncMaster 226BW. So I bought a new monitor and have it plugged in using HDMI, but the problem I get is that when both monitors are plugged in, the HDMI monitor has no display until it gets to Windows, so I cannot see the boot process of my PC (because my old monitor, as explained above, has problems turning on). This was a hindrance today when I was tinkering with my SSD and had to go into the BIOS, and frankly I'd like to be able to see the boot in case anything goes wrong, like a BSOD or a Windows update.
I've got a problem with my new ViewSonic VX2753mh-LED monitor. It's got 2x HDMI inputs and one VGA. The VGA works straight off the bat, but whenever I try to plug the HDMI in, it just says "no input signal detected" and goes black. I noticed in Device Manager that it was installed as 'Generic Non-PnP Monitor', so after jumping through some hoops I managed to install the driver as 'ViewSonic VX2753 SERIES'. Anyway, this doesn't seem to have made a difference: still no input through the HDMI. I have tested the cord on my Samsung TV and it worked fine, so it can't be a problem with the cord or the graphics card. Just some general info that might be of use:
Windows 7 x64, dual monitor setup.
The graphics are integrated on the Intel i5 chip; Display Adapters in Device Manager says Intel(R) HD Graphics Family. It has a DVI out, an HDMI out, and a VGA out.
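For a setup like the one above, it can help to see which monitor Windows believes is attached to each output, since that is where "Generic Non-PnP Monitor" versus the ViewSonic driver name shows up. A minimal Python/ctypes sketch around the Win32 EnumDisplayDevices call; the strings printed depend entirely on the Intel driver.

# Minimal sketch: for each display output, show the monitor Windows thinks is
# attached to it (Win32 EnumDisplayDevices via ctypes).
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [("cb", wintypes.DWORD),
                ("DeviceName", wintypes.WCHAR * 32),
                ("DeviceString", wintypes.WCHAR * 128),
                ("StateFlags", wintypes.DWORD),
                ("DeviceID", wintypes.WCHAR * 128),
                ("DeviceKey", wintypes.WCHAR * 128)]

def devices(parent=None):
    """Enumerate adapters (parent=None) or the monitors attached to one adapter."""
    i = 0
    dev = DISPLAY_DEVICE()
    dev.cb = ctypes.sizeof(dev)
    while ctypes.windll.user32.EnumDisplayDevicesW(parent, i, ctypes.byref(dev), 0):
        yield dev.DeviceName, dev.DeviceString
        i += 1

for adapter_name, adapter_desc in devices():
    print(adapter_name, "-", adapter_desc)
    for monitor_name, monitor_desc in devices(adapter_name):
        print("   ", monitor_name, "-", monitor_desc)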
I use my Toshiba laptop for watching Netflix on my two TVs. One is an LG, the other is a Samsung. Since I've had the laptop (about 4 to 6 weeks), the HDMI out has worked on both TVs, no problem! In the last week, the LG just will not recognise the signal. The Samsung works OK, no problems there... The LG gives nothing no matter what HDMI port I use ("No signal" is what it says); there are 4 HDMI ports and none are working. I have tried about 5 different HDMI cables and still nothing.
So I bought this kind of HDMI cable. I have an HDMI port on the backside of my case and my motherboard should support it. One end goes into the HDMI port on the case and the other goes into the TV, but my computer is not recognizing it. I'm trying to dual-screen; I have a normal PC monitor which is working fine, and the HDMI for the monitor looks like this and works just fine. What could the error be?
I am having issues with a brand new system which has an Intel mainboard. I am trying to connect the system to a Denon AVR1912 unit. Windows tells me that the Denon amp is attached to the HDMI socket (the Denon AVR1912 name is displayed on the icon), but then contradicts itself by saying that it is unplugged. I did discover at one stage that the HDMI icon in the "Sounds" section of Control Panel had disappeared. I then had to reload the driver and thought that might fix the problem at the same time, but no such luck. I have checked to make sure that the HDMI socket is enabled in the BIOS, and that was OK.
This HP was upgraded from Vista to Windows 7. I have video but no sound at all. I went to Intel's website and downloaded the Intel graphics drivers, but the sound drivers show as unrecognized. In Device Manager I see "IDT High Definition Audio CODEC" and "Intel(R) High Definition Audio HDMI". My HP model is dv4-1465dx.
My HDMI connection worked from my old laptop, but my new x64 machine does not want to work. I do see the channel open on the TV when I plug it in, but I don't see video or hear sound; just a blank screen. Under the sound settings, the HDMI option is there, but it won't let me set it as the default.
I have my Iiyama ProLite connected via DVI and a TV via HDMI. As soon as the HDMI is plugged in, I lose the DVI display and everything transfers to the HDMI; the DVI is no longer detected. Does this mean that if I want a second monitor I will have to set it up through VGA?
Recently I upgraded from Vista to Windows 7 and tried connecting my laptop to my HD TV through HDMI, but the screen on my laptop goes blank (black) and there is no picture on the TV either. I have an ATI Radeon 4500 Series adapter, and HDMI used to work perfectly in Vista.
When I connect my Acer laptop to the TV using HDMI, I get no sound, but the picture is fine. In the sound options I only see the Speakers and Digital Output options and don't see any HDMI audio output. This used to be visible earlier (and the sound used to work), but the option suddenly vanished.
I have an Acer Iconia W500. It is working great in every situation except one. It has one HDMI output. I can connect it to my 24" LCD TV with an HDMI cable and it works perfectly (I can switch between single desktop, clone, extend, etc.). When I connect it to my LG 42" LCD TV using HDMI, the screen goes blank and the TV says no signal. When I connect to the TV and start the tablet, I can see "Starting Windows" on both screens; it seems that once Windows is loaded, I get no signal on the TV from the tablet. It did work the first time, but I was messing with the display settings and I think I accidentally turned off the backlight on the tablet. I use logmein.com to remote into my other machines. I tried using LogMeIn to connect to this one: it works when the tablet is not connected to this TV, but once it is connected to the TV, I only see a blank screen. I also tried UltraVNC and get the same thing, a blank screen. I need to know if there is any way I can delete the LG TV profile, or something so that when I connect the HDMI cable it will revert back to default settings and the video will show on both the tablet and the TV, like it does when I connect the other TV. Sorry about the long thread, but I wanted to make sure I entered all the details of my problem. Below are the system details: 10.1" HD Multi-Touch LED-backlit TFT LCD display (1280 x 800 resolution, wide viewing angle); AMD Dual-Core Processor C-50; 2GB DDR3 dual-channel memory; Windows 7 Home Premium 32-bit; ATI Radeon HD 6250 graphics; 32GB solid state drive; 802.11b/g/n Wi-Fi CERTIFIED; Bluetooth 3.0+HS; Dolby Advanced Audio v2 audio enhancement; high-definition audio support; 2x USB 2.0 ports; 1x HDMI port with HDCP support.
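On Windows 7, saved display configurations are cached per monitor combination under the GraphicsDrivers\Configuration registry key, which is one plausible place for a stale "LG TV" profile to live. The sketch below only lists those cached entries (read-only); actually deleting an entry, or rolling back with System Restore, is at your own risk and should follow a registry backup.

# Minimal, read-only sketch: list the display configurations Windows 7 has
# cached, one entry per monitor/topology combination.
import winreg

CONFIG_KEY = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers\Configuration"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CONFIG_KEY) as cfg:
    for i in range(winreg.QueryInfoKey(cfg)[0]):
        print(winreg.EnumKey(cfg, i))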
I bought a new computer about 6 months ago. HDMI worked fine with my TV, and I don't know what's wrong now. I know it's not the cable, because I bought another one, and it's certainly not the TV, because it has 2 HDMI ports. The computer, as I said, is almost new, so I would be very happy if I didn't have to send it in for repair.
Up until the other day, I'd been using my MSI Wind U230 netbook (Radeon HD 3200) to output to my monitor (an Acer S243HL, which only has 1x DVI and 2x HDMI inputs), occasionally swapping to a Westinghouse 32" TV. The other day I was using my monitor, swapped to the TV, and then returned to my room to connect to the monitor, and it just wouldn't work. (To note, I've been using Win key + P to set "Projector only" and to disconnect the projector before unplugging the HDMI cable.) The netbook screen turns itself off (as if it detects the HDMI out), but the monitor or TV just shows "No signal." I tried a shorter HDMI cable and the other input on the monitor, still nothing. Strangely enough, checking the displays in Catalyst or the Screen Resolution properties, it can detect both the TV and the monitor; there's just no signal coming out. Even stranger: trying to connect another laptop to the monitor (which used to work) displays "No signal" too.
I built a barebones kit and installed Windows 7 Ultimate 64-bit. I installed all the drivers correctly, but for some reason Windows keeps detecting the front audio jack as if someone is repeatedly plugging headphones in and out (no pun intended). Audio doesn't work at all through the front and back audio jacks, only HDMI; I only get static from the audio jacks.