I have done some searching but to no avail. I have tried everything I can find/think of with absolutely no luck.
I have a brand new Samsung LN46B750 which I have connected to my desktop for use as an HTPC. VGA works fine, but I would much prefer to use a digital connection. So I have an HDMI-to-DVI adapter and a Monoprice HDMI cable connecting my GeForce 6800 to the HDMI/DVI port on the HDTV.
I have tried the 182.5 and the latest NVIDIA drivers, as well as the generic Windows 7 drivers. The result is always the same.
Everything works fine in the OS so long as I make it past the Windows splash screen, and unfortunately, that never happens. I believe the problem lies in the auto adjustment of the screen when Windows 7 starts loading. (If I boot without the TV connected and then connect it after the splash screen has loaded, everything is perfect.)
Is there any way to disable the auto detection/adjustment of the TV and just have the OS always boot into 1920x1080 @ 60 Hz? I am really tired of the blank or black screen every time I reboot. This is a terrible bug with Windows 7 or EDID and it needs to be fixed now, Microsoft!
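To be clear about what I am after, here is a minimal sketch of the kind of workaround I would settle for: a tiny Win32 program that forces the primary display back to 1920x1080 @ 60 Hz, which could be run from a scheduled task at logon. The file name and build line are just examples, and this only reasserts the mode once the desktop is up; it does not stop the blanking during boot.
[code]
// force_mode.cpp -- sketch: force the primary display to 1920x1080 @ 60 Hz.
// Build (for example): cl force_mode.cpp user32.lib
#include <windows.h>
#include <cstdio>

int main()
{
    DEVMODE dm = {};
    dm.dmSize = sizeof(dm);
    dm.dmPelsWidth        = 1920;
    dm.dmPelsHeight       = 1080;
    dm.dmDisplayFrequency = 60;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    // NULL device name = primary display; CDS_UPDATEREGISTRY saves the mode.
    LONG rc = ChangeDisplaySettingsEx(NULL, &dm, NULL, CDS_UPDATEREGISTRY, NULL);
    if (rc == DISP_CHANGE_SUCCESSFUL)
        std::printf("Switched to 1920x1080 @ 60 Hz\n");
    else
        std::printf("ChangeDisplaySettingsEx failed (code %ld)\n", rc);
    return rc == DISP_CHANGE_SUCCESSFUL ? 0 : 1;
}
[/code]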
The problem I am having is that I have connected my PC to my Samsung TV. The image and sound are showing up on the TV; however, I am also getting an error message on the TV saying NO SIGNAL.
I have connected my HDMI cable as usual, but this time the TV displays NO SIGNAL. When I connect another PC to check the cable, it works correctly, so the problem is coming from my PC.
I just built a new gaming rig with 8 GB of RAM. I made the mistake of installing WinXP Pro x32 on it, then found out that 32-bit OSes can't access 8 GB of RAM, so I went and bought an OEM Win7 64-bit disk. Come to find out now, Win7 x64 doesn't play well with many HDTVs. Something about Windows 7 no longer allowing you to override something called the EDID signal from the HDTV. So now I'm screwed again: I've got 8 GB of RAM but no video. Great. What now?
Do I need to go and buy a third OS that will work this time? WinXP Pro x64? And no, it's not the video card, the video card drivers, or the TV; it all worked just fine with WinXP Pro x32 installed, and all the cables are plugged in tightly. Win7 x64 will output a video signal to a generic VGA monitor. If I set the resolution to something like 1024x768 and then swap the VGA cable from the VGA monitor to the HDTV, I get a video signal. As soon as I reboot, no video signal. I get video for the POST screen and the Win7 startup screen.
As soon as it tries to boot into the desktop, no video signal; my AKAI TV only says "Not Support". Win7 will not get past the HDTV EDID handshake. Win7 sucks. Now I find myself looking at the Windows website's HDTV "compatibility list" for Win7, trying to work my life around Bill Gates and his crappy software, serving Windows instead of it serving me. When will those people ever get it right again after WinXP? Is there any workaround that will make Win7 override the HDTV EDID handshake?
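For anyone trying to narrow this down, here is a rough sketch of how I could inspect the result of that handshake. It assumes the TV is the primary display and it does not override anything; it just lists the modes the driver ended up exposing after reading the TV's EDID, so you can see whether the native mode was dropped. The file name and build line are only examples.
[code]
// list_modes.cpp -- sketch: list the display modes the driver exposes for the
// primary output after the EDID handshake. If the TV's native mode is missing
// here, the driver rejected it while reading the EDID.
// Build (for example): cl list_modes.cpp user32.lib
#include <windows.h>
#include <cstdio>

int main()
{
    DEVMODE dm = {};
    dm.dmSize = sizeof(dm);

    // NULL device name = the primary display adapter.
    for (DWORD i = 0; EnumDisplaySettings(NULL, i, &dm); ++i)
    {
        std::printf("%4lu x %-4lu @ %3lu Hz, %2lu bpp\n",
                    dm.dmPelsWidth, dm.dmPelsHeight,
                    dm.dmDisplayFrequency, dm.dmBitsPerPel);
    }
    return 0;
}
[/code]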
I just bought a Lenovo V570 laptop with VGA and HDMI output, and I cannot get the HDMI to output to my TV. The TV is a Vizio 47" 3D LCD and the laptop is running Windows 7 Home 64-bit. I have tried Windows graphics drivers, Lenovo drivers, and Intel drivers; to me there is not much difference between them, but who knows. The laptop will output to a TV using VGA, and it will connect to an LG 42-inch LCD at my work using HDMI, just not to the one I have at home. I have tried everything I can think of as far as settings go, on my TV as well, and nothing changes whatever I do. Does anyone know of any hardware incompatibility between some Vizio TVs and Windows 7? I have seen one TV on the compatibility website for Windows, but it was not my TV; mine is not listed. The model number for my TV is E3D470VX. Not sure what else to try. Next I guess I will call Vizio to see if they can work out the problem; Lenovo support doesn't know.
How can I get 1366x768 resolution via an ordinary PC VGA cable? I have an HDMI port on the TV and on the graphics card, but the problem is that my Xbox 360 is connected to the only HDMI port on the TV. So the Xbox is connected via HDMI and the PC is connected via VGA. The TV is a Samsung LA22B350F2.
I have a Dell 1525 with the Intel 965 chipset graphics. I had XP and Vista 32-bit installed previously, and those graphics drivers had an extra option to change the horizontal and vertical position of the screen. But now, with Windows 7 x64 and the drivers downloaded from Windows Update, I can't seem to fix the problem.
Whenever I have a program maximized on my HDTV, a few pixels of the width are off screen, just enough to push the menu bars and scroll bars off screen. Are there any third-party programs to fix this? I tried installing the x64 Vista drivers, but they inverted and washed out the colors on my screen, making it totally unusable.
When I connect my PC to my TV via an HDMI cable, my TV does not receive a signal. When I check in Control Panel > Display > Resolution, my TV hasn't even been detected. My TV is an LG 50", the card is a GTX 580, and my OS is Windows 7.
Also, when I connect my laptop to my TV via HDMI (also on Windows 7), it is instantly recognised and automatically displays, so it has to be an issue with the PC itself.
My Asus ET2400XVT all-in-one PC has an HDMI input port, but whenever I try to connect it to my PS3 through an HDMI-to-HDMI cable, it does not sense the signal. What can I do?
Sony notebook with an AMD HD 6630M. The HDMI port is not giving a signal to the TV. When I roll back the Intel Graphics driver, the HDMI signal comes alive, but then it goes dead again after a reboot. I have been trying to solve this for two weeks without success.
I added speakers to my computer and now I have this message on my monitor: "HDMI NO SIGNAL". Everything seems to be working: I can get on the Internet, play videos, and email, and all software is usable. How do I get that message off of my monitor?
I am pretty decent with computers but not an expert. I am having a little trouble with my home theater PC display flickering. My video card is a Radeon HD 6900 and the drivers are up to date. I have the HDMI out of my PC going to an Onkyo 7.1ch HT-R590 receiver and its output going to my LG 42" TV. My OS is Windows 7 Ultimate x64. The system used to work perfectly fine, but over time the display started to flicker at random times. I tried hooking my PC directly to my TV and it works perfectly fine, so I am 100% certain the problem is in the receiver. I did a bit of research and it appears to be a problem with my receiver overheating.
Now, if I run my HDMI to my TV, I cannot use 7.1ch surround sound (even after playing with Audio Return Channel). If I run optical to the receiver, my video card only puts out 5.1ch surround sound for some weird reason. So I did the next obvious thing: I ran DVI-to-HDMI to my TV for video and HDMI to my receiver for audio. I thought it would trick the card into thinking there were two displays, but this does not work; I do not get audio. I was wondering if you guys knew of any utilities that would allow me to output audio on one HDMI stream and video on my DVI port. I did a lot of looking around and it looks like the DMCA calls for HDMI to be encrypted and you can't separate the stream. Or would I just be better off getting a separate audio card to connect to my A/V receiver?
I've got a problem with my new ViewSonic VX2753mh-LED monitor. It has two HDMI inputs and one VGA. The VGA works straight off the bat, but whenever I try to plug the HDMI in, it just says "no input signal detected" and goes black. I noticed in Device Manager that it was installed as 'Generic Non-PnP Monitor', so after jumping through some hoops I managed to install the driver as 'ViewSonic VX2753 SERIES'. Anyway, this doesn't seem to have made a difference; still no input through the HDMI. I have tested the cord on my Samsung TV and it worked fine, so it can't be a problem with the cord or the graphics card. Just some general info that might be of use:
Windows 7 x64, dual-monitor setup.
The graphics are integrated on the Intel i5 chip; Display Adapters in Device Manager says Intel(R) HD Graphics Family. It has a DVI out, an HDMI out, and a VGA out.
Does anyone know a good splitter for splitting an HDMI signal from a WMC computer to two TVs? I bought one from Monoprice, but I can only get one signal to work. I know you can use an extender, but I do not want to spend $200 to $300 for one just to watch one or two hours of programs.
The sound was working on my SAMSUNG, and I was trying to temporarily disable it so I would not disturb my housemates. So I clicked "Disable" in the Control Panel Sounds gadget, thinking this would temporarily turn it off, like it does within the Networks gadget (I turn off wireless at work, and the network port at home).
Well, it did not temporarily disable it; it deleted it from the system. I have managed to get it to see the HDMI TV at 1080p resolution, but sound doesn't work on the TV since I clicked Disable.
I have tried every setting in the Control Panel, such as sound devices, graphics, and displays, and tried "find a device", all to no avail. How do I enable the sound for the Samsung?
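As far as I understand, clicking Disable only hides the endpoint from the Sound panel (until you right-click the device list and tick "Show Disabled Devices"); it should not actually delete it. Here is a rough Core Audio sketch, just to confirm that, which lists every render endpoint Windows still knows about, including disabled ones. The file name and build line are only examples, and error handling is left out for brevity.
[code]
// list_audio.cpp -- sketch: list all audio render endpoints, including disabled
// ones, so you can check that the Samsung HDMI output is still present.
// Build (for example): cl list_audio.cpp ole32.lib
#include <windows.h>
#include <initguid.h>   // so PKEY_Device_FriendlyName is defined in this file
#include <mmdeviceapi.h>
#include <functiondiscoverykeys_devpkey.h>
#include <cstdio>

int main()
{
    CoInitialize(NULL);

    IMMDeviceEnumerator *enumerator = NULL;
    CoCreateInstance(__uuidof(MMDeviceEnumerator), NULL, CLSCTX_ALL,
                     __uuidof(IMMDeviceEnumerator), (void **)&enumerator);

    IMMDeviceCollection *devices = NULL;
    // DEVICE_STATEMASK_ALL includes active, disabled, unplugged and not-present endpoints.
    enumerator->EnumAudioEndpoints(eRender, DEVICE_STATEMASK_ALL, &devices);

    UINT count = 0;
    devices->GetCount(&count);
    for (UINT i = 0; i < count; ++i)
    {
        IMMDevice *device = NULL;
        devices->Item(i, &device);

        DWORD state = 0;
        device->GetState(&state);

        IPropertyStore *props = NULL;
        device->OpenPropertyStore(STGM_READ, &props);
        PROPVARIANT name;
        PropVariantInit(&name);
        props->GetValue(PKEY_Device_FriendlyName, &name);

        std::wprintf(L"%ls  [state 0x%lx%ls]\n", name.pwszVal, state,
                     (state & DEVICE_STATE_DISABLED) ? L" - DISABLED" : L"");

        PropVariantClear(&name);
        props->Release();
        device->Release();
    }

    devices->Release();
    enumerator->Release();
    CoUninitialize();
    return 0;
}
[/code]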
I am connecting a P2450H monitor via an HDMI cable and getting a very poor, fuzzy, flickering picture on the monitor. The laptop is an HP 4525s with HDMI output. The VGA output into the monitor is crisp and clean. I have tried all the resolution and colour options but cannot resolve the issue. I have the latest display driver installed.
I've recently purchased a Samsung T23A750 monitor, and to my disappointment, when I try to watch a movie the built-in speakers are not functioning. I have a Sony Vaio Z216GX connected via HDMI to the monitor's HDMI-DVI port. I can definitely see the video, but no audio plays from the speakers. The built-in speakers are not at fault: I ran the monitor's self-diagnosis and the speakers seem to work fine. I also set the monitor as the default audio device, and even though Windows 7 says it's functioning properly, the speakers still don't work. Conversely, setting the default to my PC speakers works fine. I'm sure it's not the HDMI cable; I used the same one to hook my PC to my TV and everything worked fine.
I am having an issue where HDMI works during PC boot, but the TV loses signal just before the Windows 7 login screen comes up (the TV shows "No Signal" at this point). If I use VGA to connect the PC to the TV, everything works fine. One thing I noticed is that the Windows 7 Device Manager sees my TV as a "Generic PnP Monitor"; I have no idea if this is connected to my issue or not. The Intel website says my drivers are up to date when I run their update utility. The Intel Graphics utility on my PC sees my TV as a monitor, not as a TV, but it does list the correct model number (M260MV). I've tried connecting both the TV with HDMI and a PC monitor with the VGA cable, and the computer would only detect the PC monitor on the VGA cable. I have read on an Intel site that others with the Core i3 CPU had similar problems, so this may be a CPU/CPU driver issue, but as I said, the Intel Update Utility says my drivers are up to date.
I just bought a Netgear WNA3100 wireless USB adapter for my HP Windows 7 64-bit laptop. I have been trying for two and a half days to figure out why my speed is slow, why I don't have a good signal, and, when I do get a bit of a signal, why it comes and goes. I don't have internet at home at the moment because I am trying to save up to go to school. I am picking up a public business signal from a block away (which I have asked for permission to use), but I am only receiving a weak signal with the Netgear USB wireless adapter.
I recently built a computer for the first time. It was running fine whilst playing Arma 2, DayZ, and Dawn of War 2. I then installed Star Wars: The Old Republic and everything went fine until I went to play the game: the screen went black and the "no signal" message came up on my TV.
I could not do anything, so I just turned my computer off and then back on again. Now, when I boot in normal mode, it just shows the Windows logo and then does the same thing.
I have tried booting in safe mode, and that works fine. I don't think it has anything to do with my monitor, as it works in safe mode, and it's my TV, which works fine with DVDs and regular TV.
Could it be something to do with the drivers, even though I have been playing other games fine until now? Or could it be something like a corrupt file, or did the game install something else that won't work?
Someone also suggested that it might have something to do with the power going into the computer; they said too much power could be going into it and making it cut out.
I just recently built a new gaming rig and am running an SSD as the boot drive.
Everything turns out just fine, except that after the Windows logo appears, I can hear a little crackle from my headphones, then the monitor loses signal and my G15 keyboard's LCD turns off. A few moments later, my monitor and keyboard come back to normal and I am at the login screen. It is not a big deal, but I am just puzzled as to why it does this.
It does slow down my boot by a lot, especially for an SSD. Does anyone know how I can speed this up? Someone told me to disable USB 3.0 to speed things up. Are there any problems with ASRock motherboard drivers that could affect this?
For my new PC build I had been using a VGA cable until I got an HDMI cable. Now, with HDMI connected, when I turn the computer on it shows the screen until Windows has finished loading (when the login screen appears), and then it says no signal.
This is on a Meldon E6882 laptop, running Windows 7, connected to an HD LED TV. From a cold boot, both the laptop monitor and the HDMI screen show the Windows starting screen, but the HDMI screen goes blank when the login screen appears. I have tried different screen resolutions: no luck. I have tried, from the 'Screen Resolution' page, to automatically detect the HDMI: no luck. I have tried from the 'Screen Resolution' page to manually select HDMI, but it only offers VGA or component, no HDMI.
I think this may be a disabled driver, as I manually disabled some of them a while ago to try to reduce the boot time. I cannot recall the names of the drivers I set to manual start, nor have I found any info on which drivers are needed for HDMI. The display driver is Intel(R) HD Graphics 3000. I'd like to watch some video on the larger screen...
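If it helps anyone point me in the right direction, here is a rough sketch of a small program that lists which video outputs the graphics driver is exposing and whether each one has the desktop attached. My assumption is that if the HDMI output never shows up in this list, it is being hidden at the driver level rather than by a setting on the 'Screen Resolution' page. The file name and build line are only examples.
[code]
// list_outputs.cpp -- sketch: list the video outputs the graphics driver exposes
// and whether each currently has a desktop attached.
// Build (for example): cl list_outputs.cpp user32.lib
#include <windows.h>
#include <cstdio>

int main()
{
    DISPLAY_DEVICE dd = {};
    dd.cb = sizeof(dd);

    for (DWORD i = 0; EnumDisplayDevices(NULL, i, &dd, 0); ++i)
    {
        std::printf("%-16s %-40s %s\n",
                    dd.DeviceName, dd.DeviceString,
                    (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
                        ? "(attached to desktop)" : "(not attached)");
        dd.cb = sizeof(dd);  // reset before the next call
    }
    return 0;
}
[/code]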
I'm trying to hook up my TV to my computer, and for the most part it's working. The video goes through and I've figured out how to control it and everything. The only thing not working is the sound.
I'm using an HDMI to HDMI cable, and like I said the video is fine. Also, when the cable is plugged into my boyfriend's laptop, the sound comes out of the TV.
So this must mean the problem is with my computer specifically. This computer is only a month old, though, so I wouldn't think anything is out of date.
Referring to this post: [URL]. Just to make sure I'm understanding this correctly, Paragon should work on your computer while this issue is going on? I installed a new Samsung 830 SSD in my ASUS G74SX last night using Norton Ghost, and the clone worked as far as I can tell. I get the same blue screen, and when I manually start Explorer, the drive letter for the SSD is E: rather than C:. I think fixing that would remedy the issue, but it won't let me change the drive path while the OS is in use.