Graphic Cards :: ASUS S550C Running Windows 8 - HDMI Output Not Working
Oct 9, 2013
I have an ASUS S550C running Windows 8, and the HDMI output to my 32" RCA flatscreen doesn't work anymore. The cable works fine with the Xbox, on the correct TV modes, but now as soon as I plug the HDMI into the Ultrabook, the Ultrabook's screen goes black and the TV screen says "unsupported". I have tried extended, 2nd screen only, and all screen settings, 720p/1080i etc.
So I recently bought a new computer. I was using the GTS 450 in the old one with no problem not even a month ago, and have attempted to install it in my new PC. I switched out to a Thermaltake TR2 600W PSU, which is operating just fine, placed the video card into the primary PCIe slot, and powered it with the 6-pin connector. I put everything back together and started her up. The PC turns on fine, the fan on the video card even spins, but there is no video output. I'm stuck. What can I do? The drivers won't install without the hardware being detected.
-Running my PC with a Radeon HD 7700 on Windows 8. The main display is a monitor on the DVI port; the secondary display is HDMI running to my LG TV for playing videos (only enabled some of the time, in AMD Catalyst Control Center under "Desktop Management > Creating and Arranging Desktops").
-Normally, I enable the 2nd desktop (my TV) through the HDMI output of my video card easily. However, about a week ago I ran into an issue: while the OS detected the display, it only recognized it as a monitor at 640x480 resolution.
-I tried several things to fix the display issue, and finally succeeded by moving the HDMI cable on the TV side to the second HDMI port on the back of the TV (that is, going from HDMI 1 to HDMI 2). This fixed the issue immediately.
-That was good until yesterday. Now HDMI 2 is no longer detected as a valid display. HDMI 1 (the originally connected port) is still detectable, but it still has the 640x480 max resolution issue (and no audio output either).
-HDMI 3 on the TV currently works, but that is the final HDMI port on the TV, and I don't want to keep moving the cable without resolving the issue. I have tested the TV's HDMI 1 and HDMI 2 ports with other hardware, and they work without fault with a DVD player or media box, so I doubt the issue is with the TV.
I finally worked out how to get full screen on my TV when connected via HDMI from the laptop, which is great. Only now I've found that when I shut the laptop screen, I lose the full screen on the telly and it sizes down to about 50% with a black border. In my laptop's Control Panel, under "what does the computer do when I shut the lid", I have selected "do nothing" for both plugged-in and battery modes. How can I close the lid and keep full screen? (The TV is a Celcus full HD smart something-or-other, and the laptop is an HP running Windows 8.1.)
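If the Control Panel setting isn't sticking, the same lid action can be forced from a script. A minimal sketch, assuming the documented powercfg aliases SUB_BUTTONS and LIDACTION (verify with `powercfg /aliases` on your machine), where index 0 means "do nothing":

```python
# Sketch: set the lid-close action to "Do nothing" on both AC and battery
# via powercfg, then re-apply the active power scheme. Assumes the standard
# SUB_BUTTONS / LIDACTION aliases; run from an elevated prompt.
import subprocess

for args in (
    ["/setacvalueindex", "SCHEME_CURRENT", "SUB_BUTTONS", "LIDACTION", "0"],
    ["/setdcvalueindex", "SCHEME_CURRENT", "SUB_BUTTONS", "LIDACTION", "0"],
    ["/setactive", "SCHEME_CURRENT"],  # re-apply so the change takes effect
):
    subprocess.check_call(["powercfg"] + args)
```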
When I hook my laptop up to the TV with an HDMI cable, my laptop screen goes dark and the TV only shows the taskbar and the desktop background. Sound works, but no mouse or actual programs are displayed. I would like to mirror my desktop to the TV. My laptop is an HP Pavilion dv7-4285dx running Windows 8 Pro 64-bit.
I have an AMD 5650 that shows a black bar about 1 inch wide on the left and right and about 0.5 inches on the bottom over HDMI. I've used the drivers that Microsoft provided and also beta drivers from AMD from February 2014; nothing seems to work. I've taken a picture and underlined in red where the black bars are.
Well, I have been having issues with my monitor. I had a Toshiba laptop hooked up to my monitor and everything worked great. Then it crashed, and I got a new Dell with Windows 8 64-bit and Intel graphics. I have an HDMI to VGA adapter because my laptop doesn't have a VGA port and the monitor doesn't have HDMI. So I hooked it up and got it to work, but "Input Not Supported" was floating around the screen.
So I changed the resolution. The only resolution that makes the "Input Not Supported" message go away is 1600x200. The picture is clear, but I have annoying black bars on the sides of my screen. I have looked around and everyone says to go to the CCC, but I can't find it on here, and from what I understand that is AMD-only (if I'm mistaken, how do I make it work? I have attempted it). So is there an Intel version, or how do I fix it?
My wife has a nice new VAIO laptop. We connect it to a Samsung TV with an HDMI cable. Initially there were video artifacts on the TV screen, but after some fiddling they seemed to go away. Today they have returned. I have tried another HDMI cable and it didn't help.
The artifacts are only on the TV screen and not on the laptop display. They cannot be captured with a screenshot (I just tried this and am posting from my computer).
The picture I found via a web search (HDMI.jpg) shows small vertical lines in a row after some of the text.
Last night I got hold of a 27" BenQ GL2750HM; so far so good. My main display runs from the DVI port on the graphics card to the HDMI port on the screen, and my secondary screen is my 60" TV, which is connected from the HDMI port on the graphics card into an HDMI port on my receiver.
The scheme for my setup is:
- Radeon 7970 DVI > HDMI: main monitor (BenQ GL2750)
- Radeon 7970 HDMI > HDMI: receiver (ONKYO TX-NR609)
- ONKYO TX-NR609 > 60" TV
So to the problem now. When the HDMI cable is connected to the graphics card, I need to have my TV on in order for my main monitor, the BenQ, to work; if I turn off the TV, the BenQ goes black, as if in standby mode, and reports "no signal".
This was never a problem with my other screen: I could have the HDMI cable connected to the graphics card with the TV off and the receiver off without the main monitor going black.
I've got a bit of a strange issue that I haven't had with previous OSes on this PC. I'm running a Dell OptiPlex 380 with Win 8 x64 at the office and I have to run 2 video cards for all of my screens. I have a GeForce 8400GS in the PCI-E slot and a GeForce 6200 in the PCI slot.
Prior to the Win 8 install, it worked fine and booted exactly as I had shut it off. Now, with 8, it's killing my PCI-E card, presumably when it puts the screens to sleep after I leave for the day. So I'm coming in to find all of my windows moved to the PCI card's monitor (only one is hooked to it), and I have to reboot to get the PCI-E card to come back.
If I check the screen properties, I see only the screen hooked to the PCI card. I've changed the BIOS setting for the first video card to Auto and have also tried the PCI-E selection; it's the same either way.
Why on earth would it default to the PCI card rather than the PCI-E card?
I successfully got Boot Camp to work on my iMac, and the only issue I have run across is that I do not have any sound whatsoever coming out of my speakers.
The sound icon just remains at the speaker with a red cross, and no matter what I do, I just can't get it to work.
In Device Manager, the sound device in use is 'AMD High Definition Audio Device', and I can't seem to find an update for it anywhere. Under the Sound section of Control Panel, the Playback tab says the only available playback device is 'AMD HDMI Output', and that it is "not plugged in".
I've bought an HDMI cable to connect my laptop to the TV. I've gone into Control Panel and checked that "show disabled devices" and "show disconnected devices" are both enabled, and I still don't seem to get an HDMI option for audio. The picture comes through fine, but nothing I do with the audio seems to work.
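A quick sanity check, as a minimal sketch (assuming Python and the stock WMIC tool are available): list the sound devices Windows actually enumerates. If no HDMI or "High Definition Audio" device from the GPU vendor appears at all, the missing audio option is a driver-level problem rather than a hidden playback device.

```python
# Sketch: list the sound devices Windows enumerates, by querying
# Win32_SoundDevice through WMIC. If the GPU's HDMI audio device is
# absent here, reinstalling the graphics driver (which bundles the
# HDMI audio component) is the usual next step.
import subprocess

print(subprocess.check_output(
    ["wmic", "sounddev", "get", "Name,Status"], text=True))
```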
After upgrading from Win7 (x64) to Windows 8.1 Professional N (x64), my system started to freeze randomly. It seems to be a driver-related issue, since the freezes do not occur when I disable the video card in Device Manager. I recently upgraded to the new beta driver (v14.1), which made it even worse.
I have already contacted ASUS support. They said the card is broken, which sounds strange to me, since it worked fine under Win7.
I'm currently running Windows 8.1 Update with an NVIDIA 660 Ti, two DVI monitors at 1920x1200, and a 1080p TV on the HDMI output. I'm having trouble with the 1080p TV: when I drag an application (Store or Win32) onto that screen and maximise it, the top and bottom are truncated. For example, if I drag Windows Media Player over to the TV and maximise it, the play, forward and rewind buttons, when visible, are cut in half at the bottom of the screen.
In various games, such as Dungeon Defenders and the newly released Unturned, I get drops in FPS for an almost fixed amount of time, 10-15 seconds. Drivers are up to date. I've run several types of malware scans, and DxDiag brings up no problems.
I go from 60 FPS or higher down to 8-20 FPS (depending on the game). Shortly after the drop, it goes right back up to 60 or higher, then does the same intermittent drop again a short time later. Temps do not exceed 60°C, if they even get that close.
It's really getting to me, as there's no apparent reason for this to be occurring. The Speccy screenshot below was taken while a game was running.
I'm trying to use an external Samsung TV/monitor (model LN-T2642H) through HDMI. The native resolution of the monitor/TV is 1360x768, which is also the native resolution of the ASUS T100.
If I try to use the monitor/TV as the only display, the device selected is Generic PnP Monitor, and the native resolution of 1360x768 is not available for selection. The Intel HD Graphics Control Panel only shows 1280x720 (selected) and 1920x1080 (too high for my monitor).
Is there a way to set up the 1360x768 resolution with the existing driver (which is up to date)? Is there another driver I can download that will allow this resolution? Can I use a different version of the Generic PnP Monitor driver that allows 1360x768?
At the only resolution available, i.e. 1280x720, the quality is bad. (One way to check which modes the driver actually exposes is sketched below.)
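One way to see exactly which modes the driver exposes (rather than what the settings dialog happens to list) is to enumerate them through the Win32 API. A minimal sketch using Python's ctypes against EnumDisplaySettingsW follows; if 1360x768 never appears in this list, the driver genuinely doesn't offer it and a custom-resolution or EDID override would be needed. The DEVMODEW layout here is deliberately truncated to an older, still-valid version of the structure (dmSize tells the API how much was allocated).

```python
# Sketch: list every display mode the active display driver exposes,
# via the Win32 EnumDisplaySettingsW API (standard library only).
import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    # Truncated after dmDisplayFrequency; dmSize tells the API our length.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmOrientation", ctypes.c_short),
        ("dmPaperSize", ctypes.c_short),
        ("dmPaperLength", ctypes.c_short),
        ("dmPaperWidth", ctypes.c_short),
        ("dmScale", ctypes.c_short),
        ("dmCopies", ctypes.c_short),
        ("dmDefaultSource", ctypes.c_short),
        ("dmPrintQuality", ctypes.c_short),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32
mode = DEVMODEW()
mode.dmSize = ctypes.sizeof(DEVMODEW)

seen = set()
i = 0
# NULL device name = the current display device
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(mode)):
    m = (mode.dmPelsWidth, mode.dmPelsHeight, mode.dmDisplayFrequency)
    if m not in seen:
        seen.add(m)
        print("%dx%d @ %d Hz" % m)
    i += 1
```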
I have 3 BenQ RL2450 monitors, two of them connected via DVI and one via HDMI.
The issue: the HDMI-connected monitor goes black every now and then, with no real pattern to it. Sometimes it flickers black once every 5 minutes, sometimes once every 30 minutes. The monitors are OK (checked with other computers) and the cables are OK (tested on different monitors and on the TV's HDMI input).
I suspect drivers, so I did a complete uninstall with a graphics driver removal tool, then installed the latest WHQL and beta drivers; still the same. I've seen several users with the same issue, but no solution.
The graphics card stays at 40-50°C while gaming, so that's not the issue.
I can't remember exactly when the issue started, but it hasn't been there forever.
All monitors run at 60Hz and at the same resolution.
I have been getting these errors in Event Viewer intermittently, every 1-2 weeks, since upgrading to Windows 8.1. I have an HP m6-1158 (specifications linked).
I had to get rid of my last computer because of constant BSODs caused by changing out the graphics drivers so often. That's why I'm reluctant to do it again. A friend advised me to update them only if there's a problem and otherwise leave them alone; if I do update them, he said, it's best to go through the computer manufacturer. Last time, when my computer failed, I had installed drivers from AMD, so I have been avoiding their website and Intel's.
With this system, which I've had since May of last year, I haven't been able to update the integrated Intel HD Graphics 4000 with this Windows 8 driver or this Windows 8.1 driver. Each time I tried, I got an error that my computer does not meet the necessary specifications. Therefore I left it alone, and the only driver installs recorded in Device Manager for it are from when I bought the system in May 2013 and from the 8.1 update in October.
The Radeon driver is from 2012 and came with my computer. According to Device Manager, it was reconfigured or reinstalled for 8.1, but the driver version has not changed. I have not had any major issues with gaming performance; I tested Far Cry 3, which runs great.
I recently installed the game Papers. After constant crashes on startup, I contacted support, who advised me to set the AMD switchable graphics option for the game to High Performance. After that, it runs fine. That leads me to think there is an issue with the Intel driver.
I used to connect my Asus X72JR to my flatscreen TV at 1080p, and it always had a crisp, sharp image.
After replacing my hard drive and reinstalling Windows 7 (x64), the image is no longer displayed at the full size of my TV but smaller (black edges around the image). When I set my TV to full screen size, everything gets blurry (it just stretches the image).
I updated to the latest graphics driver and checked the screen resolution, which was still set at 1920x1080 (recommended). Everything was correct, but the screen is either blurry or displayed smaller than the TV screen.
My Asus has a Mobility Radeon HD 5470 graphics card.
I have a second laptop; when I connect it to the TV, it has a sharp image again, even though the settings are exactly the same.
I recently broke the screen of my Asus N56VZ. I ended up removing the whole top panel and disconnecting the inverter cable.
I plugged in an external monitor and it worked fine at first. When I restarted, I noticed the laptop started running hotter and Windows displayed a second option for a 768p monitor. My 1080p AOC was made the main display even though it's not the first monitor but the second.
I tried to delete or disable the second monitor, but I could not. What should I do?
I used to have 33°C idle temps and now it's 50°C. It used to display only the external monitor; now it displays two, and display 2 is my main.
I don't have any recovery or 8.1 disks... I tried a refresh and it said "missing files", so I can't even do that...
I am on my work computer, a Dell desktop with Windows 8.1, and I've been using a dual-monitor setup for months, both with Windows 8 and 8.1. One monitor is connected via the VGA input, and the other through a VGA to HDMI adapter. Again, they've been working flawlessly for months. It was a plug 'n' play setup that took a minute and never had an issue.
I came in today and the second screen was blank. The screen setup menu recognized it and allowed me to adjust between extending, mirroring, etc., but after I unplugged it and plugged it back in, it no longer recognizes it. It simply cannot find a second monitor plugged in. Both screens work when plugged directly into the VGA port separately, but not when connected the way they've worked for months.
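One thing worth trying before swapping hardware, as a sketch (assuming the stock DisplaySwitch.exe tool that backs the Win+P menu): force Windows to re-apply the extend topology, which can re-trigger detection of a display sitting behind an adapter.

```python
# Sketch: re-apply the "extend" display topology using Windows' own
# DisplaySwitch.exe (the tool behind Win+P; it also accepts /internal,
# /clone and /external).
import subprocess

subprocess.check_call(["DisplaySwitch.exe", "/extend"])
```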
I had opened up GeForce Experience and automatically updated my graphics drivers. The updater crashed while updating. I tried to update again and it gave me a message saying my driver was up to date, no need to update. I opened up Steam and tried to play a few games, and they all crashed. I also noticed games outside Steam were very sluggish. I checked Device Manager and only my onboard video card was shown; my high-performance card was no longer listed under display adapters.
So I uninstalled GeForce Experience and the driver, then reinstalled an earlier driver version (I figured I could just install the previous version and it would 'roll back'). Same problem: my games would crash and things were slow. I removed those drivers and reinstalled earlier ones. Rinse and repeat, removing and installing different driver versions 3 times. After being advised not to use GeForce Experience to update my drivers and to go to the manufacturer's website instead, I did so: I removed my current video driver and installed the one I downloaded from the manufacturer.
Throughout this whole episode, certain drivers make my video card not appear in Device Manager and require me to install a different driver to get the card to show up. Everything worked 100% fine until GeForce Experience crashed on me.
Currently, my video card is shown in Device Manager, the driver seems to be installed, and my games do in fact work. The problem is that the FPS is extremely low: before this issue I was getting around 240 FPS in certain Steam games where I'm now getting 80. Before I updated the graphics driver, my entire laptop worked extremely well and I had yet to have any issues. With that being said, I believe that if I could completely uninstall the graphics driver and reinstall the one I had before all this mess occurred, things would be back to normal.
How do I figure out what version I used to have? I'm also not sure I'm uninstalling the driver completely (in Device Manager I'm just selecting Uninstall). I'm willing to roll back my laptop a few days, but I do not have a restore point created. Maybe the driver is not installed correctly? (One way to check the installed version is sketched after the specs below.)
Perhaps there's a different way to uninstall the driver, or a way to freshly reinstall the video card; I don't know. If there's any simple way to restore my laptop to the way it was when I got it, I'm willing to do that as a last resort. Everything I have is backed up.
System specs: OS: Win 8.1; CPU: i7-4700MQ; GPU: GeForce GTX 780M; RAM: 16GB; Make/Model: Dell Alienware 17.
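As for working out the driver version currently in use, Windows itself can report it; a minimal sketch, assuming only stock tools (WMIC ships with Windows 8.1), is below. Previous versions are typically recorded in C:\Windows\INF\setupapi.dev.log, which logs each driver installation with a timestamp.

```python
# Sketch: print each video controller's name, driver version, and driver
# date, by querying Win32_VideoController through WMIC. Earlier versions
# can often be traced in C:\Windows\INF\setupapi.dev.log.
import subprocess

print(subprocess.check_output(
    ["wmic", "path", "win32_videocontroller",
     "get", "Name,DriverVersion,DriverDate"],
    text=True))
```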
I am experiencing some minor issues with my graphics card. Let me say first that this is an old graphics card, which did however keep working after the update to Windows 8. Today, when I updated the computer to 8.1, I came across an "Outdated Driver" error.
On this computer, I have an ATI (AMD) Radeon HD 2400 graphics card. I did run DxDiag, and the output is available here.
I recently updated my Dell Inspiron 15R, with a built-in ATI Radeon HD 8730M, to Windows 8.1. Unfortunately, after the update only the Intel HD card worked. I re-installed the drivers from the Dell website, but they did little to solve the problem except fix CCC. Games such as Assassin's Creed Revelations, which used to run at 60fps with maxed-out settings, now stutter at medium to low. I bought this laptop because of its dedicated card, and it is a shame it is not working. I have the Windows 8 re-install disk supplied by Dell, but I would like to fix this problem rather than shy away from it.