Graphic Cards :: HDMI Cable Not Displaying All Of Screen On TV On AMD Card
Mar 25, 2014
I have an AMD 5650 that shows a black bar about 1 inch wide on the left and right and about 0.5 inches on the bottom over HDMI. I've used the drivers that Microsoft provided and also beta drivers from AMD from February 2014; nothing seems to work. I've taken a picture and underlined in red where the black bars are.
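Bars like these are consistent with the driver applying HDMI underscan, where the GPU shrinks the frame to guard against TV overscan. As a rough illustration (the percentage here is a hypothetical example, not taken from the post), this is how underscan reduces the visible picture:

```python
def underscanned_resolution(width, height, underscan_pct):
    """Visible picture size after the GPU shrinks the frame by
    `underscan_pct` percent on each axis (HDMI underscan)."""
    scale = 1 - underscan_pct / 100
    return round(width * scale), round(height * scale)

# A 1920x1080 desktop with a 10% underscan setting leaves black
# bars around a 1728x972 picture:
print(underscanned_resolution(1920, 1080, 10))  # (1728, 972)
```

The usual fix for bars like this is dragging the underscan/overscan scaling slider in Catalyst Control Center to 0%, so the frame fills the panel again.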
I finally worked out how to get full screen on my TV when connected via HDMI from my laptop, which is great. Only now I've found that when I shut the laptop lid I lose the full screen on the TV and it sizes down to about 50% with a black border. In my laptop's control panel, under "what does the computer do when I shut the lid", I have selected "do nothing" for both plugged-in and battery modes. How can I close the lid and keep full screen? (The TV is a Celcus full HD smart something or other, and the laptop is an HP running Windows 8.1.)
My wife has a nice new VAIO laptop. We connect it to a Samsung TV with an HDMI cable. Initially there were video artifacts on the TV screen, but after some fiddling they seemed to go away. Today they have returned. I have tried another HDMI cable and it didn't work.
The artifacts are only on the TV screen and not on the laptop display. They cannot be captured with a screenshot (I just tried this and am posting from my computer).
The web search picture (HDMI.jpg) shows small vertical lines in a row after some of the text.
Basically my screen's kind of broken. Two flies kept passing around me and I tried swatting them away with my hand; when I did, my earphones caught my hand and one of the buds hit my screen, and this happened. I took a picture of it to show what it looks like.
Display Adapter: Intel(R) HD Graphics (2000) Driver: Intel 126.96.36.19940
Playback Devices:
- Show Disabled Devices - checked
- Show Disconnected Devices - checked
High Def Audio Device displays as not plugged in
- Cable is plugged in
- Cable has been tested on backup laptop
- Cable connected to HDTV
- Video is fine
- No audio
I have spent hours installing and uninstalling drivers from the Toshiba and Intel sites. Tech support said to return it, but I have just spent two days installing software. I never took a break to watch Netflix.
How can I display my Windows 8 session on a TV using SmartGlass?
As I understand it, SmartGlass is a competitor to Apple's AirPlay. With AirPlay, I can sync with a television and show the display from a Mac or iPad on a big screen wirelessly. I want to do the same thing: sit on my couch with a laptop, but display the Windows desktop on the TV screen.
I recently broke the screen on my ASUS N56VZ. I ended up removing the whole top panel and disconnecting the inverter cord.
I plugged in an external monitor and it worked fine at first. When I restarted, I noticed that it started running hotter and displayed a second option on a 768p monitor. My 1080p AOC was made the main display even though it's not the first monitor but the second.
I tried to delete or disable the second monitor, but I could not. What should I do?
I used to have 33C idle temps and now it's 50C. It used to display only on the external monitor. Now it shows two displays, and the second one is my main.
I don't have any recovery or 8.1 disks... I tried a refresh and it said "missing files", so I can't even do that...
When I hook my laptop up to the TV with an HDMI cable, my laptop screen goes dark and the TV only shows the taskbar and the desktop background. Sound works; no mouse or actual programs are displayed. I would like to mirror my desktop to the TV. My laptop is an HP Pavilion dv7-4285dx running Windows 8 Pro 64-bit.
Well, I have been having issues with my monitor. I had a Toshiba laptop hooked up to it and everything worked great. It then crashed, and I got a new Dell with Windows 8 64-bit and Intel graphics. I have an HDMI-to-VGA adapter because my laptop doesn't have a VGA port and the monitor doesn't have HDMI. So I hooked it up and got it to work, but "Input Not Supported" was floating around the screen.
So I changed the resolution. The only resolution that makes the "Input Not Supported" message go away is 1600X200. It's a clear picture, but I have annoying black bars on the sides of my screen. I have looked around, and everyone says to go to the CCC, but I can't find it on here, and from what I understand that is for AMD only (if I'm mistaken, how do I make it work? I have attempted it). So is there an Intel version, or how else do I fix this?
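For what it's worth, "Input Not Supported" means the monitor is rejecting the timing the adapter sends, so the practical fix is picking a mode the monitor actually advertises. A hypothetical sketch of that selection logic (the mode list here is made up for illustration, not read from this monitor's EDID):

```python
# Hypothetical set of modes a VGA monitor might advertise via EDID.
SUPPORTED_MODES = {(1024, 768), (1280, 1024), (1600, 900), (1680, 1050)}

def best_supported_mode(requested, supported=SUPPORTED_MODES):
    """Return `requested` if the display accepts it; otherwise the
    largest supported mode that fits inside the requested one."""
    if requested in supported:
        return requested
    fits = [m for m in supported
            if m[0] <= requested[0] and m[1] <= requested[1]]
    return max(fits, key=lambda m: m[0] * m[1], default=None)

# Asking for 1920x1080 falls back to the biggest mode the panel takes:
print(best_supported_mode((1920, 1080)))  # (1680, 1050)
```

Any black bars left over after that come from the aspect-ratio difference between the chosen mode and the panel's native shape, which is a scaling setting rather than a timing problem.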
Last night I got hold of a 27" BenQ GL2750HM; so far so good. My main display runs from the DVI port on the graphics card to the HDMI port on the screen, and my secondary screen is my 60" TV, which is connected from the HDMI port on the graphics card into the HDMI port on my receiver.
The main scheme for my setup is like:
- Radeon 7970 DVI > HDMI Main monitor (BenQ GL2750)
- Radeon 7970 HDMI > HDMI Receiver (ONKYO TX-NR609)
- ONKYO TX-NR609 > 60" TV
So, to the problem. When the HDMI cable is connected to the graphics card, I need to have my TV ON in order for my main monitor, the BenQ, to work. If I turn off the TV, the BenQ goes black, like a standby mode, with "no signal" on the monitor.
This was never a problem with my other screen: I could have the HDMI cable connected to the graphics card with the TV off and the receiver OFF without the main monitor going black.
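A plausible explanation (an assumption, not confirmed in the post) is HDMI hot-plug detection: when the TV or receiver powers off, it stops asserting the hot-plug detect line, the driver re-enumerates its outputs, and the resulting mode reset can knock out the other monitor too. A toy model of that re-enumeration:

```python
def active_outputs(sinks):
    """Toy model of a driver rebuilding its output list on an HDMI
    hot-plug event: sinks whose hot-plug detect (HPD) line went low
    (TV/receiver powered off) are dropped, and every surviving output
    goes through a mode reset, seen by the user as a blink or blackout.
    `sinks` maps output name -> HPD asserted (True/False)."""
    return [name for name, hpd in sinks.items() if hpd]

# TV/receiver chain turned off: only the DVI monitor should survive,
# but the topology change forces a reset on it as well.
print(active_outputs({"DVI (BenQ)": True, "HDMI (Onkyo->TV)": False}))
```

This is why leaving the receiver in a pass-through/standby mode that keeps HPD asserted often avoids the problem, while fully powering the chain off does not.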
I have an ASUS S550C running Windows 8, and HDMI to my 32" RCA flatscreen doesn't work anymore. The cable works fine with the Xbox on the correct TV modes, but now, as soon as I plug the HDMI into the Ultrabook, the Ultrabook's screen goes black and the TV screen says "unsupported". I have tried extended, 2nd screen only, and all the screen settings, 720p/1080i, etc.
I recently updated my Dell Inspiron 15R with an inbuilt ATI Radeon HD 8730M to Windows 8.1. Unfortunately, after the update only the Intel HD card worked. I re-installed the drivers from the Dell website, but they did little to solve the problem except fix CCC. Games such as Assassin's Creed Revelations, which used to run at 60fps at maxed-out settings, now stutter at medium to low. I bought this laptop because of its dedicated card, and it is a shame it is not working. I have the Windows 8 re-install disk supplied by Dell, but I would like to fix this problem rather than shy away from it.
I was running Windows 8 and installed Windows 8.1, and Device Manager told me Windows had shut down my video card (560 Ti) because of a problem and begun running default drivers. I tried everything from removing and reinstalling drivers to reseating the video card, changing slots, etc.
As soon as I reverted back to Windows 8, EVERYTHING ran as fine as before.
AMD FX-9370, 32GB G.Skill, ASUS M5A99X EVO, Windows 8
I have Windows 8 on a Dell Latitude D620. Everything works fine, except I don't know what my graphics card is. I have tried everything to find out, including taking the back off the laptop and looking at what I think is the graphics card, but I can't tell; it says "Hylinx, Made in Korea" on it. I need to update my graphics card drivers in order to play a game that I have, and I need a way to download the right driver from the maker's website, but I don't know who the maker is.
The problem I am having is that my Windows 8 x64 has tons of artifacts and the mouse leaves trails, and that's if it even lets me into Windows. However, when I use the same setup on my Windows 8 x86 there are no artifacts, and the same goes for Windows 7 Ultimate x64. I am aware that there are still some known issues with Nvidia chips on x64 versions of Windows 8.
I have tried several different things and nothing has worked so far...
My graphics card, an Nvidia GT 650, is not ready for UEFI installation, and I have no way to flash its BIOS. If I boot using legacy BIOS and install the card, can I then flash the card's BIOS and return to UEFI booting?
The card is currently not installed, and I have no other system to use to flash the card's BIOS.
So I recently bought a new computer. I was using the GTS 450 in the old one with no problem, not even a month ago. I have attempted to install it in my new PC. I switched to a Thermaltake TR2 600W PSU, which is operating just fine. I placed the video card into the primary PCIe slot and powered it with the 6-pin connector. I put everything back together and started her up. The PC turns on fine, the fan on the video card even spins, but there is no video output. I'm stuck. What can I do? The drivers won't install themselves without the hardware.
As the title says, I've tried connecting my TV to my computer with an HDMI cable, and the video part works, but no sound plays. When I checked the list of playback devices, the TV appeared (as shown below), but it apparently cannot play sound.
I don't think I know anything beyond what I've posted.
Also, as a side note: I am completely unable to interact with any of these devices. I cannot set one as the default device; there is no response when I try, and disabling the audio device only works until current sources of sound (like videos) are restarted.
I sometimes plug my LCD TV into my laptop via HDMI cable and use it as a second monitor. The sound used to go to the TV as well. However, I inadvertently deleted the sound "profile" for the external monitor and I can't get it back. (I've attached a screenshot showing where the other profiles are, with the external monitor one missing.)
How do I restore it so that when I plug in the HDMI cable the sound will go to the external monitor?
I am on my work computer, a DELL desktop with Windows 8.1, and I've been using a dual monitor setup for months, both with Windows 8 and 8.1. One monitor is connected via the VGA input, and the other is through a VGA to HDMI adapter. Again, they've been working flawlessly for months. It was a plug 'n play setup that took a minute and never had an issue.
I came in today and the second screen was blank. In the screen setup menu it recognized it and allowed me to adjust between extending, mirroring, etc. But after I unplugged it and plugged it back in, it no longer recognizes it. It simply cannot find a second monitor plugged in. Both screens work when plugged directly into the VGA port separately but not when plugged in the way they've worked for months.
I just purchased a new Windows 8 PC. I was attempting to hook the computer up to my TV through HDMI (the PC has no VGA output), and when I went through the normal paces to put the display on my TV ("display desktop only on 2"), the signal did not transfer and there is no display on either my TV or my PC! How do I revert the display back to my PC monitor without being able to see anything? Acer support was less than useful. It's an Acer Aspire Z3-605 all-in-one PC.
I have a Gateway NE56R41u. With that aside, today I was playing The Sims 3 when it gave a warning about my graphics card or whatever, so I went to my screen resolution settings to see if I could fix the problem. Unfortunately, the place where I choose my resolution is greyed out, so I cannot change a thing. Here's a picture .....
I installed Windows 8 on my PC and it doesn't use the whole screen. There's a margin all the way around the screen. I have a 23" 1080p display and an AMD Radeon 7450 graphics card. Also, OpenGL is not working at all, for anything.
I have an HP Media Center m7360n desktop with a Pentium D dual core, 4GB RAM and a GeForce 6200 SE card running 188.8.131.524 drivers. I get this screen mess sometimes when on the web and on the desktop.