Graphic Cards :: Monitor With Black Bars - HDMI To VGA Adapter
Sep 5, 2013
Well, I have been having issues with my monitor. I had a Toshiba laptop hooked up to my monitor, and everything worked great. Then it crashed, and I got a new Dell with Windows 8 64-bit and Intel graphics. I have an HDMI to VGA adapter because my laptop doesn't have a VGA port and the monitor doesn't have HDMI. So I hooked it up and got it to work, but "Input Not Supported" was floating around the screen.
So I changed the resolution. The only resolution that makes the "Input Not Supported" message go away is 1600X200. The picture is clear, but I have annoying black bars on the sides of my screen. I have looked around and everyone says to go to the CCC, but I can't find it on here, and from what I understand that is for AMD only (if I'm mistaken, how do I make it work? I have attempted it). So is there an Intel version, or how else do I fix this?
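For what it's worth, Catalyst Control Center only ships with AMD cards; on Intel graphics the equivalent scaling options live in the Intel HD Graphics Control Panel (right-click the desktop > Graphics Properties). A quick way to confirm which adapter and driver are actually active is to query WMI; here's a minimal sketch, assuming Python is installed on the Windows machine:

```python
import subprocess

# List every video controller Windows knows about, with driver version
# and current mode. An Intel entry means CCC (AMD-only) won't be present;
# look for the Intel HD Graphics Control Panel instead.
out = subprocess.check_output(
    ["wmic", "path", "Win32_VideoController",
     "get", "Name,DriverVersion,VideoModeDescription"],
).decode(errors="ignore")
print(out)
```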
I have 3 monitors (BenQ RL2450): two of them connected via DVI and one via HDMI.
The issue: the HDMI-connected monitor goes black every now and then, with no real pattern to it. Sometimes it flickers black once every 5 minutes, sometimes once every 30 minutes. The monitors are OK (checked with other computers), and the cables are OK (tested on different monitors and on the HDMI input of a TV).
I suspect the drivers, so I did a complete uninstall with a driver removal tool and then installed the latest WHQL and beta drivers; still the same. I have seen several users with the same issue, but no solution.
The graphics card stays at 40-50°C while gaming, so that's not the issue.
I can't remember exactly when this issue started, but it hasn't been there forever.
All monitors run at 60Hz and at the same resolution.
I installed Windows 8 on my new PC, which includes a Sapphire 7870 XT Boost and a Samsung T220 monitor. Whether I turn the monitor off manually or it turns off automatically via the power saver, sometimes all the monitor shows afterwards is a black screen. The monitor does turn on (its light comes on), but all I get is a black screen.
I also have an i5 4570 and a Gigabyte GA-B85M-D3H. After I installed Windows, I installed the motherboard drivers, which include Intel HD Graphics 4600. Could it conflict with the ATI driver?
Also, when I connect the monitor to the motherboard it works fine (so far at least), but games don't run smoothly.
I finally worked out how to get full screen on my TV when connected via HDMI from my laptop, which is great. Only now I've found that when I shut the laptop lid I lose the full screen on the telly, and it sizes down to about 50% with a black border. In my laptop's control panel, under "What does the computer do when I shut the lid", I have selected "Do nothing" for both plugged-in and battery modes. How can I close the lid and keep full screen? (The TV is a Celcus Full HD smart something-or-other, and the laptop is an HP running Windows 8.1.)
This doesn't work. I am trying to add it as a second card so I can add an additional monitor.
Device Manager shows Microsoft Basic Display Adapter with a warning triangle; in the properties it says Windows cannot load the drivers for this device.
I tried that Windows 8/7/Vista CCC tool; it doesn't detect any of my cards. My other card is an HD 4400.
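If it helps to pin down why Windows fell back to the Basic Display Adapter, the error code Device Manager shows can also be read from WMI; the "Windows cannot load the drivers for this device" message corresponds to ConfigManagerErrorCode 31. A rough sketch, assuming Python on the affected machine:

```python
import subprocess

# Win32_VideoController exposes the same status Device Manager shows.
# ConfigManagerErrorCode 0 = working; 31 = "Windows cannot load the
# drivers required for this device" (the warning-triangle case above).
out = subprocess.check_output(
    ["wmic", "path", "Win32_VideoController",
     "get", "Name,Status,ConfigManagerErrorCode"],
).decode(errors="ignore")
print(out)
```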
When I hook my laptop up to the TV with an HDMI cable, my laptop screen goes dark and the TV only shows the taskbar and the desktop background. Sound works. No mouse or actual programs are displayed. I would like the TV to mirror my desktop. My laptop is an HP Pavilion dv7-4285dx running Windows 8 Pro 64-bit.
I have an AMD 5650 that shows black bars, about 1 inch on the left and right and about 0.5 inches on the bottom, over HDMI. I've used the drivers that Microsoft provided and also beta drivers from AMD from February 2014; nothing seems to work. I've taken a picture and underlined in red where the black bars are.
My wife has a nice new VAIO laptop. We connect it to a Samsung TV with an HDMI cable. Initially there were video artifacts on the TV screen, but after some fiddling they seemed to go away. Today they have returned. I have tried another HDMI cable, and it didn't help.
The artifacts appear only on the TV screen, not on the laptop display, and they cannot be captured with a screenshot (I just tried this; I am posting from my own computer).
The attached picture (HDMI.jpg) shows small vertical lines in a row after some of the text.
Last night I got hold of a 27" BenQ GL2750HM; so far so good. My main display runs from the DVI port on the graphics card to the HDMI port on the screen, and my secondary screen is my 60" TV, which is connected from the HDMI port on the graphics card to the HDMI port on my receiver.
The scheme for my setup is:
- Radeon 7970 DVI > HDMI main monitor (BenQ GL2750)
- Radeon 7970 HDMI > HDMI receiver (ONKYO TX-NR609)
- ONKYO TX-NR609 > 60" TV
So, to the problem. When the HDMI cable is connected to the graphics card, I need to have my TV ON for my main monitor, the BenQ, to work. If I turn off the TV, the BenQ goes black, like a standby mode; there is "no signal" to the monitor.
This was never a problem with my other screen: I could have the HDMI cable connected to the graphics card with the TV and the receiver both OFF without the main monitor going black.
When I try to change my video adapter, the only option I get is Microsoft Basic Display Adapter.
I have an NVIDIA GeForce Go 7400.
Specs:
Operating System: Windows 8 Pro 32-bit
System Manufacturer: Dell Inc.
System Model: Latitude D820
BIOS: Phoenix ROM BIOS PLUS Version 1.10 A10
Processor: Intel(R) Core(TM)2 CPU T7400 @ 2.16GHz (2 CPUs), ~2.2GHz
I have an ASUS S550C running Windows 8, and HDMI to my 32" RCA flatscreen doesn't work anymore. The cable works fine with the Xbox, on the correct TV modes, but now as soon as I plug the HDMI into the Ultrabook, the Ultrabook's screen goes black and the TV says "unsupported". I have tried Extend, Second screen only, and all the screen settings, 720p/1080i, etc.
Is there any way I can output the content of my Samsung Galaxy Note 2 to my Dell Inspiron 7520 laptop via the HDMI or VGA socket? From my research, I found that only Alienware laptops have HDMI input.
Last night my laptop (running Windows 8 x64 with a secondary monitor) was working fine. I put it to sleep, and when it woke up this morning the secondary monitor was flickering a lot.
I have tried uninstalling the AMD driver, restarting, and then installing the latest version, but it still doesn't work. The screen even flickered when the AMD driver was not present. I connected the monitor to another laptop and it worked fine. I played about with the refresh rate in the Catalyst Control Center, and it ended up making the laptop screen flicker instead, so I reverted the settings.
I am running Windows 8 on a Samsung 350V5C laptop with 6GB RAM.
I'm struggling to connect to a second monitor. My laptop connects OK, VGA to VGA, but doesn't stay connected: after 5 seconds I get the disconnect 'beep beep', then it immediately reconnects, and after another 5 seconds it disconnects again. I did get it to stay connected the other day, so I know it works, but that was pure luck and trial and error; when I went to connect again, I got the same problem. Also, when it did stay connected, I only got 'Duplicate' when what I want is 'Second screen only'.
Here's what I did: Control Panel > Hardware & Sound > Display > Project to a second screen. I have tried all 4 options (PC screen only / Duplicate / Extend / Second screen only), and I get the disconnect-then-reconnect every 5 seconds with each of them.
Do I have to change any other settings, on either the monitor or the laptop?
I have a 32" HD TV which I use as my primary monitor (connected using HDMI) and a 17" Lenovo as a secondary (connected with a VGA cable). I shut the desktop down and unplugged the 17" to use it on another PC. When I reconnect the 17" to my main desktop, it is no longer picked up.
I have seen some posts on the internet about connecting an external CRT monitor to a laptop. They pretty much said that there may be a blue cable included, and one can use it to connect the monitor to the laptop. Now I am confused, because my monitor has two cables: one is blue, and the other is white with three larger pins inside. Both of these are connected to my desktop PC. My question: it is obvious that the blue cable can be connected to my laptop, but what about the other one? It is connected to the power supply of my desktop PC. With a laptop, would the one blue cable alone suffice, or is the other one still needed for electricity?
My friend's new Sony laptop has an external monitor connected via HDMI. When he restarts the computer, about half the time the external monitor gets no signal; it is blank. He then restarts it until it works. I've only connected remotely to it, and I've been able to get it going by playing with the settings in Display properties. Possibly if I were in front of the computer I could try Fn + F5 to see if that works, but he is the kind of guy who does not want to press keys to make it work; he wants it to just work. What might be done to make the second monitor get a signal more consistently?
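One possibility: the topology that Win+P (or Fn+F5) applies can also be set programmatically through the SetDisplayConfig API, so it could run as a logon script instead of requiring key presses. A minimal sketch, assuming Python with ctypes on Windows 7 or later:

```python
import ctypes

# SetDisplayConfig topology flags (user32.dll, Windows 7+).
# These are the same four modes the Win+P flyout applies.
SDC_TOPOLOGY_EXTEND = 0x00000004
SDC_APPLY = 0x00000080

# Re-apply "Extend" with the last-used settings for that topology.
# Scheduling this at logon would avoid pressing keys by hand.
result = ctypes.windll.user32.SetDisplayConfig(
    0, None, 0, None, SDC_APPLY | SDC_TOPOLOGY_EXTEND
)
print("SetDisplayConfig returned", result)  # 0 means success
```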
I have three monitors set up in Windows 8: two on a Radeon 4670 and the third on the onboard HD Graphics 4000. They worked quite well in Windows 8, but when I upgraded to 8.1, the HD 4000 monitor no longer comes up. The adapter is recognized, and Device Manager says it is working and has the most recent driver.
This is important to my workflow and is just one of a number of things not to love about 8.1. I would like to get this functioning before I tackle less important issues. Windows 8.1 is installed on a test drive.
I am running the Windows 8.1 Preview, and whenever I play certain games my monitor locks up and my GPU fans stop, and the only way to recover is to restart my PC. I don't think my card is defective, because I was able to run the game once until it crashed. I am running 32-bit, if that makes any difference. And I don't believe it overheats, because I keep MSI Afterburner running the whole time. This also happened on Windows 7 32-bit.
I have a laptop with one VGA port, but I wanted to connect it to two extra monitors to create a triple-monitor system. I bought a VGA splitter to be able to connect all the monitors, but I discovered that Windows still recognizes them as a single second monitor. I have my laptop screen extended to the second one, but I can't extend the second one to the third; it just duplicates it. Is there any software I can download to make it recognize the third monitor?
I bought a new laptop without an OS: an XMG Advanced with an i7 4700MQ CPU, an Intel HD Graphics 4600 chip, and an NVIDIA GeForce GTX 765M.
I installed Windows 8.1 and updated all the drivers, and the laptop itself works fine. Now I want to connect an external monitor (HP w2207h). The laptop recognizes that there is an external monitor, identifies it as an HP w2207h, and tries to extend the desktop to it, but the monitor itself says there is no input signal on the HDMI input and goes to sleep mode.
When I connect my old laptop to it, there is no problem; the monitor shows the extended desktop, so the monitor and all the cables are OK. (Old laptop: Win7, Core 2 Duo CPU, GeForce 9600M GT.)
With the new laptop I have the same problem when I try to connect it to my LG TV set via HDMI: the laptop recognizes the TV and tries to extend the desktop (I can move the mouse off the internal display onto the TV), but the TV says there is no input signal.
I have set the output to 60 Hz and various resolutions, and tried duplicating the display or using only the external one. Nothing worked.
I can't force the laptop to use the GeForce; it decides for itself which one it wants to use.
A friend bought the same laptop and did the same installation, and when he connects his external monitor everything is fine. But I don't know what kind of monitor he has, and I have not connected my laptop to his monitor.
I power up the PC and monitor; I can see the POST on the monitor and Windows 8 loading. Then my monitor shuts down for 1-2 seconds. It's not a complete shutdown; it's like when you change resolution or refresh rate. After 1-2 seconds I can use my PC normally.
I uninstalled my NVIDIA drivers, and I no longer have this problem...
I booted Windows 8 in Safe Mode, and I don't have the problem there either...
My GPU is a GTX 680, with an XL2411T monitor on a dual-link DVI cable...
I have a new Dell XPS with Windows 8. I was just starting to use it when I tried to set up a second monitor. For some reason the system doesn't recognize the second monitor, and I then accidentally selected "Second screen only".
Now I am without any monitor. When the system boots, it recognizes the primary monitor (DVI) but then goes off looking for the second monitor. My system has an AMD video card with DVI as the primary output. The other video port is VGA, but I cannot get any monitor I have to work from it.
Is there a way to reset the video selection without having to use the mouse to navigate to the Devices selection?
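One keyboard-only option: Windows ships a small utility, DisplaySwitch.exe, that applies the same four modes as the Win+P flyout, and it can be launched blind from the Run dialog (Win+R, type the command, press Enter). The switches are shown in the sketch below, wrapped in Python only for illustration:

```python
import subprocess

# DisplaySwitch.exe ships with Windows 7/8 and takes one of four switches:
#   /internal - PC screen only     /clone    - Duplicate
#   /extend   - Extend             /external - Second screen only
# Typing "displayswitch /internal" into Win+R does the same thing, no mouse needed.
subprocess.run(["DisplaySwitch.exe", "/internal"])
```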
I was having some problems with my dual-monitor setup, and while messing around with the settings I'm afraid I made things a lot worse than my original problems were. To keep it short: one of my monitors was not coming up (it turned out to be the cable), but while I was in the settings I clicked the monitor and picked an "unknown device" that was never actually detected. Now, when my computer boots up, the main screen is the "unknown device" and the secondary screen is one of my two real monitors. Because of this, my screen only comes up as a blank gray screen. Normally one of my screens shows this while the primary one gives me the option to enter my password and log in; because the monitor that is now my main monitor does not exist, I cannot get to my login screen. I know I am still on dual monitors, because it lets me run the cursor off the screen just as it does when I am using both monitors.
Basically, I want to boot up not extended, and only on my real monitor (I tried Win+P, but it doesn't work on the gray screen). I also tried booting into Safe Mode, but was unable to access it with Shift+F8. Is there any simple solution to this before I have to reinstall Windows 8?
My dual-monitor system had been working fine since new, several months ago. The other day I started experiencing a problem at startup or reboot: the left of my dual monitors would be blank (no picture). After switching the monitor off and back on again (sometimes more than once), the picture would appear, with no further problems during the session. At the next reboot or startup the blank screen recurred. It is not consistently blank every time, and it now almost always takes switching the monitor off and on twice for the image to appear.
I suspected a bad monitor, so I switched the cables on my AMD Radeon HD 7700 graphics card. The left monitor is still the one where the problem occurs, so it is not a bad monitor.
I have tried deleting the devices and reinstalling drivers, to no avail. I cannot find any further options in Windows 8 after trying everything under the display options.
I currently have a 15" PC with a 3200x1800 resolution, running Windows 8.1. I have the scaling set to the highest level.
My 2nd, external monitor is 1920 x 1200.
I am trying to extend my desktop to the 2nd monitor, but is there a way to use different scaling for the PC and the monitor?
I have unchecked "Let me choose one scaling level for all my displays", but no matter what, I still can't change the scaling on the external display. It uses the scaling settings of my PC (set to the highest because of the 3200x1800 resolution), and because of that, everything on the external monitor shows up large.
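Some background that may explain this: Windows 8.1 can only scale displays independently for applications that declare themselves per-monitor DPI aware; everything else is rendered at one scale factor and bitmap-stretched on the other display, which is why the external monitor inherits the laptop's setting. A minimal sketch of how an application opts in, assuming Python with ctypes on Windows 8.1 or later:

```python
import ctypes

# PROCESS_DPI_AWARENESS values (shcore.dll, Windows 8.1+):
#   0 = DPI unaware, 1 = system DPI aware, 2 = per-monitor DPI aware
PROCESS_PER_MONITOR_DPI_AWARE = 2

# Applications that do NOT make this call (or declare the manifest
# equivalent) are scaled by Windows and look oversized or blurry on
# a second display with a different scale factor.
ctypes.windll.shcore.SetProcessDpiAwareness(PROCESS_PER_MONITOR_DPI_AWARE)
```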
I have just rebuilt my PC and decided to make the jump from Windows XP to Windows 8. The jump has not been smooth, but it looks like I have ironed out my biggest problem (a botched installation caused by not creating a system partition, so Windows 8 would not boot without the DVD; rookie error).
Now I am stuck with an issue: Windows detects my only screen as more than one. Sometimes it thinks I have two; more often it thinks I have 3 or 4. Installing the current NVIDIA drivers did not help.
What I have at the moment, according to Device Manager, is 2 Generic Non-PnP Monitors and 2 Generic PnP Monitors. The NVIDIA Control Panel calls them Dell SX2210, VGA display, (I can't recall the third one; I'm on my laptop now), and TV. Windows display setup calls two of them SX2210, and the other two are either Generic Non-PnP or PnP monitors. In the NVIDIA Control Panel it looks as though I have a clone setup (2 screens enabled) with the Dell SX2210 and the VGA display. In Windows display setup it looks as though I have a clone setup (2 screens enabled) with the Dell SX2210 and a generic monitor.
I have repeatedly unchecked the secondary monitors in both the NVIDIA Control Panel and Windows display setup, leaving me with what should be a single-monitor arrangement and 3 disabled monitors. This works until I restart my computer, which reverts everything back to how it was before. This is OK as a workaround most of the time, but sometimes when I boot up, the computer detects the wrong screen as the primary and I get a monitor that goes to power-save mode. The solution to this is Win+P and changing the display selection blind, which works now that I know how. Sometimes my screen will refresh (installing programs or drivers) and I get the same result. 'Detect displays' also results in no signal to my monitor.
Currently using DVI from the graphics card to DVI on the monitor.
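To see exactly which display devices Windows has registered and which ones it considers active, the EnumDisplayDevices API reports the list directly; a minimal sketch, assuming Python with ctypes on Windows:

```python
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x1
DISPLAY_DEVICE_PRIMARY_DEVICE = 0x4

# Walk the adapter list; ghost/phantom entries show up here too.
dev = DISPLAY_DEVICEW()
dev.cb = ctypes.sizeof(dev)
i = 0
while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
    flags = []
    if dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
        flags.append("active")
    if dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE:
        flags.append("primary")
    print(dev.DeviceName, "|", dev.DeviceString, "|", ", ".join(flags))
    i += 1
```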
System specs:
CPU: Intel 3570K (stock clock)
MB: ASRock Extreme4 Z77
RAM: 4 x 4GB 2133 GSkill
VIDEO: Gigabyte GTX260 OC
MONITOR: Dell SX2210 @ 1920 x 1080
OS: Windows 8 64-bit
*Yes, I know the 260 is an older card; it came over from my old system for economic reasons (to be upgraded later), but it worked perfectly on that machine.
I am on a laptop with a GTX 765M. I am trying to connect a Thunderbolt port to a DisplayPort monitor. My understanding is that Thunderbolt is backwards compatible with DisplayPort, but for the life of me I cannot get it to work.
I am using a Mini DisplayPort to DisplayPort cord.
I have ended up hooking the monitor up with an HDMI cord for now. FYI, the monitor is 2560x1440, if that matters.