Dual Monitors, Second Monitor On/off Intermittently
Jan 2, 2010
I have two monitors connected to the ATI Radeon HD 2600 XT graphics card (it has two DVI connections). With both monitors on everything works fine. Windows remembers which apps I have open on which monitor and so on.
However, sometimes I turn on only one (the primary) monitor. In this case Windows somehow still thinks that both monitors are there, so the apps that would open on the second monitor still do so (meaning I can't see them when I start them), and the mouse easily wanders beyond the active monitor.
Now, the second monitor is completely off; it has a real power switch. But it's possible that it still draws some voltage through the connector, so that Windows, or the graphics card, detects its presence. I have not tried unplugging it, but maybe I should do that as well.
So, finally, my question: is it possible to make Windows aware that the second monitor is not in use, and to extend the desktop only when the second monitor is actually on, without a restart or log-off?
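One possible workaround, if Windows does keep treating the powered-off monitor as active: Windows 7 ships a small tool, DisplaySwitch.exe, that changes the projection mode on the fly, so a script (or a pair of shortcuts) can collapse the desktop to one monitor and re-extend it later without a restart or log-off. A minimal sketch, assuming Python is available and DisplaySwitch.exe is in System32 as usual:

```python
import subprocess

# DisplaySwitch.exe ships with Windows 7 and accepts these flags:
#   /internal - use only the primary (internal) display
#   /extend   - extend the desktop across all connected displays
#   /external - use only the secondary/external display
#   /clone    - duplicate the primary display on the secondary

def single_monitor():
    """Collapse the desktop to the primary display only."""
    subprocess.run(["DisplaySwitch.exe", "/internal"], check=True)

def extend_desktop():
    """Re-extend the desktop once the second monitor is powered on again."""
    subprocess.run(["DisplaySwitch.exe", "/extend"], check=True)

if __name__ == "__main__":
    single_monitor()
```

Two desktop shortcuts, one per mode, would give the same effect without any scripting.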
I don't know if I'm doing something wrong, but I have a wallpaper that is 3200x1200, and my two monitors are 1600x1200 each. The wallpaper only mirrors on both screens instead of half being on one screen and half on the other... How can I get this to work properly?
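For what it's worth, Windows 7 only spans a wallpaper across two side-by-side monitors when the picture position is set to Tile and the image matches the combined desktop size (3200x1200 here); the Fill/Fit/Stretch modes repeat the image per monitor instead. Below is a rough sketch of setting that up programmatically; the registry values and the SystemParametersInfo call are the standard ones, but the image path is only a placeholder.

```python
import ctypes
import winreg

WALLPAPER = r"C:\Users\Public\Pictures\span_3200x1200.jpg"  # placeholder path

# "Tile" is the mode that lets a wide image continue onto the second monitor.
with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop",
                    0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "WallpaperStyle", 0, winreg.REG_SZ, "0")  # no stretch/fit
    winreg.SetValueEx(key, "TileWallpaper", 0, winreg.REG_SZ, "1")   # tile across monitors

SPI_SETDESKWALLPAPER = 0x0014
SPIF_UPDATEINIFILE = 0x01
SPIF_SENDWININICHANGE = 0x02

# Apply the wallpaper and broadcast the change so it takes effect immediately.
ctypes.windll.user32.SystemParametersInfoW(
    SPI_SETDESKWALLPAPER, 0, WALLPAPER,
    SPIF_UPDATEINIFILE | SPIF_SENDWININICHANGE)
```

One caveat: tiling anchors the image at the primary monitor's top-left corner, so if the secondary monitor sits to the left of the primary in the layout, the two halves will appear swapped.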
Certain programs always open on the 2nd monitor, which I don't always have switched on. For example, I double-click a program to run it and it always opens on the 2nd monitor; even if I drag it to my main monitor, it defaults back to monitor 2. Also, when I run programs in full screen (games mainly), after I exit, the folders that were open are a different size from what I set them to before running the game, and sometimes a folder will even move to the 2nd monitor on exit.
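Windows generally re-opens a window wherever the program last saved its own position, so an app that insists on the powered-off monitor usually needs to be dragged back and then closed normally so it stores the new location. If that doesn't stick, here is a rough ctypes sketch that finds a window by its exact title and moves it onto the primary monitor; the title and size below are placeholders, not anything specific to your programs.

```python
import ctypes

user32 = ctypes.windll.user32

def move_to_primary(window_title, width=1024, height=768):
    """Move a top-level window (looked up by exact title) onto the primary monitor."""
    hwnd = user32.FindWindowW(None, window_title)
    if not hwnd:
        raise RuntimeError("window not found: %r" % window_title)
    # (0, 0) is the top-left corner of the primary monitor in Windows'
    # virtual-desktop coordinates, so this always lands on the main screen.
    user32.MoveWindow(hwnd, 0, 0, width, height, True)

if __name__ == "__main__":
    move_to_primary("Untitled - Notepad")  # example title only
```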
My old monitor has problems turning on; I can leave it on for 30 hours before it decides to display a picture, or I can coax it by unplugging the DVI cable from the back, but sometimes even that takes a few tries. It's out of warranty, and for reference it's a Samsung SyncMaster 226BW. So I bought a new monitor and have it plugged in using HDMI, but the problem I get is that when both monitors are plugged in, the HDMI one has no display until it gets to Windows, so I cannot see the boot process of my PC, because my old monitor, as explained above, has problems turning on. That was a hindrance today when I was tinkering with my SSD and had to go into the BIOS, and frankly I'd like to be able to see the boot in case anything goes wrong, like a BSOD or a Windows update.
I have an AMD Radeon HD 6300 series adapter, running Windows 7 x64. I have a dual monitor setup, with the same monitor on each port, an Acer S230hl. One port is HDMI, the other is VGA. The issue I have is that when I have a worksheet open I can see the gridlines of the cells on the HDMI-connected monitor, but if I move the sheet over to the VGA monitor the gridlines are so light I cannot make out the different cells; it looks like one clean sheet of white paper. I tried different contrast and brightness settings, with no luck.
I have a Dell laptop on a docking station running Windows 7. Every day I place it on the docking station and use my 20-inch monitor. Is it possible to activate the dual monitor feature and use the 20-inch as the primary and the open laptop as the secondary monitor?
I have dual monitors: one on my laptop and another, bigger monitor for when I am docked at the office.
Under XP I had it set so that when I closed the lid on my laptop, the laptop screen shut off and all files opened after that went to the main big screen.
It appears now that when I close the lid the screen stays on, and if a window was showing on the laptop when the lid was closed it does not automatically switch to the big monitor. It used to under XP. Is there a way to duplicate this behavior in Windows 7?
There are times when I want to use just one monitor and shut off the monitor built into the laptop. Is there an easy way to do so?
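Not a complete answer, but two pieces may get close to the old XP behaviour: telling Windows to do nothing when the lid closes (so the machine keeps running on the docked monitor), and then switching output to the external display only so the built-in panel goes dark. A rough sketch, assuming the usual powercfg aliases on Windows 7 (worth confirming with "powercfg -aliases" first):

```python
import subprocess

# Set the lid-close action to "do nothing" (value 0) while on AC power, so
# closing the lid leaves Windows running on the docked monitor. SUB_BUTTONS
# and LIDACTION are the usual powercfg aliases; confirm with "powercfg -aliases".
subprocess.run(["powercfg", "-setacvalueindex", "SCHEME_CURRENT",
                "SUB_BUTTONS", "LIDACTION", "0"], check=True)
subprocess.run(["powercfg", "-setactive", "SCHEME_CURRENT"], check=True)

# Then push all output to the external monitor only; the laptop panel turns off.
subprocess.run(["DisplaySwitch.exe", "/external"], check=True)
```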
1. Having two monitors in different rooms means that the desktop/taskbar is on one screen or the other.
2. People using the keyboard/mouse in one room will affect the user in the other room.
Is there any way to set up Win 7 to allow different profiles for each monitor? Essentially allowing two different users on the same computer at the same time?
I was having a problem with dual monitors on the 7127 build, but only when I use the newest NVIDIA drivers. My TV just flashes a few times, and I can see that fish a couple of times, but then it just goes to something like "no signal". If I use the default drivers that are installed with Windows, it works fine, but then I don't have the NVIDIA control panel and some other features.
My video card is a GF 8800GTS 650M 512MB DDR3 DUAL DVI TV PCIE. I would like to know if anyone else is having this problem and if it has been fixed in the newest build, so I can decide whether it's worth the bother of a clean install.
I just installed a 2nd video card. It's running perfectly fine, except that now, on my dual monitor setup, there is a virtual space between my monitors. (This space shows on the Windows "Screen Resolution" tab.) Practically, it means that I have to move my mouse a lot more to get it onto the 2nd screen, rather than it just being right there. I can move the screen (on the Screen Resolution tab) to be right next to the other screen; I'll press save/apply, and then it just pops back to the same place it was, with the gap still there. I've tried doing it in the nVidia control panel as well, with the same problem (pops back). By the way, even though I installed the 2nd card, both monitors are plugged into the SAME, OLD (previously working) video card.
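As a quick diagnostic, it can help to see exactly what coordinates Windows has stored for each screen, since the phantom gap shows up as a hole between the monitor rectangles. The sketch below only prints the virtual-desktop layout via the Win32 monitor-enumeration API; it does not change anything.

```python
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

# Callback signature for EnumDisplayMonitors: BOOL (HMONITOR, HDC, LPRECT, LPARAM)
MonitorEnumProc = ctypes.WINFUNCTYPE(
    ctypes.c_int,                    # BOOL return value
    ctypes.c_void_p,                 # HMONITOR
    ctypes.c_void_p,                 # HDC
    ctypes.POINTER(wintypes.RECT),   # monitor rectangle in virtual-desktop coordinates
    ctypes.c_void_p)                 # LPARAM

def _print_monitor(hmon, hdc, rect_ptr, lparam):
    r = rect_ptr.contents
    print("monitor: left=%d top=%d right=%d bottom=%d" %
          (r.left, r.top, r.right, r.bottom))
    return 1  # keep enumerating

# Any gap between the printed rectangles is the dead space the mouse has to cross.
user32.EnumDisplayMonitors(None, None, MonitorEnumProc(_print_monitor), 0)
```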
I am having an issue setting up 2 monitors. I think maybe I can't run one through my graphics card and one through the VGA port on my motherboard at the same time. Right now I am running my monitor through an nVidia 8400 GS, but I thought I could run another through the VGA port on the motherboard as well.
I recently got a second 20" monitor (both are Samsung, 2 ms, very similar in specs), and I noticed I cannot use both of them in games. That is, the resolution in game won't go past 1680x1050, which, by my understanding, is how you get it to span onto the second monitor. Any ideas? I've got an nVidia GTX 260 running the latest drivers, Core i7 overclocked to 3.6 GHz.
I'm trying to set up dual monitors. Right now I have my LCD TV as my main monitor, and I have a VGA splitter from my only VGA port sending out two VGA cables, one to my TV and the other to the monitor my computer came with. I'm trying to use the monitor it came with as my second monitor so I can do things on both, but it's only cloned from my main one. When I go into Control Panel > Appearance and Personalization > Display > Display Settings > Detect, it will not find my second monitor. So do you need 2 VGA ports for this to work? Or is there anything I can download that would allow me to use the second monitor as its own display and not a clone? This probably sounds really confusing, so if I have to reword it I will; I'm just trying to find out if I'm wasting my time by trying to do this.
I have dual monitors that work perfectly at Windows startup. After leaving the computer alone for some time (after power save has kicked in on the monitors), I come back to the PC to find only one monitor returns from power save (the same monitor every time). Rebooting the system corrects the problem, or entering Device Manager and selecting "scan for hardware changes" will also correct it. I've updated the video card to the most current drivers possible and do not know where to go after that. It's very annoying, and I have the exact same thing happening with a PC at work with completely different hardware.
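Since Device Manager's "scan for hardware changes" fixes it, one stopgap is to automate that same rescan and trigger it on resume (Task Scheduler can run a script on the wake or unlock event). Here is a rough ctypes sketch of the configuration-manager calls behind that menu item; I'm assuming the standard cfgmgr32 behaviour, so treat it as a starting point rather than a guaranteed fix.

```python
import ctypes

cfgmgr32 = ctypes.windll.cfgmgr32

CM_LOCATE_DEVNODE_NORMAL = 0x00000000
CR_SUCCESS = 0x00000000

def rescan_devices():
    """Roughly what Device Manager's "Scan for hardware changes" does."""
    devinst = ctypes.c_uint32(0)
    # Locate the root of the device tree...
    ret = cfgmgr32.CM_Locate_DevNodeW(ctypes.byref(devinst), None,
                                      CM_LOCATE_DEVNODE_NORMAL)
    if ret != CR_SUCCESS:
        raise RuntimeError("CM_Locate_DevNodeW failed: 0x%08X" % ret)
    # ...then ask Plug and Play to re-enumerate everything beneath it.
    ret = cfgmgr32.CM_Reenumerate_DevNode(devinst, 0)
    if ret != CR_SUCCESS:
        raise RuntimeError("CM_Reenumerate_DevNode failed: 0x%08X" % ret)

if __name__ == "__main__":
    rescan_devices()
```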
I am trying to buy a laptop and it has an AMD A6-3400M processor. Those processors are supposed to be able to handle video. My question is: does the built-in GPU support 2 monitors where you can move a program back and forth between the two monitors, or is it mirrored?
I came home from work today and both my monitors were completely black even though the computer was up and running. Browsing on my iPhone, I noticed that the dual monitors could be the culprit. So I unplugged one and restarted, with success.
I then proceeded to update my video drivers (I have an NVIDIA 6800), but the problem persists.
I am trying to use two monitors, using HDMI to my TV. What I want to do is play a video stream in full-screen mode on my TV and surf the web on my laptop screen. I can get this to work, but the full-screen mode on my TV reverts back to a small window if I click the mouse on my laptop screen.
I have dual monitors with DVI and VGA. I have Realtek High Definition Audio and want to set up audio through both. When the second monitor is connected through the rear black port, the speakers of the second monitor work when tested through the settings, but do not work when playing audio through a media player.
I am running a dual monitor setup with my TV hooked up via HDMI to my laptop.
Everything was working fine before: I always had the laptop as my main display, yet I could play a video or a game fullscreen on the TV.
For the past couple of days, as soon as I put something in fullscreen, it immediately goes to my laptop, regardless of whether it's open on my TV or my laptop. I don't remember changing or installing anything that could've changed that...
I checked a bit in the AMD Vision software and the Windows control panel, but I can't seem to solve my problem without switching my TV to be the main display. I also did a quick search on Google as well as here, but the problems were mainly with Flash, which isn't the cause here.
Here are my system specs:
Toshiba Satellite L75D Windows 7 home premium 64-bit AMD Radeon 6520G
I have just done a clean install of Windows 7 Ultimate from XP and I am unable to get my dual monitors to work like they did in XP.
I have a DVI Radeon graphics card plugged into the AGP slot, and an Nvidia GeForce VGA adapter plugged into a PCI slot of my Dell OptiPlex 755. When I am in the screen resolution settings, the second monitor cannot be detected.
In the BIOS I have 2 options under primary video adapter: auto and onboard card. When set to auto, the Nvidia card gets a signal but the Radeon does not. There are no conflicts in Device Manager, and all of the drivers are up to date.
When I change the BIOS option to onboard card, the Radeon adapter gets a signal but the other monitor cannot be detected, and in Device Manager there is a yellow exclamation mark next to Standard VGA adapter with a code 10 error that states the device could not be started.
I have powered down and unplugged every cable. I also tried to use the integrated VGA adapter on the Intel G31/G33/Q33/Q35 Graphics Controller, but the computer will not even boot. I get:
"System Halted
Attention Unsupported Video Configuration Detected"
I have two monitors, both work fine as standalone but Windows will not detect either as a secondary.
Please help me, someone; I am so used to having my helpdesk email open on one monitor and all of my other work on the other monitor.
It seems like a few programs, such as Adobe Photoshop CS4 and Adobe Dreamweaver CS4 (seems to be limited to Adobe), expand past my main monitor, and reach the monitors to the left and right.
This happens when the programs are maximized.
I've included two screenshots of this issue. Notice the additional black border around the main screen (middle one).
I technically have 3 monitors set up, although one is my 54'' TV for playing movies on while I work and play. Ultimately, what I would like to do is this: while I'm playing movies and not using the other monitors (such as when lying down to take a nap or playing movies with friends), I want a screensaver to display on the unused monitors. I'm not sure if this is actually possible with a movie up on the TV, but if anyone has any idea of a program I could use, I'd appreciate it.
I'm also running Windows 7 Ultimate, if that matters to you...
I am trying to use my laptop with a TV as a second monitor. I want to stream live video from the internet on my extended monitor via my HDMI cable while I keep working on my laptop. The problem I have is that every time I use the mouse on the laptop screen, the extended screen (via the HDMI) reverts back to a small window from full screen. How can I keep the second monitor in full-screen mode while I work on my laptop?
I've recently switched from Windows Vista to Windows 7 x64. I use the word switched instead of upgraded because I formatted my old drive instead of installing over Vista. I have been using two screens forever now, both LG Flatron L1750HQ; both are digital and have served me well. My graphics card is an NVIDIA GeForce 8800 GTS, which has two digital outputs. I have the latest NVIDIA drivers, and I have the drivers for my screens (though they are old, June 2007, and I can't find any newer ones... which might imply an EDID problem? Dunno). I have also run every Windows 7 update known to man. To describe the problem: only one of them ever works at a time (which implies the screens themselves are okay), and both outputs work as well. To top that, Win 7 does recognize I have a dual display, but one is always in digital power-saving mode (no signal, I guess). When my computer boots up, the left screen shows the boot process while the right one has no signal. When Win 7 boots, it's the other way around.
In the NVIDIA and the Win 7 control panels both screens are visible, shown side by side, and set up properly, with and without the screens' driver, but that does nothing to solve the problem. I've also examined my BIOS, but I doubt it's anything to do with that, since I haven't touched my BIOS in ages and it worked just fine on Vista. I'd wager that if I go ahead and install Kubuntu on this machine, I'll get my dual screens back, but I don't want Kubuntu.
I have two 4:3 monitors side by side. Both of them are set to 1280x1024. My desktop is extended across both. I downloaded a 2560x1024 image and set it as my background. I went into Personalization, etc., and set it to 'Tile' as Google results repeatedly said I should. No matter what I do, tiling just makes four images appear, two on each screen.
Recently I've been having issues with my laptop and dual monitors. I have been plugging my laptop into the DVI port of external monitors. After a few days, my external monitor would lose signal if I selected to show the desktop only on monitor 2, but now, whenever I plug in my second monitor, both my laptop monitor and the external monitor go black. I've had this happen with two monitors now, and I can't think of what could cause this. Can repeatedly plugging in a monitor break it?