I've recently switched from Windows Vista to Windows 7 x64. I say "switched" rather than "upgraded" because I formatted my old drive instead of installing over Vista.

I have been using two screens forever now, both LG Flatron L1750HQ; both are digital and have served me well. My graphics card is an NVIDIA GeForce 8800 GTS, which has two digital outputs. I have the latest NVIDIA drivers, and I have the drivers for my screens (though they are old, from June 2007, and I can't find any newer ones... which might imply an EDID problem? I don't know). I have also run every Windows 7 update known to man.

To describe the problem: only one of the screens ever works at a time (which implies the screens themselves are okay), and both outputs work as well. On top of that, Windows 7 does recognize that I have a dual display, but one screen is always in digital power-saving mode (no signal, I guess). When my computer boots up, the left screen shows the boot process while the right one has no signal; when Windows 7 boots, it's the other way around.
In both the NVIDIA and the Windows 7 control panels, both screens are visible, shown side by side, and set up properly, with or without the screens' driver, but that does nothing to solve the problem.

I've also examined my BIOS, but I doubt it has anything to do with that, since I haven't touched my BIOS in ages and everything worked just fine on Vista. I'd wager that if I went ahead and installed Kubuntu on this machine, I'd get my dual screens back, but I don't want Kubuntu.
I have dual monitors: the screen on my laptop and another, bigger monitor for when I am docked at the office.
Under XP I had it set so that when I closed the lid on my laptop, the laptop screen shut off and all files that were subsequently opened appeared on the main big screen.
It appears now that when I close the lid, the screen stays on, and if a window was showing on the laptop screen when the lid was closed, it does not automatically switch to the big monitor. It used to under XP. Is there a way to duplicate this behavior in Windows 7?
There are times when I want to use just one monitor and shut off the monitor built into the laptop. Is there an easy way to do so?
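One quick option, assuming a stock Windows 7 install: the built-in DisplaySwitch.exe utility (the same tool behind the Win+P hotkey) can route the desktop to only the external monitor, which blanks the laptop panel:

```shell
:: DisplaySwitch.exe ships with Windows 7 in %windir%\System32.
:: Show the desktop only on the external monitor (laptop panel off):
DisplaySwitch.exe /external

:: Go back to the built-in panel only:
DisplaySwitch.exe /internal

:: Other modes: /clone (mirror) and /extend (dual desktop).
```

You could drop each of these into a shortcut and switch with a double-click.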
I'm using Windows 7 x64 Ultimate. I have my power options set to turn off the monitor after 20 minutes, and the computer is not set to sleep or hibernate. When more than 20 minutes have passed, the monitors do not turn off. On rare occasions they do, though (by 'they' I mean my dual monitors).
I Googled this issue and tried disabling "Allow this device to wake the computer" on the properties page for my network adapter in Device Manager. This didn't work. What could be preventing my displays from automatically turning off? Note that I have torrents running during this time, so I'm not sure whether network activity is indeed what is causing the issue. I never had this kind of problem in Windows XP.
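Windows 7's powercfg tool can report which process is holding the display awake (for example, a torrent client or media player that requests it); anything listed under the DISPLAY category will block monitor sleep. A sketch, run from an elevated command prompt (the exact output format varies by build):

```shell
:: List outstanding power requests; entries under DISPLAY: block monitor sleep.
powercfg -requests

:: Verify the display-off timeout actually in effect for the active power plan.
powercfg -query SCHEME_CURRENT SUB_VIDEO VIDEOIDLE
```

If a process shows up under DISPLAY, closing it (or reconfiguring it) should let the timeout work again.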
I just installed a second video card. It's running perfectly fine, except that now, on my dual-monitor setup, there is a virtual space between my monitors. (This space shows on the Windows "Screen Resolution" tab.) Practically, it means I have to move my mouse a lot farther to get it onto the second screen, rather than it just being right there. I can drag the screen (on the Screen Resolution tab) to sit right next to the other one, but when I press Save/Apply, it just pops back to where it was, with the gap still there. I've tried doing it in the NVIDIA Control Panel as well, with the same problem (it pops back). By the way, even though I installed the second card, both monitors are plugged into the SAME, OLD (previously working) video card.
I am having an issue setting up two monitors. I think maybe I can't run one through my graphics card and another through the VGA port on my motherboard at the same time. Right now I am running my monitor through an NVIDIA 8400 GS, but I thought I could run another monitor off the VGA port on the motherboard as well.
I recently got a second 20" monitor (both are Samsung, 2 ms, very similar in specs), and I noticed I cannot use both of them in games. That is, the in-game resolution won't go past 1680x1050, which, by my understanding, is how you get the game to span onto the second monitor. Any ideas? I've got an NVIDIA GTX 260 running the latest drivers and a Core i7 overclocked to 3.6 GHz.
I'm trying to set up dual monitors. Right now my main monitor is my LCD TV, and I have a VGA splitter from my only VGA port sending out two VGA cables, one to the TV and the other to the monitor my computer came with. I'm trying to use that monitor as my second monitor, so I can do things on both, but it only clones my main one. When I go into Control Panel > Appearance and Personalization > Display > Display Settings > Detect, it will not find my second monitor. So do you need two VGA ports for this to work? Or is there anything I can download that would let me use the second monitor as its own display and not a clone? This probably sounds really confusing, so I'll reword it if I have to; I'm just trying to find out whether I'm wasting my time.
I have dual monitors that work perfectly at Windows startup. After leaving the computer alone for some time (after power-save has kicked in on the monitors), I come back to the PC to find that only one monitor returns from power-save (the same monitor every time). Rebooting the system corrects the problem; entering Device Manager and selecting "Scan for hardware changes" will also correct it. I've updated the video card to the most current drivers available and do not know where to go from there. It's very annoying, and I have the exact same thing happening on a PC at work with completely different hardware.
I am trying to buy a laptop, and it has an AMD A6-3400M processor. Those processors are supposed to be able to handle video. My question is: does the built-in GPU support two monitors where you can move a program back and forth between them, or is the display just mirrored?
I came home from work today and both my monitors were completely black, even though the computer was up and running. Browsing on my iPhone, I read that dual monitors were a possible cause. So I unplugged one and restarted, with success.
I then proceeded to update my video drivers (I have an NVIDIA 6800), but the problem persists.
I am trying to use two monitors, with HDMI going to my TV. What I want to do is play a video stream in full-screen mode on the TV and surf the web on my laptop screen. I can get this to work, but the full-screen video on the TV reverts back to a small window if I click the mouse on the laptop screen.
I have dual monitors, one DVI and one VGA. I have Realtek High Definition Audio and want to set up audio through both. When the second monitor is connected through the rear black port, its speakers work when tested through Settings, but they do not work when playing audio through a media player.
I am trying to use my laptop with a TV as a second monitor. I want to stream live video from the internet on the extended monitor via my HDMI cable while I keep working on my laptop. The problem is that every time I use the mouse on the laptop screen, the extended screen (via HDMI) reverts from full screen back to a small window. How can I keep the second monitor in full-screen mode while I work on my laptop?
I have two 4:3 monitors side by side, both set to 1280x1024, and my desktop is extended across both. I downloaded a 2560x1024 image and set it as my background. I went into Personalization and set the picture position to 'Tile', as Google results repeatedly said I should. No matter what I do, tiling just makes four images appear, two on each screen.
Recently I've been having issues with my laptop and dual monitors. I have been plugging my laptop into the DVI port of external monitors. After a few days, my external monitor would lose signal if I chose to show the desktop only on monitor 2; but now, whenever I plug in my second monitor, both my laptop screen and the external monitor go black. This has happened with two monitors now, and I can't think of what could cause it. Can repeatedly plugging in a monitor break it?
I have two Samsung SyncMaster 2243 monitors running off of an ATI Radeon 4670 graphics card.
I have downloaded some wallpapers that are wide enough to cover both monitors (large panoramas, etc.). But Windows just displays whatever fits on my main monitor and duplicates it on the secondary monitor. (This is only the case with my wallpaper, not the rest of the desktop: shortcuts, Start menu, etc.) I'm sure this is just user error on my part.
Edit: if I choose the "Tile" option, it will ALMOST do what I want. The picture extends across both monitors, but if it is not quite wide or tall enough, it starts tiling.
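One workaround, a sketch assuming ImageMagick is installed and two 1680x1050 monitors side by side (3360x1050 combined; substitute your own totals): pre-scale and center-crop the panorama to exactly the combined desktop size, so "Tile" spans it once with nothing left over to repeat.

```shell
:: Resize to cover 3360x1050, then center-crop to exactly that size.
:: The ^ geometry flag makes -resize fill the box instead of fitting inside it;
:: in cmd.exe the caret must be doubled (^^) because ^ is cmd's escape character.
:: Use the full path to ImageMagick's convert.exe so it isn't shadowed by
:: Windows' own convert.exe (the FAT-to-NTFS tool) in System32.
"C:\Program Files\ImageMagick\convert.exe" panorama.jpg -resize 3360x1050^^ -gravity center -extent 3360x1050 wall.bmp
```

Setting wall.bmp as the background with position "Tile" should then cover both screens exactly once.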
I have to have dual monitors, and Windows 7 works great with them. One of the cool features of 7 is being able to put two reading panes up side by side. If you have not seen this, it is done by opening, say, two Word docs and dragging one to the left edge and one to the right; they will each fill half the screen, side by side. When I do it with dual monitors, let's just say it does not give the expected results.
Ok, so I just added Windows 7 to my lineup. After an epic battle with permission problems between XP and 7, I finally managed to fix those and have moved on to a slightly odder yet still annoying problem: my dual monitors.
The way I used to have my monitors set up, the monitor on the right had my taskbar and icons, but the monitor on the left was the main monitor. That way, when a program opened, it would open on my left monitor (which is right in front of me), while I still had complete access to everything on my taskbar and all my icons on the right.
Is there any way to do that in Windows 7? When I select "Make this my main display" it moves the task bar over to that monitor, which is annoying. Any suggestions?
Note: for those of you curious how to do this in XP or Vista: pick the monitor you want to be your main monitor (the one programs open on) and the one you want to have your taskbar. Let's use my setup (main monitor on the left, taskbar and such on the right):
- Go to Display settings, right-click the right monitor, and uncheck "Attached". This turns it off. Now all you have is the left monitor, with the taskbar and everything on it.
- Right-click the right monitor again. Check "Attached" AND "Primary", then click Apply. The right monitor turns back on, everything refreshes, and the taskbar is on the right monitor.
- Right-click the left monitor and check "Primary". The taskbar and icons stay on the right monitor, but now all programs open on the left one.
As of right now I am running Windows XP Pro 64-bit, because Windows 7 will not display dual monitors. I tried a combination of things and nothing worked. But XP shows them both with no problems, and I didn't even have to install my driver for it! Windows 7 won't, even when I install the driver, and I'm using the same disk that came with the video card. What's the deal-le-o here?
Running: onboard video is an NVIDIA GeForce 6100; the add-in card is a 512 MB PCI NVIDIA GeForce FX 5200 with dual VGA ports and one S-Video port.
Not sure if RAM has anything to do with this, but I have 3.25 GB of RAM and both Windows 7 and XP only show 2 GB. I'm thinking that might be because the extra 1.25 GB is not the same type as the two 1 GB sticks.
I just installed Windows 7 Pro 64-bit on the system I built tonight, and only one of my monitors will work. I installed the latest drivers for my ATI 5770, and Windows recognizes both monitors, but only one comes on. The other stays in standby mode. I made sure both monitors are enabled.
If I switch the cables on the back of the video card, the opposite monitor works. I can also drag monitor 2 to the left of monitor 1, and it takes the taskbar away from the working monitor as expected, moving it to the monitor that won't turn on. Has anyone seen this issue before?
Odd situation. I have two video cards, a GeForce4 MX 4000 and the built-in Intel Q965/Q963, installed in an HP dc5700 with 1 GB of RAM, running a fresh install of Windows 7 Enterprise. The onboard Intel has the latest drivers from HP, and I got the NVIDIA card working using the 81.98 ForceWare driver.
If I select the NVIDIA card as the main video card in the BIOS, the display shows up on just one monitor (the one plugged into that card). If I look in Device Manager, it sees both cards without errors, yet I have no option for dual display; if I click "Detect", it finds nothing. All the NVIDIA software runs fine, too.
However, if I select the onboard card as the default, the display again comes up on just its own monitor, but this time Device Manager shows the NVIDIA card with the "Windows has stopped this device because it has reported problems (Code 43)" message, and obviously no option for dual monitors.
It would seem both drivers and both monitors are working, since they work independently; it's just that Windows 7 doesn't seem to want them to play together. This setup worked fine when the PC ran Windows XP.
I recently upgraded to Windows 7 Pro and am running two monitors: an old 17" Tatung LCD and a 24" Samsung 245BW.
Previously, while running XP and XP64, I was able to choose a mode in my NVIDIA control panel called "configure monitors independently". This option is no longer available.
What happens is that when I run a game full screen, my second monitor flickers and then adjusts to the resolution of the game I am playing, skewing the view on that screen. When I exit, it reverts to normal.
Since I am using an extended desktop, I believe Windows is essentially forcing the game's resolution onto my second screen. Previously, in XP and XP64, when entering a game at an adjusted resolution, because the screens were "separately configured", the primary monitor changed resolution while the second remained perfect.
I am wondering if there is a way to configure these monitors independently, or possibly to lock or decouple my secondary screen from my primary screen.
I have tried using UltraMon, but I could not get it to do what I wanted. I may be totally mistaken or blind to a simple fix, or possibly it's not a feature available in 7. Any advice would be extremely helpful.
Recently I upgraded my video card to a PNY GeForce 8400 GS (PCI) card. I have two monitors hooked up: the primary is a 22" LG with a native resolution of 1920x1080, and the secondary is a 22" Acer with a native resolution of 1680x1050.
I am having two problems with this setup.
1. Whenever I play videos online at full screen, playback becomes very choppy (it did not do this with my old 128 MB embedded card).
2. When I try to set the second monitor to its native resolution, the viewing area shifts to the left by a few inches. I have tried the controls built into the monitor and NVIDIA's control panel, but nothing will move it over enough.
I have two monitors connected to the ATI Radeon HD 2600 XT graphics card (it has two DVI connections). With both monitors on everything works fine. Windows remembers which apps I have open on which monitor and so on.
However, sometimes I turn on only one (the primary) monitor. In this case Windows somehow still thinks both monitors are there, so apps that would open on the second monitor still do so, i.e. I can't see them when I start them, and the mouse easily slides beyond the active monitor.
Now, the second monitor is completely off; it has a real power switch. But it's possible it still carries some voltage through the connector, so that Windows, or the graphics card, detects its presence. I have not tried unplugging it, but maybe I should do that as well.
So finally, my question: is it possible to make Windows aware that the second monitor is not in use, and only extend the desktop when the second monitor is on, without a restart or log-off?
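I don't know of a setting that makes Windows 7 react to the monitor's power switch itself, but as a workaround you can toggle the desktop layout on demand, without logging off, using the built-in DisplaySwitch.exe (the tool behind the Win+P hotkey):

```shell
:: Collapse the desktop to the primary monitor before using it alone:
DisplaySwitch.exe /internal

:: Re-extend when the second monitor is switched back on:
DisplaySwitch.exe /extend
```

Two desktop shortcuts to these commands make the switch a single double-click each way.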
I installed Windows 7 yesterday, and only one of my cards seems to be working, as it is the only one that displays anything.
The weird thing is that when I plug a monitor into the DVI port on the other card, I hear the ding sound that signals hardware has been connected, but nothing shows up.
Any ideas what could possibly be the problem?
Obviously, the three monitors were working under Vista.