I have an ASUS P6T SE motherboard with an NVIDIA GeForce GT 240 graphics card (512 MB) already installed. I would like to connect 3 monitors to my PC, but apparently the card only supports two concurrent ones. I use my PC mainly for web design and development, plus recreation such as watching movies; I don't use it for gaming. Monitor sizes and the inputs they support:
17 inch: VGA
24 inch: HDMI/VGA
23 inch: DVI/VGA
What is the best option for me to add the 3rd monitor? My main consideration is price; I don't want to spend too much money just to add another monitor to my setup. My PC runs pretty quietly now and I want it to remain that way, so noisy cards are a big no for me. I've read about USB-to-DVI adapters, but it seems they are not as powerful as proper graphics cards. The second option would be to add another card alongside my existing one, something basic that can run my third monitor; I'm not sure if this is at all possible, or what cards you'd suggest. And the third option would be to take out my card and replace it with an ATI card that supports 3 outputs out of the box.
I have a setup with 6 monitors. Today my main monitor broke, and I couldn't find a way to shut the PC down since I couldn't access the main desktop to get to the Start menu. Is there a way to change which monitor is the main monitor without having access to the current main monitor?
I am running a dual monitor setup with my TV hooked up via HDMI to my laptop.
Everything was working fine before, I always had the laptop as my main display yet I could play a video or a game fullscreen on the TV.
For the past couple of days, as soon as I put something in fullscreen, it immediately goes to my laptop, regardless of whether it's open on my TV or my laptop. I don't remember changing/installing anything that could've changed that...
I checked a bit in the AMD Vision software and the Windows Control Panel, but I can't seem to solve my problem without switching my TV to be the main display. I also made a quick search on Google as well as here, but the problems were mainly with Flash, which isn't the cause here.
Here are my system specs:
Toshiba Satellite L75D, Windows 7 Home Premium 64-bit, AMD Radeon 6520G
I recently installed an Nvidia Quadro CX card under Windows 7 (risky, I know). Although I'm using the latest drivers from Nvidia, I can't get the second monitor to display through the DisplayPort-to-DVI adapter.
I can swap cables all day long and each monitor works fine with the DVI connector; it's just that nothing is detected when plugged into the second output (DisplayPort).
Has anyone else run into this problem with Windows 7 build 7000?
I installed Windows 7 on a partitioned drive. The OS works all right, but I get a random (and lately consistent) error: when I run the OS, my monitor does not display anything after the Windows logo at startup. It's a black screen. The monitor displays information in Safe Mode but not in regular mode. If I go back to Windows Vista, everything runs fine. I need to restart the OS constantly before it finally displays the Windows bar, etc.
My graphics card is an Nvidia GeForce 8600 GT; my monitor is an Acer P223w.
My resolution is 1680 x 1050.
I have also installed the latest Windows 7 update for the driver.
I have noticed one thing: Windows 7 gives me an option to display images on two monitors, but I only have one installed. So I think that might actually be the problem on boot, and I modified the setting.
OK, I have no idea on this one, and the good news is I can't fix it, LOL.
I have an HP m9340f and purchased a couple of new Nvidia graphics cards when I started getting the noisy fan issue.
I installed Windows 7 and everything worked great, but I was having problems with Windows Media Center after I went to a 56k connection, so I decided to format and start over.
Every time it installs the drivers for my Nvidia graphics card, my monitor goes black with no signal. I tried both of the different cards; same problem. I am guessing it is the recommended settings, but since I only have one monitor I can't change the settings, because I can't see to do it, and I have no other monitor to hook up to try.
I can start up in Safe Mode, so any ideas how to fix this issue, or how to change it all to 800 x 600 so I can see when it starts up with the new drivers?
I have used the search feature but it seems like others are having different problems with the 6200 than I am.
Whenever I install either the default Windows 7 driver or the Nvidia 195.62 driver for my 6200 video card, my system boots up, shows the Windows 7 startup screen, and then the screen goes black as if I had unplugged the monitor. The only way I can get video back in normal mode is to go into Safe Mode and disable the video card.
I've seen that it might be a refresh rate issue, but it won't let me change the refresh rate; the only option I have is "Use Hardware Default Setting".
The original install of Windows 7 did not support dual monitors. The symptom was the primary display blue and the right display black, with no way out. I went back to a single monitor and installed the same driver I got to work in Vista, and it worked great: 158.24_forceware_winvista_32bit_english
I just tried to install an old 17 inch Dell flat-screen as a second monitor, and I want to extend displays so that my desktop and all programs are shown across two monitors.
Selecting "extend displays" just shows my desktop background image on the second screen and the regular desktop on the first screen. "Duplicate displays" works as intended.
I'm on a Sony Vaio laptop with a second monitor connected with a VGA cable. I just switched from Vista to Windows 7, and my second 22" monitor is not being recognized under the resolution screen. It worked perfectly fine on Vista just hours prior. I tried hitting the Detect button, reconnecting the monitor, and restarting. What am I doing wrong?
It might be my graphics card; I'm pretty sure I have an Nvidia GeForce 7200M GT, but I can't seem to find drivers for it anywhere. My computer keeps telling me all my drivers are up to date, yet my second monitor is still not recognized. This doesn't make sense. Should I just go back to Vista?
I had 2 monitors, one VGA and one DVI. I hooked them up and had a good dual monitor setup, but whenever I loaded certain programs that change resolution (FSX, DarkBASIC, and others), they would give me a BSOD and restart.
Now:
I reinstalled the Nvidia drivers and they're updated (I think).
I have the DVI monitor hooked up right now, with 1 display only.
I have the desktop manager software enabled and running, and the Nvidia Control Panel running.
And what I want to know is how to get a horizontal span setup. The reason:
OK, well, I have seen that FSX can run on 2 monitors and that you can have a really cool display with both showing images. Can I do this on a dual monitor setup, or do I need to get a better card (preferably not)?
What do I do now?
OK, so do I go to "set up multiple monitors" and do it that way, or is there another, better way to do it? There's one last question. I heard from a friend that in the display options (when you right-click on the desktop, then go to Properties, then Settings) there was an option to enable dual monitor support. Before I had my setup I did not see that option, and I don't see it now either; I'm guessing that's because I don't have 2 monitors hooked up right now.
I am obsessed with getting this to work, so if I am making any of you people here mad, I'm sorry, and thank you for the help that you have given me recently. BTW, I got my processor to work and I got it hyperthreaded.
I just got my new PC, and I also got a monitor with it. The computer has an integrated Intel video card and an Nvidia 8500GT. Is it possible to plug the secondary monitor into the integrated Intel card and use the Nvidia for my main display?
The Nvidia Control Panel shows my 2nd monitor as VGA, but it is DVI, and it doesn't work. Otherwise it is recognized as fine in the Windows settings and the Nvidia Control Panel. It somehow got reset after I woke the machine from sleep; how do I force it back?
I have two monitors and would like to set up a different wallpaper for each monitor while retaining the extended desktop option. How do I go about that? I am using a Radeon HD 4770, if that helps.
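To illustrate what I'm after: the workaround I've read about for Windows 7 is stitching both pictures into one canvas the size of the virtual desktop and setting the wallpaper style to Tile, so each monitor shows its own half. A quick sketch of the placement arithmetic (the monitor rectangles below are made up just for illustration):

```python
def virtual_desktop(monitors):
    """Bounding box (left, top, width, height) of all monitor
    rects, each given as (left, top, width, height)."""
    left = min(m[0] for m in monitors)
    top = min(m[1] for m in monitors)
    right = max(m[0] + m[2] for m in monitors)
    bottom = max(m[1] + m[3] for m in monitors)
    return left, top, right - left, bottom - top

def paste_positions(monitors):
    """Where each per-monitor wallpaper gets pasted on the
    combined canvas (relative to the canvas origin)."""
    left, top, _, _ = virtual_desktop(monitors)
    return [(m[0] - left, m[1] - top) for m in monitors]

# Two 1920x1080 monitors side by side (hypothetical layout)
mons = [(0, 0, 1920, 1080), (1920, 0, 1920, 1080)]
print(virtual_desktop(mons))   # (0, 0, 3840, 1080)
print(paste_positions(mons))   # [(0, 0), (1920, 0)]
```

The stitched image itself would then be saved and applied with wallpaper style set to Tile; this is just the layout math, not a full implementation.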
I'm setting up Dual Monitors on my HIS Radeon 4870 1GB GPU. This GPU has 1 x VGA and 1 x DVI output. I have two identical ASUS VW222u monitors. I have installed the latest ATI Catalyst Suite and drivers for this GPU and have Windows 7 up and running with BOTH monitors.
My issue, however, is that my primary monitor (DVI output) is running at my ideal resolution of 1680 x 1050, but my second monitor (VGA output) cannot run at this resolution. It is only allowing me to choose from a small, specific set of resolutions, and 1680 x 1050 is not included (the closest is 1600 x 1200, IIRC).
The ATI Catalyst Control Centre is telling me the primary monitor is my ASUS VW222u (Digital), which is correct, but my 2nd monitor is only described as a "Generic Non-PnP CRT Monitor", despite installing the correct monitor drivers etc.
Is this because I am running this second monitor off the VGA output? Will that cause my monitor to be unable to produce my desired 1680 x 1050 resolution? And will running it off the VGA output make Windows think it's a CRT?
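For background, as I understand it: over DVI the monitor reports its capabilities in an EDID block, and the native mode comes from the first detailed timing descriptor; if the VGA cable's DDC lines don't carry that data, Windows falls back to "Generic Non-PnP" with a conservative CRT mode list. A toy decode of just the active-pixel fields from a descriptor (the bytes are synthetic, constructed to encode 1680 x 1050):

```python
def dtd_active_pixels(dtd):
    """Active horizontal/vertical pixels from an 18-byte EDID
    detailed timing descriptor: byte 2 holds the low 8 bits of
    the horizontal count and the top nibble of byte 4 the high
    4 bits; bytes 5 and 7 do the same for the vertical count."""
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return h_active, v_active

# Synthetic descriptor: only the bytes decoded above are meaningful.
dtd = bytes([0x21, 0x39, 0x90, 0x30, 0x62, 0x1A, 0x27, 0x40,
             0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
print(dtd_active_pixels(dtd))  # (1680, 1050)
```

If no EDID arrives at all, there is nothing to decode, which would explain why the driver labels the display a generic CRT and only offers classic CRT resolutions like 1600 x 1200.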
On my Windows 7 partition I am unable to use my laptop screen and my 27" LCD Samsung TV as a monitor simultaneously. This was not an issue on my XP partition. I have updated drivers on my video card to no avail.
My TV is connected via a VGA cable. If this cable is connected when I turn my computer on, it will display on the TV only (but at a much lower resolution than my XP partition allowed), and I am unable to switch back to my laptop monitor without restarting. My display settings won't even recognize that both are connected. Any help would be greatly appreciated.
I am trying to install Windows on a laptop with a broken screen, and it will not display on a CRT monitor when running Setup. How would I change this? I am trying to install either Vista or 7 on an HP dv9000 Entertainment series.
Quite often with games on a dual monitor setup, I find that the cursor has drifted onto the other monitor, and if I accidentally click there I lose focus of the game, and some games won't tab back in.
Is there any way to prevent the cursor going outside the game in fullscreen?
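From what I've read, tools that do this call the Win32 ClipCursor API, which pins the cursor inside a rectangle; the underlying idea is just clamping the cursor's coordinates to the window rect. A toy version of that clamping (the window rect is made up):

```python
def clamp_cursor(x, y, rect):
    """Clamp a cursor position into rect = (left, top, right, bottom),
    the same effect Win32 ClipCursor achieves at the OS level.
    The right/bottom edges are exclusive, matching Win32 RECTs."""
    left, top, right, bottom = rect
    return (min(max(x, left), right - 1),
            min(max(y, top), bottom - 1))

game_window = (0, 0, 1920, 1080)  # hypothetical fullscreen rect
print(clamp_cursor(2200, 500, game_window))  # (1919, 500)
print(clamp_cursor(-50, -10, game_window))   # (0, 0)
```

So a small helper app that re-applies this confinement whenever the game window has focus is the usual approach; this is only the coordinate logic, not the Windows plumbing.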
I tried out Windows 8 for a while. Unfortunately, many of the video games I play crashed a lot, and, to make a long story short, I have reinstalled Windows 7. One of the features I really liked in Windows 8 is that with a dual monitor setup, my desktop backgrounds not only rotated, but were not synchronized the way they are in Windows 7. For instance, I could have a Battlefield 3 background on one monitor and a Company of Heroes background on the other, while on Windows 7, if I have Battlefield 3 on one, it will be the same thing on the other display. I can't seem to find an option to have different, "unsynchronized" desktops rotating.
It may be because there are 3 DVI ports on the one card, but Windows 7 does not recognise all 3 monitors, only 2.
By contrast, the Nvidia Control Panel recognises all 3 monitors, and I am able to configure all 3 screens at a resolution of 5760 x 1080. All is fine until I reboot, and then nothing: all monitors blank. I can only assume it's because of a conflict between the (latest) Nvidia driver and Windows.
My question is, is it possible to force windows to recognise all 3 monitors?
ASUS SKT-1366 P6T Deluxe V2, Windows 7 64-bit, i7 975 Extreme OC'd to 4.2 GHz, 6 GB of 1866 MHz Dominator DDR3, 1 TB Samsung F1 SATA II, Nvidia GeForce 590 GTX, 3 x Acer 3D 27 inch, 3D Vision Surround.
I have an AMD Radeon HD 6300 series adapter, running Windows 7 x64. I have a dual monitor setup with the same monitor, an Acer S230hl, on each port. One port is HDMI, the other is VGA. The issue I have is that when I have a worksheet open, I can see the gridlines of the cells on the HDMI-connected monitor; if I move the sheet over to the VGA monitor, the gridlines are so light I cannot make out the different cells. It looks like one clean sheet of white paper. I tried different contrast and brightness settings, with no luck.
How can I set up a dual monitor display on my iMac running Windows 7 with Parallels? I have no problem when I run the Mac side, but when I run Windows, the dual display doesn't work. I heard AMD Radeon has some problem with the Windows 7 display settings.
My hardware: motherboard: Asus M2N-MX SE Plus; video card: nVIDIA nForce 6150SE (6100-430); onboard audio: nVIDIA MCP61; D-Link DFE-520TX PCI Fast Ethernet adapter.
My motherboard comes with Vista drivers, but I heard that even those are causing problems for video and audio. Should I download any specific drivers beforehand?
One of the netbooks was plugged in to initially power it up, but was then hard-shut-down before Windows could run through its initial setup. Now when I boot up the netbook, it attempts to go through the initial Windows setup ("Setup is starting services") and then throws an error message: "The computer restarted unexpectedly or encountered an unexpected error. Windows installation cannot proceed. To install Windows, click 'OK' to restart the computer, and then restart the installation." I restart, and the cycle continues. This happens whether I try to boot normally or in Safe Mode. I've got no CD drive attached to it, so I'm not sure what I can do to break this error cycle.
When I try to install some programs and drivers, I get the error: "Setup will only run in administrator mode. Setup is aborting." I only have one account, which has administrator rights. I have already tried enabling and disabling UAC, booting in Safe Mode, Run as Administrator, enabling the original Administrator account, adding permissions, taking ownership, and changing compatibility mode, among other things, but nothing solves my problem. I'm running Windows 7 x64 build 7000.
I have a three monitor setup. Whenever I open a browser, either Internet Explorer or Google Chrome, on monitor 2 or 3, after a few seconds it automatically drags my browser onto my main monitor (monitor 1, where my Start button and taskbar are located).
I have just done a clean install of Windows 7 Ultimate from XP, and I am unable to get my dual monitors to work like they did in XP.
I have a DVI Radeon graphics card plugged into the AGP slot, and an Nvidia GeForce VGA adapter plugged into a PCI slot of my Dell OptiPlex 755. When I am in the screen resolution settings, the second monitor cannot be detected.
In the BIOS I have 2 options under Primary Video Adapter: Auto and Onboard Card. When set to Auto, the Nvidia card gets a signal but the Radeon does not. There are no conflicts in Device Manager, and all of the drivers are up to date.
When I change the BIOS option to Onboard Card, the Radeon adapter gets a signal, but the other monitor cannot be detected, and in Device Manager there is a yellow exclamation mark next to Standard VGA Adapter with a Code 10 error that states the device could not be started.
I have powered down and unplugged every cable. I also tried to use the integrated VGA adapter with the Intel G31/G33/Q33/Q35 Graphics Controller, but the computer will not even boot. I get:
"System Halted
Attention: Unsupported Video Configuration Detected"
I have two monitors; both work fine standalone, but Windows will not detect either as a secondary.
Please help me someone, I am so used to having my helpdesk email open in one monitor and all of my other work in the other monitor.
This computer is running an ATI Radeon HD 2400 Pro. It does support dual monitors (I've had dual monitors active on this card before, but never on Windows 7). But since installing Windows 7, I can't even get it to detect the second monitor. I want to run the setup with the CRT as the primary monitor and the HD monitor as the secondary.