I have an external monitor, actually a TV connected via an S-Video cable from the back of my video card. The TV is my second desktop, and all of that works fine. The problem comes when I log off or "switch users" (there are no other user profiles on my computer; I only log off or switch user to make sure the kids can't get on without my permission, since they don't know my logon code). When I do, the TV loses the picture and goes grainy, and when I log back in, the TV does not "kick in" again. The only way to get the TV to display the second desktop after logging out or switching users seems to be to restart the computer entirely. On reboot, it picks up the TV as the second monitor again.
If I simply log in again, it does not pick up the TV as the second monitor. Is this related to some setting that makes the TV the second monitor only on the admin's account and not for "all users" or something? It is annoying to have to restart my computer every time I want to watch a video on my TV. By the way, in all cases the "Display" settings in Control Panel show the TV as the second monitor, but looking at the actual TV, all you see is a grainy screen that is obviously not being fed a clear signal.
My old monitor has problems turning on: I can leave it on for 30 hours before it decides to display a picture, or I can coax it by unplugging the DVI cable from the back, but sometimes even that takes a few tries. It's out of warranty; for reference it's a Samsung SyncMaster 226BW. So I bought a new monitor and have it plugged in via HDMI, but the problem is that when both monitors are plugged in, the HDMI one shows no display until it gets to Windows, so I cannot see my PC's boot process, and my old monitor, as explained above, has problems turning on. That was a hindrance today when I was tinkering with my SSD and had to go into the BIOS, and frankly I'd like to be able to see the screen in case anything goes wrong, like a BSOD or a Windows update.
How can I set up a dual monitor display on my iMac running Windows 7 with Parallels? I have no problem when I run the Mac side, but when I run Windows, the dual display doesn't work. I heard AMD Radeon has some problem with Windows 7 display settings.
I assume something is polling the web or keeping my monitors alive, but I can't figure it out. Nothing has changed in my setup other than a possible Windows update, but in the last few weeks my screensaver stays on 24/7. I played with it today and got the screen to shut off, but only by setting the screen-off time to 2 minutes, just 1 minute after the screensaver starts. I am using UltraMon on a dual monitor setup, and for a screensaver I am using gPhotoShow (it allows separate/distinct slideshows for each monitor), as I have been for several years. My assumption is that something is now polling the net or doing some activity that is preventing the monitors from sleeping. So I'm wondering if there is a logging program I can use to track this down. I did run HijackThis, but that isn't helping (as far as I can tell).
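As an aside, Windows 7 ships a built-in way to ask the OS directly what is holding the display awake; this is a sketch of the relevant `powercfg` switches (run from an elevated Command Prompt), offered as a starting point rather than a guaranteed fix:

```shell
:: List the processes, drivers, and services currently holding power
:: requests (i.e., blocking display sleep or system sleep)
powercfg -requests

:: Generate an energy diagnostics report (written to energy-report.html)
:: that flags applications and settings preventing sleep
powercfg -energy
```

The `-requests` output names the exact process keeping the DISPLAY request active, which is usually enough to track down the culprit.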
The display/monitor uses the generic display driver and runs at 1024x768 or 800x600, but it looks very odd; it should be bigger because my screen is bigger. I have an Acer monitor. Is there something I can download to fix the display size?
I was having a problem with dual monitors using build 7127, but only when I use the newest NVIDIA drivers. My TV just flashes a few times, and I can see that fish a couple of times, but then it just goes to no signal. If I use the default drivers that are installed with Windows, it works fine, but then I don't have the NVIDIA Control Panel and some other features.
My video card is a GF 8800GTS 650M 512MB DDR3 DUAL DVI TV PCIE. I would like to know if anyone else is having this problem and if it has been fixed in the newest build, so I know whether it's worth the bother of a clean install.
I am running a dual monitor setup with my TV hooked up via HDMI to my laptop.
Everything was working fine before: I always had the laptop as my main display, yet I could still play a video or a game fullscreen on the TV.
For the past couple of days, as soon as I put something in fullscreen, it immediately goes to my laptop, regardless of whether it's open on my TV or my laptop. I don't remember changing or installing anything that could have caused that...
I checked a bit in the AMD VISION software and the Windows Control Panel, but I can't seem to solve the problem without making my TV the main display. I also did a quick search on Google as well as here, but the problems found were mainly with Flash, which isn't the issue here.
Here are my system specs:
Toshiba Satellite L75D
Windows 7 Home Premium 64-bit
AMD Radeon 6520G
I have just done a clean install of Windows 7 Ultimate, coming from XP, and I am unable to get my dual monitors to work like they did in XP.
I have a DVI Radeon graphics card plugged into the AGP slot and an NVIDIA GeForce VGA adapter plugged into a PCI slot of my Dell OptiPlex 755. In the screen resolution settings, the second monitor cannot be detected.
In the BIOS I have 2 options under primary video adapter: auto and onboard card. When set to auto, the NVIDIA card gets a signal but the Radeon does not. There are no conflicts in Device Manager, and all of the drivers are up to date.
When I change the BIOS option to onboard card, the Radeon adapter gets a signal but the other monitor cannot be detected, and in Device Manager there is a yellow exclamation mark next to Standard VGA adapter with a code 10 error stating the device could not be started.
I have powered down and unplugged every cable. I also tried using the integrated VGA adapter on the Intel G31/G33/Q33/Q35 Graphics Controller, but then the computer will not even boot. I get
"System Halted
Attention Unsupported Video Configuration Detected"
I have two monitors; both work fine standalone, but Windows will not detect either as a secondary.
Please, someone help me; I am so used to having my helpdesk email open on one monitor and all of my other work on the other.
Windows 7 64-bit turns off the monitor 1 signal after startup. At login, it powers off monitor 1 and powers on monitor 2. Windows 7 behaves as though dual display is working even though monitor 1 does not display. It's the same with a single display: the monitor will not work on cable 1 except at startup. Both 'Extend these displays' in Windows Screen Resolution and 'Extend' in the Nvidia Control Panel are selected. Changing which one is primary does not turn on monitor 1, nor does switching the position of display 1 in either (e.g. left/right/top/bottom). Hardware: GeForce 8400 GS with a Dual Monitor Solution 59-pin (DMS-59) to 2x VGA adapter; I have tried two of these adapter cables. Latest driver: 306.23, installed as a clean reset install of the latest Nvidia driver released last week. Same resolution: 2 of the 4 monitors I have tried have the exact same 'recommended' resolution (1280x1024), both 60Hz, and I even chose 16-bit color for both instead of 32 to reduce resources.
Linux worked: dual display worked immediately when I tried it in Ubuntu 10, so it's not the hardware. It works in Ubuntu 12 too, but not properly: it won't transfer windows across displays. My Windows 7 64-bit is an upgrade from Vista 64. One person in another forum with the same problem resolved theirs by reinstalling Windows 7, but another got the same problem only after a fresh clean install of Windows 7 64-bit with the same GeForce 8400 GS and DMS-59, when it had previously worked in Vista.

Dual Monitors - Only One Works at a Time
This computer is running an ATI Radeon HD 2400 Pro. It does support dual monitors... (I've had dual monitors active on this card before, but never on Windows 7.) But since installing Windows 7, I can't even get it to detect the second monitor. I want to run the setup with the CRT as the primary monitor and the HD as the secondary.
I'm using a 32" television as a second monitor in extended mode so that I can watch a movie on the TV and play a game on the monitor (this was my main goal). The monitor and TV sit in two different rooms, both connected to the same PC, one by a normal VGA cable and the other by HDMI. I managed to split the audio output so that VLC sends its audio to the HDMI (so only the TV plays it) while the rest of the system sounds, media players, and games output to the speakers (basically only the VLC audio is directed to the other output device). I reached my goal: I can watch a movie fullscreen on the TV and play a game fullscreen on the monitor without any interference between them (neither audio nor video).
The thing is, because the TV is in another room, I can't actually see what's going on on it; I just "throw" the VLC window to the TV from my main monitor. And here's the question: is there a way to see the TV's desktop on my monitor, without having to set it as the main monitor, so I don't actually have to switch between desktops? The perfect thing would be seeing the TV's desktop in a window, like in remote desktop applications.
I have 2 displays: one is a TV 3 meters away, hooked up to my graphics card via HDMI, and the other is obviously my PC monitor. I was hoping there would be a way to have sound exclusively on the TV while I have separate audio on my monitor. Sorry, it's a bit hard to explain. Basically I want to play a game on my PC with my headphones on while a film is playing on the TV, but if I try it at the moment, the game shows on my monitor while the film plays on the TV, yet the sound of both comes out of the TV. The graphics card is an HD6870, and I'm using Windows 7 x64.
I just got a new 9800GTX+ and have a 22" AOC LCD screen. Whenever I start my PC, the motherboard screen appears, but when the Windows loading screen comes up, my monitor shows the message "input not supported". I have updated my drivers and checked fastpccheck.weebly.com, but no help. After that screen, the user login screen displays and then everything is normal. I can't figure out what is causing the problem.
I have a desktop running Windows 7 Ultimate and a laptop running XP Pro. Is there any way to use the laptop as a second monitor? I know how to set it up physically but just don't know if the laptop has to be running the same OS or not.
I have an ATI Radeon 7500 and I can't get Windows 7 Build 7600 to detect it. Under display it says it's a Standard VGA graphics adapter. I'm getting really annoyed because I can't play Counter-Strike... I installed many different drivers for it, but it just doesn't work. I also wanted the Control Center, but I don't know how to get that either. I upgraded to Windows 7 from XP today, and dumb me didn't back up ANYTHING before a full clean wipe. So I'm getting really agitated and impatient about it. Please help!!
Also, my monitor shows up as a Generic PnP Monitor and I can't find the drivers for my monitor anywhere, even on the Gateway site. I have a 19" Gateway LE HD widescreen.
The only way I could shut the Acer down was unplugging my computer. The monitor said "no" something; I'm sorry, but I forget the exact message. Is there a way to shut it down with the keyboard? I called my provider and they said it was a monitor or card problem, but it has worked fine since I plugged it back in. I have Windows 7.
So, I got a BenQ G2400WD and a 32-inch LCD TV connected to my computer, to my ATI 4890 graphics card, and I wonder if I could set a different font size/DPI on the LCD TV than what's on the BenQ.
In the Windows CP Display Settings (and in the Nvidia CP), my left/main display is 1 and the right is 2. In Windows CP Color Management, the left is 2 and the right is 1. I would ignore this, except that at least a couple of applications now open on the opposite displays. One is a monitor color calibration program (which also identifies them reversed) and the other is a Raw Image Converter. And I don't trust that the calibration program loads the profiles separately. This happened after I reinstalled the Nvidia video drivers trying to fix random BSODs from Firefox and IE8, which seems to have worked.
I have a Sapphire HD 7850 graphics card and two monitors connected to it, one HDMI and one DVI-D. Both screens are showing the same image, and when I go into screen resolution it shows only one "Generic PnP Monitor". This is a fresh install of Windows 7, and it did work before that. The driver for the graphics card says "Standard VGA adapter". Is this normal?
Can't get monitor to display same resolution between 2 workspaces - Microsoft Community (copy/paste of the OP text, posted Nov 21, 2012):
OK, so here's my setup: an HP EliteBook 6930p with an ATI Radeon HD 3450 graphics card, Win 7 Pro 32-bit installed. I have 2 workspaces, both with docking stations. Workspace 1 (primary work site) had 24" wide (primary monitor) and 19" wide monitors. The resolutions Windows let me set were 1920x1080 and 1440x900, respectively. These monitors were hooked to the docking station via DVI and VGA cables, respectively. Workspace 2 had a 19" standard monitor (1280x1024, VGA, set as primary), and then I would open the laptop beside it for the dual monitor (1280x800). Today I pulled the 19" wide from workspace 1 to replace the 19" standard at workspace 2. Windows, however, will only let me set the resolution to 1280x720 (still using VGA). Even turning off the "hide modes" option or going into the ATI Catalyst Control Center to try and force it does not work. I've also tried switching which one is primary, and even making it a single monitor display (on the 19" wide). 1440x900 is simply not available.
Windows 7 installed without any problems and found all the hardware devices. The only problem I have is that the monitor will only display at 1920x1080. If I lower it, the monitor will not display a full screen. I've tried all resolutions and have the latest 191 drivers. Any ideas?
Windows 7 Home Premium - fully updated as of today
NVIDIA GeForce 9200 - driver fully updated to the newest one released
When I look under the Windows display settings or the NVIDIA settings, there is no longer an option for a second monitor. There used to be; now that I've got a VGA splitter, there is not. I have tried a combination of everything, from connecting the computer directly to 3 different monitors (all work perfectly), but when I attach 2 monitors with the splitter, all I get is the same image on each monitor.
When I hit detect, nothing happens; both monitors are identified as #1. I have rebooted, updated, and done just about everything I can think of. Why would Windows not show a second monitor option now? The splitter and monitors work perfectly with the laptop and a second older/slower computer. But on this better, faster computer, which should be a hell of a lot better, I am not getting the option for a second monitor.
A friend has dual monitors and wants to play a game across both of them. The problem is the option doesn't appear; I think it might be the new driver he installed. I tried installing HydraVision, which has an option to span across displays, but it doesn't work. Can anyone help?
I just built a new machine over the last three weeks, and finally last night I was ready to complete the final transfer from my old computer to the new one. That meant adding my 2nd monitor to the new machine.
My graphics card is a GeForce 9400GT. I have the latest drivers from NVIDIA's web site. The new machine with Windows 7 has been working for 2 weeks with a single monitor.
Both monitors are fairly new. The graphics card has one DVI and one VGA output, so I must set up my dual monitors that way.
My problem is that when I hook up my 2nd monitor I get a BSOD. I will spend 1 more night trying to fix it, but that is it.
So my questions are as follows:
1. Does anyone have any ideas why the BSOD comes up when my 2nd monitor is plugged in?
2. Does anyone on Windows 7 have a graphics card with a working dual monitor setup? If so, what card is it? I may be buying one tomorrow.
I had 2 monitors, one VGA and one DVI, and I hooked them up and had a good dual monitor setup, but whenever I loaded certain programs they would give me a BSOD and restart; for example FSX, DarkBASIC, and others that change the resolution.
Now...
I reinstalled the NVIDIA drivers and they're updated (I think).
I have the DVI monitor hooked up right now as the only display.
I have the desktop manager software enabled and running, and the NVIDIA Control Panel running.
And what I want now is a horizontal span setup. The reason:
I have seen that FSX can run on 2 monitors, and that you can have a really cool display with both showing images. Can I do this on a dual monitor setup, or do I need to get a better card (preferably not)?
What do I do now?
So do I go to set up multiple monitors and do it that way, or is there a better way? One last question: I heard from a friend that in the display options (right-click the desktop, then go to Properties, then Settings) there was an option to enable dual monitor support. Before I had my setup I did not see that option, and I don't see it now either; I'm guessing that's because I don't have 2 monitors hooked up at the moment.
I am obsessed with getting this to work, so if I am making any of you here mad, I'm sorry, and thank you for the help you have given me recently. By the way, I got my processor to work and got it hyperthreaded.
Anyway, I have 2 22" LCD monitors, and when I plug them in, Windows 7 does this weird thing where, when you select the option to extend the desktop spanning from left to right, instead of the mouse simply moving from the left screen to the right screen as it would on any other OS, Windows 7 BETA somewhat reverses it, and you have to move off the left edge of your 1st monitor to get to the 2nd monitor. (Sorry if this is really confusing; I'm having difficulty writing this, haha.)
And if you select the monitor on your left to be your primary monitor, it won't do that; it will move it to your right monitor instead. So I decided to physically switch the monitors around on my desk to get it back to normal, but I don't like the idea of my computer telling me what to do, hahaha.
I know it's a beta OS, but it's still based somewhat on Vista. I've got the latest drivers for my graphics card, which is an NVIDIA 260GTX, so it's pretty meaty and can do dual monitors fine on other OSes.
I've searched the internet far and wide, as well as these forums, and have not been able to find the answer. Unfortunately, my sister-in-law's dog killed my 2nd monitor, so I haven't been able to test on my own since installing the RC version last week. (The new monitor will arrive next week.)
My question:
If you RDP from a Windows 7 computer with dual monitors to an XP computer with dual monitors, is the multimon support available?
I want the session to feel as though I'm at work with dual monitors.... not just spanning the 2 monitors. My work computer runs windows xp.
Currently the only way to get this functionality in XP is to use the span command along with third-party add-in software that does not work that great.
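For reference, the two relevant Remote Desktop client switches can be compared like this (the hostname below is a placeholder, and what you actually get also depends on what the remote machine's RDP server supports; an XP host only supports spanning):

```shell
:: Span mode: treats all local monitors as one wide virtual display.
:: Works against an XP host, but maximized windows stretch across both screens.
mstsc /span /v:work-pc.example.com

:: True multi-monitor mode (RDP 7 client and newer): the session sees each
:: monitor separately, so windows maximize per-monitor like a local setup.
mstsc /multimon /v:work-pc.example.com
```

In other words, `/span` gives the "one giant desktop" behavior the post describes, while `/multimon` gives the "feels like sitting at my dual-monitor work PC" behavior, but only when the host side supports it.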