I have an LCD TV with HDMI that I like to watch movies on from my laptop, which has an ATI HD2600 card.
I had issues setting up dual screens (extended vs. clone, etc.). I just wanted a quick and easy setup: plug in to watch the movie, unplug to compute again. But I would always have to adjust the resolution on the TV from 1650 to 1920 manually.
The best I could do was to hook up the HDMI and disable the laptop display; the TV would then show 1920, and when I unplugged the HDMI, the laptop display would come back on at 1650. This is the only way I've figured out so far that switches to the right resolution automatically. Plugging the HDMI back in would kill the laptop monitor and show on the TV.
I am trying to use two monitors, with HDMI to my TV. What I want to do is play a video stream in full-screen mode on my TV and surf the web on my laptop screen. I can get this to work, but full-screen mode on my TV reverts to a small window if I click the mouse on my laptop screen.
I am trying to use my laptop with a TV as a second monitor. I want to stream live video from the internet on my extended monitor via my HDMI cable while I keep working on my laptop. The problem I have is that every time I use the mouse on the laptop screen, the extended screen (via the HDMI) reverts from full screen back to a small window. How can I keep the second monitor in full-screen mode while working on my laptop?
My old monitor has problems turning on; I can leave it on for 30 hours before it decides to display a picture, or I can coax it by unplugging the DVI cable from the back, but sometimes even that takes a few tries. It's off warranty, and for reference it's a Samsung SyncMaster 226BW. So I bought a new monitor and have it plugged in using HDMI, but the problem is that when both monitors are plugged in, the HDMI one shows no display until it gets to Windows, so I cannot see the boot process of my PC, because my old monitor, as explained above, has problems turning on. This was a hindrance today when I was tinkering with my SSD and had to go into the BIOS, and frankly I'd like to be able to see the boot process in case anything goes wrong, like a BSOD or a Windows update.
I keep my HDMI plugged in, as I watch DVDs every night through Media Center on my 32" TV.
The thing is, the sound taskbar used to switch automatically from HDMI to the laptop's sound card. It doesn't any more. Basically, if I turned off the HDMI mode on my TV, it used to switch from HDMI sound output to the laptop's card; when I activated HDMI again, it would automatically switch back to HDMI output.
I'm using the NVIDIA-provided drivers via Windows Update.
How can I extend DVI output to multiple HDMI monitors? I have a Samsung Series 7 Slate tablet computer running Windows 7 Ultimate 64-bit. The nature of the computer prevents swapping out the graphics card/module.
I have an XFX Radeon HD 6790. I am using the HDMI port and the DisplayPort for two TVs, and I am also using the VGA port (I think it's called VGA). It says right on the box that it supports 3 monitors. I can switch between any two that I want, but I can't get all three to work at the same time. When I try to use all three at once, I get "to extend the display another display must be disabled".
I connected my HP Pavilion dv4-2165dx to my TV via the HDMI slot, and it worked perfectly. I disconnected the cable from the notebook when I was done and still got the main LCD display to work. When I restarted the system, the LCD display was gone: just a blank screen with a backlight. The system is booting correctly; I can hear the Windows 7 startup sounds and even type my password in and access my account. So far I know that:
1. The screen hardware is not damaged, because after some reboots and taking the battery out for an hour I could get the display back with no problem (with Windows complaining about being unable to boot and suggesting I run Startup Repair), but I got a black screen again after the next reboot.
2. I can connect the HDMI again and get output on the TV screen, but no main display is detected. If I toggle between displays with Fn+F4, only the external HDMI TV display works.
3. I get no startup screen either. That is, not even the first screen from which I can access the BIOS. But if I connect the HDMI TV from the beginning, I get all that info on the TV.
4. I reinstalled the Graphics driver from HP with no change.
My guess is that somehow I messed up the default configuration and the system now only recognizes external displays as default displays. I haven't been able to change this from the Control Panel > Display Settings menu, and I don't know where in the registry these default startup options could be.
I have dual monitors, one on my laptop and another bigger monitor for when I am docked and at the office.
Under XP I had it set so that when I closed the lid on my laptop, the laptop screen was shut off and all files that were subsequently opened were opened on the main big screen.
It appears now that when I close the lid the screen stays on and if there was a window showing on the laptop when the lid is closed it does not automatically switch to the big monitor. It used to under XP. Is there a way to duplicate this behavior in Windows 7?
There are times when I want to use just one monitor and shut off the monitor built into the laptop. Is there an easy way to do so?
I just installed a 2nd video card. It's running perfectly fine, except now on my dual-monitor setup there is a virtual space between my monitors. (This space shows on the Windows "Screen Resolution" tab. Practically, it means that I have to move my mouse a lot more to get it onto the 2nd screen, rather than it just being right there.) I can move the screen (on the Screen Resolution tab) to be right next to the other screen, but when I press Save/Apply it just pops back to where it was, with the gap still there. I've tried doing it in the NVIDIA Control Panel as well, with the same problem (it pops back). By the way, even though I installed the 2nd card, both monitors are plugged into the SAME, OLD (previously working) video card.
I am having an issue setting up 2 monitors. I think maybe I can't run one through my graphics card and one through the VGA port on my motherboard at the same time. Right now I am running my monitor through an NVIDIA 8400 GS, but I thought I could run another off the VGA port on the motherboard as well.
I recently got a second 20" monitor (both are Samsung, 2 ms, very similar in specs).
And I noticed I cannot use both of them in games; i.e., the in-game resolution won't go past 1680x1050, which, by my understanding, is how you get it to span onto the second monitor. Any ideas? I've got an nVidia GTX 260 running the latest drivers, and a Core i7 overclocked to 3.6 GHz.
I'm trying to set up dual monitors. Right now my main monitor is my LCD TV, and I have a VGA splitter on my only VGA port sending out two VGA cables, one to the TV and the other to the monitor my computer came with. I'm trying to use the monitor it came with as my second monitor, where I can do things on both, but it only clones my main one. When I go into Control Panel > Appearance and Personalization > Display > Display Settings > Detect, it will not find my second monitor. So do you need 2 VGA ports for this to work? Or is there anything I can download that would let me use the second monitor on its own and not as a clone? This probably sounds really confusing, so if I have to reword it I will; I'm just trying to find out if I'm wasting my time by trying to do this.
I have dual monitors that work perfectly at Windows startup. After leaving the computer alone for some time (after power save has kicked in on the monitors), I come back to the PC to find only one monitor returns from power save (the same monitor every time). Rebooting the system corrects the problem; entering Device Manager and selecting "Scan for hardware changes" also corrects it. I've updated the video card to the most current drivers possible and do not know where to go after that. It's very annoying, and I have the exact same thing happening with a PC at work with completely different hardware.
I am looking at buying a laptop that has an AMD A6-3400M processor. Those processors are supposed to be able to handle video. My question is: does the built-in GPU support 2 monitors where you can move a program back and forth between the two, or is it mirrored?
I came home from work today and both my monitors were completely black even though the computer was up and running. Browsing on my iPhone, I found that the dual monitors could be the cause. So I unplugged one and restarted with success.
I then proceeded to update my video drivers (I have an NVIDIA 6800), but the problem persists.
I have dual monitors on DVI and VGA. I have Realtek High Definition Audio and want to set up audio through both. When the second monitor is connected through the rear black port, its speakers work when tested through the settings but do not work when playing audio through a media player.
I've recently switched from Windows Vista to Windows 7 x64. I use the word "switched" instead of "upgraded" because I formatted my old drive instead of installing over Vista. I have been using two screens forever now, both LG Flatron L1750HQ; both are digital and have served me well. My graphics card is an NVIDIA GeForce 8800 GTS, which has two digital outputs. I have the latest NVIDIA drivers, and I have the drivers for my screens (though they are old, June 2007, and I can't find any newer ones... which might imply an EDID problem? Dunno). I have also run every Windows 7 update known to man. To describe the problem: only one of the screens ever works at a time (which implies the screens themselves are okay), and both outputs work as well. On top of that, Windows 7 does recognize that I have a dual display, but one screen is always in digital power-saving mode (no signal, I guess). When my computer boots up, the left screen shows the boot process while the right one has no signal. When Windows 7 boots, it's the other way around.
In both the NVIDIA and the Windows 7 control panels, the two screens are visible, shown side by side, and set up properly, with and without the screens' driver, but that does nothing to solve the problem. I've also examined my BIOS, but I doubt it's anything to do with that, since I haven't touched my BIOS in ages and it worked just fine on Vista. I'd wager that if I go ahead and install Kubuntu on this machine, I'll get my dual screens back, but I don't want Kubuntu.
I have two 4:3 monitors side by side. Both of them are set to 1280x1024, and my desktop is extended across both. I downloaded a 2560x1024 image and set it as my background. I went into Personalization, etc. and set it to "Tile", as Google results repeatedly said I should. No matter what I do, tiling just makes four images appear, two on each screen.
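As far as I understand it, "Tile" only spans across both monitors (instead of repeating) when the image exactly matches the combined desktop size, so it may be worth checking that the downloaded file really is 2560x1024 and stretching it if not. A rough sketch, assuming Python with Pillow installed (the function name and default sizes are just for illustration):

```python
# Sketch: "Tile" seems to span two side-by-side monitors only when the
# wallpaper exactly matches the combined desktop size, so stretch it to fit.
# Assumes Pillow is installed; fit_to_desktop is a made-up helper name.
from PIL import Image

def fit_to_desktop(img, mon_w=1280, mon_h=1024, monitors=2):
    """Stretch an image to exactly span `monitors` side-by-side screens."""
    desktop = (mon_w * monitors, mon_h)
    return img if img.size == desktop else img.resize(desktop)

# Hypothetical usage:
#   wallpaper = fit_to_desktop(Image.open("panorama.jpg"))
#   wallpaper.save("wallpaper.bmp")  # BMP avoids wallpaper re-encoding quirks
```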
Recently I've been having issues with my laptop and dual monitors. I have been plugging my laptop into the DVI port of external monitors. After a few days, my external monitor would lose signal if I selected to show the desktop only on monitor 2, but now, whenever I plug in my second monitor, both my laptop monitor and the external monitor go black. I've had this happen with two monitors now, and I can't think of what could cause this. Can repeatedly plugging in a monitor break it?
I have 2 Samsung SyncMaster 2243 monitors running off an ATI Radeon 4670 graphics card.
I have downloaded some wallpapers that are wide enough to cover both monitors (large panoramas, etc.), but Windows just displays whatever fits on my main monitor and duplicates it on the secondary monitor. (This is only the case with my wallpapers, not the rest of the desktop: shortcuts, Start menu, etc.) I'm sure this is just a user error on my part.
Edit: if I choose the "Tile" option, it will ALMOST do what I want. The picture will extend across both monitors, but if it is not quite wide/tall enough, it will start tiling.
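Since tiling seems to start only when the picture falls short of the combined desktop size, one workaround is to pad a slightly-too-small panorama with black bars up to the exact combined resolution before setting it. A sketch, assuming Python with Pillow installed (the helper name and the 2x1280x1024 layout are just illustrative):

```python
# Sketch: pad an almost-big-enough panorama with black bars up to the exact
# combined desktop size, so "Tile" spans it instead of repeating at the edges.
# Assumes Pillow is installed; pad_to_desktop is a made-up helper name.
from PIL import Image

def pad_to_desktop(img, desktop_w=2 * 1280, desktop_h=1024):
    """Center `img` on a black canvas matching the combined desktop size."""
    canvas = Image.new("RGB", (desktop_w, desktop_h), "black")
    x = (desktop_w - img.width) // 2
    y = (desktop_h - img.height) // 2
    canvas.paste(img, (x, y))
    return canvas
```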
I have to have dual monitors, and Windows 7 works great with them. One of the cool features of 7 is being able to put up two reading panes side by side. If you have not seen this, it is done by opening, say, 2 Word docs and pulling one to the left and one to the right; they will each fill half the screen, side by side. When I do it with dual monitors, let's just say it does not give the expected results.
Ok, so I just added Windows 7 to my lineup. After an epic battle with permission problems between XP and 7, I finally managed to fix it and have moved on to a slightly more odd yet annoying problem: my dual monitors.
The way I used to have my monitors set up was my monitor on the right had my task bar and icons... but the monitor on my left was the main monitor. That way when a program opened it would open on my left monitor (which is right in front of me) while still allowing me complete access to everything under my taskbar and all my icons to my right.
Is there any way to do that in Windows 7? When I select "Make this my main display" it moves the task bar over to that monitor, which is annoying. Any suggestions?
Note: for those of you curious how to do this in XP or Vista: pick the monitor you want to be your main monitor (the one programs open on) and pick the one you want to have your taskbar. Let's use my setup (main monitor on the left, taskbar and stuff on the right):
- Go to Display, right-click the right monitor, and uncheck "attached". This will turn it off. Now all you have is the left monitor with the taskbar and everything on it.
- Right-click the right monitor again. Check "attached" AND "Primary". Click Apply. The right monitor turns back on, everything refreshes, and the taskbar is on the right monitor.
- Right-click the left monitor and check "Primary". The taskbar and icons stay on the right monitor, but now all programs open on the left one.
As of right now I am running Windows XP Pro 64-bit, because Windows 7 will not display dual monitors. I tried a combination of things and nothing worked. But XP shows them both with no problems, and I didn't even have to install my driver for it! Windows 7 fails even when I install the driver, and I have the same disk that came with the video card. What's the deal-le-o here?
Running: onboard video is an NVIDIA GeForce 6100; plus an NVIDIA GeForce FX 5200, a 512 MB PCI card with dual VGA ports and 1 S-Video port.
Not sure if RAM has anything to do with this, but I have 3.25 GB of RAM, and both Windows 7 and XP only show 2 GB. I'm thinking that might be because the 1.25 GB portion is not the same as the two 1 GB sticks.