When I had Windows XP on one of my PCs, I was able to use my 36" Trinitron CRT as a monitor for some old-school gaming and for watching movies and TV episodes over a 7-pin-to-component connection. It was the greatest thing for SD-quality material, since HD stretches, lags, and gives different color quality. Now, with Windows 7, my NVIDIA driver won't let me do it the way I want to. I could previously do this nicely at 640x480 and 800x600 with a 60 Hz refresh rate.

I am using a GeForce 9400 GT with driver version 181.20 (I'm sure the XP driver was a lower number). When I connected the cable to the TV and opened the NVIDIA Control Panel, it immediately changed the settings to Dualview without asking me. I switched the display to the TV, and the only resolution that works is 720x480. 640x480 was not available in the control panel, but it was in the Windows settings. At 640x480 it only gives me up to 30 Hz interlaced, and when I change it to 800x600 it also only offers up to 30 Hz interlaced. The higher rates, 59 or 60 Hz, are non-interlaced, which won't support 480i.
This computer is running an ATI Radeon HD 2400 Pro. It does support dual monitors (I've had dual monitors active on this card before, but never on Windows 7). But since installing Windows 7, I can't even get it to detect the second monitor. I want to run the setup with the CRT as the primary monitor and the HD display as the secondary.
I have a three-monitor setup. Whenever I open a browser, either Internet Explorer or Google Chrome, on monitor 2 or 3, after a few seconds it automatically drags my browser onto my main monitor (monitor 1, where my Start button and taskbar are located).
I have a setup with 6 monitors. Today my main monitor broke, and I couldn't find a way to shut the PC down, since I couldn't access the main desktop to get to the Start menu. Is there a way to change which monitor is the main monitor without having access to the current main monitor?
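In case it helps anyone searching later: the primary display can also be changed from a script, which can be launched blind or over remote desktop. Below is a minimal sketch assuming Python 3 on Windows, using the Win32 ChangeDisplaySettingsEx API through ctypes. The \\.\DISPLAY2 device name is an assumption (enumerate your own with EnumDisplayDevices first), and a full multi-monitor layout may also need the other displays repositioned.

[code]
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

# Trimmed DEVMODEW structure: dmSize tells Windows how much of the
# struct we know about, so the trailing ICM/printer fields can be omitted.
class DEVMODE(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),      # display union: POINTL position
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

ENUM_CURRENT_SETTINGS = -1
DM_POSITION = 0x20
CDS_UPDATEREGISTRY = 0x01
CDS_SET_PRIMARY = 0x10

target = r"\\.\DISPLAY2"  # hypothetical: the monitor to promote to primary

dm = DEVMODE()
dm.dmSize = ctypes.sizeof(DEVMODE)
user32.EnumDisplaySettingsW(target, ENUM_CURRENT_SETTINGS, ctypes.byref(dm))
dm.dmPositionX = 0         # the primary display always sits at (0, 0)
dm.dmPositionY = 0
dm.dmFields |= DM_POSITION
res = user32.ChangeDisplaySettingsExW(
    target, ctypes.byref(dm), None, CDS_SET_PRIMARY | CDS_UPDATEREGISTRY, None)
print("DISP_CHANGE result:", res)  # 0 = success; other displays may need moving too
[/code]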
I am running a dual monitor setup with my TV hooked up via HDMI to my laptop.
Everything was working fine before: I always had the laptop as my main display, yet I could play a video or a game fullscreen on the TV.
For the past couple of days, as soon as I put something in fullscreen, it immediately goes to my laptop, regardless of whether it was opened on my TV or my laptop. I don't remember changing or installing anything that could have caused that.
I checked a bit in the AMD VISION software and the Windows Control Panel, but I can't seem to solve my problem without switching my TV to be the main display. I also did a quick search on Google as well as here, but the problems found were mainly with Flash, which isn't the issue here.
Here are my system specs:
Toshiba Satellite L75D, Windows 7 Home Premium 64-bit, AMD Radeon 6520G
I have just done a clean install of Windows 7 Ultimate (coming from XP), and I am unable to get my dual monitors to work like they did in XP.
I have a DVI Radeon graphics card plugged into the AGP slot and an NVIDIA GeForce VGA adapter plugged into a PCI slot of my Dell OptiPlex 755. When I am in the screen resolution settings, the second monitor cannot be detected.
In the BIOS I have two options under Primary Video Adapter: Auto and Onboard Card. When set to Auto, the NVIDIA card gets a signal but the Radeon does not. There are no conflicts in Device Manager, and all of the drivers are up to date.
When I change the BIOS option to Onboard Card, the Radeon adapter gets a signal, but the other monitor cannot be detected, and in Device Manager there is a yellow exclamation mark next to Standard VGA Adapter with a Code 10 error stating the device could not be started.
I have powered down and unplugged every cable. I also tried to use the integrated VGA adapter on the Intel G31/G33/Q33/Q35 Graphics Controller, but then the computer will not even boot. I get:
"System Halted
Attention Unsupported Video Configuration Detected"
I have two monitors; both work fine standalone, but Windows will not detect either as a secondary.
Please, someone help me; I am so used to having my help desk email open on one monitor and all of my other work on the other.
I recently had an older HP Pavilion Media Center m7760n desktop PC rebuilt. The old power supply fried the motherboard, so I had to get a new power supply and motherboard. Here are my current specs:
Mainboard: Asus P5QPL-VM EPU
Chipset: Intel G41
Processor: Intel Core 2 Duo E6420 @ 2133 MHz
Physical Memory: 2048 MB (2 x 1024 MB DDR2 SDRAM)
Video Card: Intel(R) G41 Express Chipset (Microsoft Corporation - WDDM 1.1)
Hard Disk: WDC (1000 GB)
As you can see from the above, the "video card" is actually integrated into the motherboard. The computer works perfectly except for one major problem. I have two monitors: one is a 22" with 1680x1050 resolution and the other is a 15" with 1024x768 resolution. At the back of my computer I have a VGA port and a DVI port. The 15" is connected to the VGA port and the 22" is connected to the DVI port.

When first starting the computer, the 15" monitor was recognized as the primary monitor and the 22" as the secondary. No problem: I simply went to the display settings and set the 22" to be the primary monitor and the 15" to be the secondary. Unfortunately, this setting seems to reset as soon as I reboot the computer. The 15" is always set as the primary monitor on startup, forcing me to apply the proper settings all over again. What's worse, even after I have set the proper settings, they sometimes revert when using Media Center or other programs. Worse yet, sometimes both monitors go completely black, as if the monitor settings were about to switch but got locked up somehow.

I'm assuming that perhaps the onboard video has a primary port (VGA) and a secondary port (DVI), but even so, shouldn't Windows 7 be able to override this and save these settings so that the monitor settings remain the same during startup and regular usage?
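One hedged workaround while the setting keeps reverting: re-apply it automatically at each logon with Task Scheduler. A minimal sketch assuming Python 3 on Windows; the task name and script path are hypothetical (the script could be something like the set-primary sketch shown earlier on this page).

[code]
import subprocess

# Register a Task Scheduler job that re-applies the primary-monitor
# setting at every logon. Task name and script path are hypothetical.
subprocess.run([
    "schtasks", "/Create",
    "/SC", "ONLOGON",                      # trigger: every user logon
    "/TN", "FixPrimaryMonitor",            # hypothetical task name
    "/TR", r"pythonw.exe C:\scripts\fix_primary.py",  # hypothetical path
], check=True)
[/code]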
I'm using a television (32") as a second monitor in extended mode so that I can watch a movie on the TV and play a game on the monitor (this was my main goal). The monitor and TV sit in two different rooms, both connected to the same PC, one by a normal VGA cable and the other by HDMI. I managed to separate the audio output so that VLC player sends its audio to the HDMI (so that only the TV plays it), while the rest of the system sounds, media players, and games go to the speakers (basically, only the VLC audio is directed to another output device). I reached my goal: I can watch a movie fullscreen on the TV and play a game fullscreen on the monitor without any interference from one another (neither audio nor video).
The thing is, because the TV is in another room, I can't actually see what's going on on it; I just "throw" the VLC window to the TV from my main monitor. And here's the question: is there a way to see the TV's desktop on my monitor, without having to set it as the main monitor (so as not to really switch between desktops)? The perfect thing would be if I could see the TV's desktop in a window, like in remote desktop applications.
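One possible approach, sketched under assumptions: capture the region of the extended desktop that the TV occupies and show it in a small preview window, much like a remote desktop viewer. This assumes Python 3 with the third-party mss and opencv-python packages installed (pip install mss opencv-python); the TV geometry below is hypothetical.

[code]
import cv2
import numpy as np
from mss import mss

# Hypothetical layout: main monitor is 1920x1080 at the origin and the
# TV extends to the right of it. Adjust the region to your own setup.
TV_REGION = {"top": 0, "left": 1920, "width": 1920, "height": 1080}

with mss() as sct:
    while True:
        shot = np.array(sct.grab(TV_REGION))            # BGRA screenshot of the TV area
        frame = cv2.cvtColor(shot, cv2.COLOR_BGRA2BGR)  # drop the alpha channel
        preview = cv2.resize(frame, (960, 540))         # fit a small preview window
        cv2.imshow("TV preview", preview)
        if cv2.waitKey(30) & 0xFF == ord("q"):          # press q to quit
            break
cv2.destroyAllWindows()
[/code]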
I have AT&T DSL and it just drops out. I have a 2Wire router, and the DSL and Internet lights flash red when it drops. I have had this problem for almost a year, and AT&T will run a useless test and tell me everything is fine. I have searched for 3 days trying to find a broadband monitor to let me know when it drops and for how long, and also how many times while I am at work or just not on the PC.
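For what it's worth, a simple drop logger is easy to script. A minimal sketch, assuming Python 3 is installed on the PC: it tests connectivity every 30 seconds by opening a TCP connection to a well-known host and appends timestamped up/down transitions to a text file, so you can see when the line dropped and for how long. The log path and check host are assumptions.

[code]
import socket
import time
from datetime import datetime

LOG_FILE = "dsl_log.txt"          # hypothetical log location
CHECK_HOST = ("8.8.8.8", 53)      # Google public DNS; any reliable host works
INTERVAL = 30                     # seconds between checks

def is_up():
    try:
        socket.create_connection(CHECK_HOST, timeout=5).close()
        return True
    except OSError:
        return False

last_state = None
while True:
    state = is_up()
    if state != last_state:  # only log transitions, not every check
        with open(LOG_FILE, "a") as f:
            f.write("%s connection %s\n" % (datetime.now(), "UP" if state else "DOWN"))
        last_state = state
    time.sleep(INTERVAL)
[/code]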
Is there any way I can thoroughly monitor a user account in Windows 7? What the user has changed, which programs they opened, how long each ran, when they closed it, which other programs they opened, etc., and present the data in a logged format.
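Windows 7's built-in auditing and Parental Controls cover some of this, but for a rough program-usage log, here is a minimal sketch assuming Python 3 with the third-party psutil package (pip install psutil). It polls the process list and records when programs start and exit, with run time; it does not capture file changes, and the log path is an assumption.

[code]
import time
from datetime import datetime

import psutil

seen = {}  # pid -> (name, first_seen_timestamp)

def log(msg):
    with open("user_activity.log", "a") as f:  # hypothetical log path
        f.write("%s %s\n" % (datetime.now(), msg))

while True:
    current = {}
    for p in psutil.process_iter(attrs=["pid", "name"]):
        current[p.info["pid"]] = p.info["name"]
    # New processes since the last poll.
    for pid, name in current.items():
        if pid not in seen:
            seen[pid] = (name, time.time())
            log("started: %s (pid %d)" % (name, pid))
    # Processes that have exited since the last poll.
    for pid in list(seen):
        if pid not in current:
            name, started = seen.pop(pid)
            log("exited: %s (pid %d) after %.0f s" % (name, pid, time.time() - started))
    time.sleep(5)
[/code]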
I've installed Windows 7 Ultimate on my MSI PR200 (MS-1221), Intel(R) 965 Express Chipset Family, current latest BIOS version 1.5. It works perfectly. However, I want to set my screen to turn off after 2 minutes (whether plugged into AC or on battery), but it does not work. I've installed the drivers available online at the MSI web site, yet the problem is not gone.
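As a cross-check, the same timeout can be set from the command line, bypassing the Control Panel UI. A minimal sketch assuming Python 3 on Windows; it shells out to the built-in powercfg tool (the two commands can also be typed directly into a command prompt).

[code]
import subprocess

# Turn the display off after 2 minutes, both on AC power and on battery.
subprocess.run(["powercfg", "-change", "-monitor-timeout-ac", "2"], check=True)
subprocess.run(["powercfg", "-change", "-monitor-timeout-dc", "2"], check=True)
[/code]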
I'm about to have an Acer Revo delivered; it's for my bedroom TV. It's got Linux on it, but it'll get Windows 7 installed onto it. Now, what's the best way to install Windows 7 on it? Would it be better to connect it up to a monitor, install 7 from there, and then connect it to the TV? Or connect it to the TV and install 7? Would it recognize the TV straight away as the monitor?
I just built my computer and am having some trouble with my monitor and its connection to my computer. The computer takes a DVI cable, whereas my monitor takes VGA. They both can use HDMI, but the problem is that I recently read that HDMI doesn't display an image until Windows is booted, so I'm currently trying to install Windows from the disc without any picture on my monitor. How can I do this? I know I could use a TV or monitor with VGA, or get a DVI-to-VGA cord, but I kind of want it done now.
Is there a way of going from monitor 1 (laptop) and monitor 2 (flat screen) to the TV that's hooked up with an S-Video cable, without having to shut down, unplug monitor 2, and restart? When using Vista I could do this using an AVI program.
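One possibility, sketched under assumptions: Windows 7 ships with DisplaySwitch.exe (the tool behind the Win+P menu), which can retarget the output without a reboot. A minimal sketch assuming Python 3 on Windows; the same command can also be run from Start > Run.

[code]
import subprocess

# /external sends the picture to the secondary output (e.g., the TV);
# /internal switches back; /clone duplicates; /extend extends the desktop.
subprocess.run(["DisplaySwitch.exe", "/external"], check=True)
[/code]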
Suddenly my Windows 7 Ultimate won't recognize my ViewSonic VG2230wm LCD monitor. Installing the driver doesn't help. It always shows a non-PnP monitor and a max resolution of 1024x768, when it should be 1680x1050. The hardware ID shows just a default monitor. However, Windows 7 recognizes my other monitor correctly. I'm out of options.
I have a Samsung LS23CMZKFVAUZA, also called SyncMaster 2333 or 2333SW (confusing for me too, although the first one is the model number on the tag), and I am having such difficulty locating drivers. As time has passed, with all the new updates for the computer, the display has become very unstable. I can't open a second window of IE: the first window/tab is OK, but the second one sort of phases in and out, mostly out, and the display becomes mostly greyed out. Games such as WoW, especially in areas with snow, water, or a lot of sky, fade in and out too, usually out. I'm running Windows 7 with a GeForce GT 610 video card I put in just yesterday, thinking the old video card might have been going out.
OK, so I am at work, and whenever I want to maximize a window to fill the left side of the monitor, for some reason it maximizes and centers between both monitors. Does anyone know how to fix this? I know that probably didn't make much sense, but I attached a picture to show what I am talking about. I was trying to maximize this window by clicking it at the top and dragging it all the way to the top of the screen, but it centers in the middle.
My new PC with Windows 7 won't detect my old Acer V243W monitor. I was hoping to use dual monitors, with this monitor connected via VGA to the motherboard and my other monitor connected by DVI to the graphics card. The graphics card I am using is a Radeon HD 6870; it has DVI slots and an HDMI slot, but no VGA slot.
Even when the Acer is plugged into the motherboard by itself using VGA, it still does not get detected.
Is it simply a case of getting an adapter that will convert the VGA lead from the Acer monitor into a DVI output I can then plug into the graphics card?
Or perhaps there is a way of enabling VGA support, maybe via the graphics card or something? It seems odd to have a VGA socket on the motherboard if it is not going to work.
I have an ATI HD 6950 with two DVI monitors and one HDMI TV. Whenever I press Win+P and choose "Computer only", the picture goes to the HDMI output, which is not what I want. I've tried messing with Catalyst, trying to change the identities of the monitors, but the "Computer only" option always goes to HDMI because it's identified as number 1. When I choose "Projector only", the picture goes back to my two DVI monitors, extended. How do I change this?
I want "Computer only" to put the picture on just one of my DVI monitors, and "Projector only" to use the TV.
My problem is that if I connect the computer via DVI to a display, it functions normally, but when I hook it up via HDMI to my flat screen, it doesn't show the splash screen that gives you the option to enter the BIOS. The screen just stays black until the Windows welcome screen (Win 7 64-bit) comes up; from that point on it acts totally normal. When I try to press the Del key during boot, I believe it is entering the BIOS, but the screen remains black, so I can't see anything. I am guessing that the motherboard is not sending a signal to the flat screen until Windows loads, but I can't find anything that confirms that or says what to do to resolve it.
I'm using a Sager 3790 notebook with 2 GB of memory, a 1.7 GHz Centrino, and a Mobility Radeon 9700. Installation went perfectly. The card was detected by Windows Update and drivers were installed for it; they're from December 2008. OK, that's fine and dandy. But as soon as the drivers install, the highest resolution available on the primary display is 640x480. Why? Because Windows won't or can't detect my primary monitor: it shows up as an "Unknown display device." I can attach an external monitor and adjust its resolution all day. I can also remote into the laptop, and that works fine. I am nearly 100% confident that I cannot change the display resolution because the monitor won't detect correctly. I can't manually force an install of a monitor or change the "Unknown display device" driver, because that section is grayed out in the Advanced Settings under the Display Resolution tab.
I'm in trouble with my monitor, an AOC 511Vwb 15.6" widescreen LCD. The specs, in Portuguese: [URL]. I've been using it for something like 5 months, and a couple of times it has had an issue of not being recognized by my OS.
When it is recognized, I can set its native resolution of 1280x800 at 60 Hz, but right now the max resolution I can get is 1280x720, and that resolution looks very, very weird on my screen.
So I tried it on two different notebooks, one running Windows 7 and one running Windows Vista, and it worked perfectly only on the notebook with Vista.
And guess what: the monitor is "Vista Ready".
This problem has happened before and solved itself, but 2 weeks have now passed and nothing has gone back to the way it should be. Since I've tried basically everything, I'm out of ideas of my own and looking for new viewpoints.
I've been having this problem for a long time now and have found various unreliable solutions each time. The basic problem is that my computer turns on, the fans turn on, the mouse and keyboard turn on, the CD drive turns on, the disk access light blinks, and everything seems to be working fine, but the monitor won't turn on. I'm using dual LG monitors, connected to one of my two NVIDIA 8600 GT graphics cards, which are SLI'd together. I have about a 1-in-100 chance of the monitor actually turning on (which is to say that out of about 100 tries just turning the computer on and off, the monitor will turn on once). In the event that the monitor does turn on, I get the "Windows did not start up correctly" error, at which point I try to run Startup Repair, with varying degrees of success.
About 1 in every 5 times, a System Restore back to before I installed the updates will work. The other 4 times, System Restore will either fail, or when the computer restarts, the monitor again fails to turn on. However, when this does work, the updates I had installed end up uninstalled, and the next time I shut down my computer, I have to repeat the same process again. I've also confirmed that it must have something to do with the system updates: if I force shut down my computer without installing updates, I have no problems turning it back on. I'll get the startup message that Windows did not shut down correctly and be asked if I want to start in Safe Mode, etc., but picking "Start Windows Normally" works just fine. Additionally, if I hibernate the computer instead of shutting it down, it also starts up just fine. The problem only arises when I click Shut Down and updates are installed. Any other form of shutting down, or standby, works just fine.
I am trying to install Windows on a laptop with a broken screen, but it will not display on a CRT monitor when running setup. How would I change this? I'm trying to install either Vista or 7 on an HP dv9000 Entertainment series.
I've just purchased a copy of Windows 7 Pro and I'm attempting to install it. After some hiccups with being stuck at the blue wallpaper, I was finally able to fix the issue and click through the options to start the install. As the install starts and gets to the "Expanding files" progress bar, after 2-3 minutes my monitor goes into sleep or power-save mode: the screen goes black, my monitor switches between DVI and VGA several times, presumably looking for a signal, and then just goes into standby, like when I am inactive in Windows XP for a few minutes. My computer is still running; however, I am unsure whether the installation is still in progress. The DVD drive is still spinning and the activity LED is still flashing. Some people say to connect the monitor via VGA; however, I do not have a VGA port on my Gigabyte P35-DS3R motherboard. My video card is an NVIDIA 8800 GTS 512 MB with dual DVI ports. I only have one monitor connected for the install.
Every time I drag a new Windows Photo Viewer window to my second monitor, it freezes for a split second. It doesn't freeze completely, just as the window begins to enter the second monitor; after that, everything works fine. I used to be able to drag it over without any problem.
I have had two weird things in a row this morning, and I was hoping to get some opinions. I was the last one out of the office last night and the first one in this morning. I come in and my office door is locked; it is never locked, and I never lock it. Then, once in my office, I wake my computer, and Resource Monitor is running on my PC, a program I didn't even know existed. I understand Resource Monitor is a common tool that I just never noticed before, but is it possible that it started by itself?
Everything looks fine, but when Windows 7 starts, after the welcome screen there is no video on my monitor, even though its LED indicator is blue. I have changed hard drives and reinstalled Windows 7 several times, but the problem is still there. My PC also changes letters in messages. I did a reset of the BIOS, but nothing new happened.