I was using two monitors with two ATI Radeon 4800 cards and that worked fine. I just purchased another monitor (a Samsung) and now I cannot get it detected. Device Manager only shows 2 generic PnP monitors; I'm not sure why they don't show up under their specific brands either (one Samsung and one Acer).
Is there any way to get the third monitor detected? I've heard that deleting the graphics card drivers might get the monitor detected, but I don't know the procedure for doing that.
I'm running into this issue and I can't pin down a cause. I recently updated my Vista HTPC to Windows 7. When booting up I can get into the OS, but the screen goes black shortly after.
I have tried two monitors: the Panasonic plasma just goes black, and when I try the Samsung SyncMaster LCD it looks like it's searching for digital, then analog, sources.
I have an Nvidia 8400 GS card and I've tried both the 191.07 and 195.62 drivers. I've also let Windows install the drivers after uninstalling the Display Adapter in Device Manager in Safe Mode.
When I start in Safe Mode I cannot recreate the issue. I tried uninstalling/reinstalling the drivers in Safe Mode.
I'm running an ASUS P8H67-V REV 3.0 motherboard which has a VGA and a DVI-D output. When I connect a second monitor with a VGA to DVI-D adapter, it does not detect the monitor at all. I've tested the DVI-D port with a DVI to HDMI cable to my TV, and it works as soon as I hook that up. Below is the onboard graphics info.
So I got a new monitor for Christmas, an LG Flatron W1953T, and I wanted to get it running alongside my Dell monitor. I have an onboard NVIDIA GeForce 8300 and at first just tried to plug the new monitor into the DVI port, but I think that port is broken or something.
Anyway, I bought a Triton Technologies external VGA video card that just plugs into a USB port, so now both monitors are running over VGA. The second monitor is up, but the resolution looks terrible on it, and when I go to adjust the resolution it says I have only one monitor; if I hit Detect, nothing comes up.
In my Devices menu it lists the Dell monitor and then two other "Generic Non-PnP Monitors", even though I only have one other monitor connected.
I also went to the NVIDIA Control Panel and it doesn't say the second monitor is detected.
Both monitors are working: I can see my desktop background on the other monitor and I can drag stuff over to it, but it's just not being detected properly. I also want to download UltraMon (or should I get something else with Windows 7? I heard you can't have two taskbars with 7).
So I just upgraded from a GTX 560 Ti to a GTX 670 and now my computer can't detect my second monitor. My main monitor is plugged in via DVI while my second monitor is plugged in via HDMI. I've been in the NVIDIA Control Panel and tried finding my second monitor in the "multiple monitors" section; no second monitor there either.
New build: Asus M5A88-M EVO, Phenom II X6 1055T 2.8GHz, Samsung 840 Series 64GB SSD, Radeon HD 5570, Win7 Professional 64-bit
Annoying intermittent fault whereby the monitor is incorrectly detected as a generic model instead of an LG L192WS, and the screen resolution changes to a lower setting which loses a large percentage of the desktop. I've updated to the latest AMD drivers, as that seemed to work for others with a similar problem, but to no avail. It takes 5 mouse clicks to open the desktop settings and reset the display to the correct one, but after spending quite a bit to speed up my system I begrudge the time!
I have an Acer Aspire 5741-5979 with integrated Intel graphics. This is a new issue I've had recently. I used to connect an external monitor to my laptop and it worked fine. The issue is that the laptop still detects the external monitor, but the monitor does not receive a signal when I plug it into the VGA port (the blue trapezoid connector). I have played around with the display options an exhausting number of times and nothing works there. I tried what worked before (extending the display) and also tried cloning it. I thought it might be a driver issue and reinstalled the driver, but this didn't seem to help either. I used many monitors while it worked, and also tried different monitors after it stopped working, to no avail.
I have a three-monitor setup. Whenever I open a browser, either Internet Explorer or Google Chrome, on monitor 2 or 3, after a few seconds it automatically drags the browser onto my main monitor (monitor 1, where my Start button and taskbar are located).
I have a setup with 6 monitors. Today my main monitor broke and I couldn't find a way to shut the PC down, since I couldn't access the main desktop to get to the Start menu. Is there a way to change which monitor is the main monitor without having access to the current main monitor?
I am running a dual monitor setup with my TV hooked up via HDMI to my laptop.
Everything was working fine before: I always had the laptop as my main display, yet I could play a video or a game fullscreen on the TV.
For the past couple of days, as soon as I put something in fullscreen, it immediately goes to my laptop, regardless of whether it's open on my TV or my laptop. I don't remember changing or installing anything that could've caused that...
I checked a bit in the AMD Vision software and the Windows Control Panel, but I can't seem to solve my problem without switching my TV to be the main display. I also made a quick search on Google as well as here, but the problems were mainly with Flash, which isn't the cause here.
Here are my system specs:
Toshiba Satellite L75D, Windows 7 Home Premium 64-bit, AMD Radeon 6520G
I have just done a clean install of Windows 7 Ultimate from XP, and I am unable to get my dual monitors to work like they did in XP.
I have a DVI Radeon graphics card plugged into the AGP slot, and an Nvidia GeForce VGA adapter plugged into a PCI slot of my Dell OptiPlex 755. When I am in the screen resolution settings, the second monitor cannot be detected.
In the BIOS I have 2 options under primary video adapter: auto, and onboard card. When set to auto, the Nvidia card gets a signal but the Radeon does not. There are no conflicts in Device Manager, and all of the drivers are up to date.
When I change the BIOS option to onboard card, the Radeon adapter gets a signal but the other monitor cannot be detected, and in Device Manager there is a yellow exclamation mark next to Standard VGA adapter and a code 10 error that states the device could not be started.
I have powered down and unplugged every cable. I also tried to use the integrated VGA adapter with the Intel G31/G33/Q33/Q35 Graphics Controller, but the computer will not even boot. I get
"System Halted
Attention Unsupported Video Configuration Detected"
I have two monitors, both work fine as standalone but Windows will not detect either as a secondary.
Please help me, someone. I am so used to having my helpdesk email open on one monitor and all of my other work on the other monitor.
This computer is running an ATI Radeon HD 2400 Pro. It does support dual monitors (I've had dual monitors active on this card before, but never on Windows 7). But since installing Windows 7, I can't even get it to detect the second monitor. I want to run the setup with the CRT as the primary monitor and the HD as the secondary.
I recently had an older HP Pavilion Media Center m7760n desktop PC rebuilt. The old power supply fried the motherboard, so I needed to get a new power supply and motherboard. Here are my current specs.
Mainboard: Asus P5QPL-VM EPU, Chipset: Intel G41, Processor: Intel Core 2 Duo E6420 @ 2133 MHz, Physical Memory: 2048 MB (2 x 1024 DDR2-SDRAM), Video Card: Intel(R) G41 Express Chipset (Microsoft Corporation - WDDM 1.1), Hard Disk: WDC (1000 GB)
As you can see from above, the "video card" is actually integrated into the motherboard. The computer works perfectly except for one major problem. I have 2 monitors: one is 22" with 1680 x 1050 resolution and the other is 15" with 1024 x 768 resolution. At the back of my computer I have a VGA port and a DVI port. The 15" is connected to the VGA port and the 22" is connected to the DVI port.

When first starting the computer, the 15" monitor was recognized as the primary monitor and the 22" as the secondary. No problem: I simply went to the display settings and set the 22" to be the primary monitor and the 15" to be the secondary. Unfortunately, this setting seems to reset as soon as I reboot the computer. The 15" is always set as the primary monitor on startup, forcing me to apply the proper settings all over again. What's worse is that even after I have set the proper settings, they sometimes revert when using Media Center or other programs. Worse yet, sometimes both monitors go completely black, as if the monitor settings were about to switch but got locked up somehow.

I'm assuming that the onboard video has a primary port (VGA) and a secondary port (DVI), but even so, shouldn't Windows 7 be able to override this and save these settings so that the monitor configuration stays the same during startup and regular usage?
I'm using a television (32") as a second monitor in extended mode so that I can watch a movie on the TV and play a game on the monitor (this was my main goal). The monitor and TV are in two different rooms, both connected to the same PC, one by a normal VGA cable and the other by HDMI. I managed to split the audio output so that VLC player sends its audio to the HDMI (so only the TV plays it) while the rest of the system sounds, media players, and games go to the speakers (basically only the VLC audio is directed to another output device). I reached my goal: I can watch a movie fullscreen on the TV and play a game fullscreen on the monitor without any interference between them (neither audio nor video).
The thing is, because the TV is in another room I can't actually see what's going on on it, as I just "throw" the VLC window over to the TV from my main monitor. And here's the question: is there a way to see the TV's desktop on my monitor, without having to set it as the main monitor (so as not to actually switch between desktops)? The perfect thing would be if I could see the TV's desktop in a window, like in remote desktop applications.
I have AT&T DSL and it just drops out. I have a 2Wire router, and the DSL and Internet lights flash red when it drops. I have had this problem for almost a year, and AT&T will run a useless test and tell me everything is fine. I have spent 3 days searching for a broadband monitor that will tell me when the connection drops and for how long, and how many times it drops while I am at work or just not at the PC.
My computer is an Acer Aspire 3630 running Windows 7 32-bit. My Philips DVD drive is no longer detected by the computer. While I was installing a game, the computer shut down because of a low battery. When I started it again, the DVD drive was not detected; it does not show up in Device Manager or in the BIOS. What should I do?
I am using a Micromax 300c EVDO modem (datacard) for a wireless broadband connection. On one machine with an Intel 915 chipset motherboard it works fine: in Device Manager, under the Modems category, it shows correctly as "CDMA 2000 Rev A Modem". On another machine, with an Intel DG41TY motherboard, Device Manager does not show the modem; it only shows "Data services" under the Other devices category (in the Ports category, both the diagnostic and service ports are shown correctly).
I have a Toshiba Satellite C650 (PSC12C-00M00S) with a Core i3. It came with 4GB of RAM (2GB + 2GB). I bought two 4GB RAM modules and replaced the old ones with them. I have Windows 7 Professional 64-bit, but the problem is that it still shows 4GB in System Properties. I did check the RAM slots, and they seem to be fine.
OK, so this is with my sister's laptop; everything was apparently fine one day and not working the next. I've read it may be a driver issue and have reinstalled the correct (I think) driver version, but this hasn't fixed the problem.
I have an internal CD-RW drive, a CED-8080B, and it appears as "CD-ROM Drive" in the My Computer section. Why is it not appearing as a CD-RW drive?
It does function, opens CDs etc., but when I try to burn I get an error. I ran a Windows 7 troubleshooter and it said I could not burn with the drive; it told me the drive did not have burning capabilities. I burned CDs in Windows XP before I upgraded to Windows 7. I don't know why Windows 7 is not detecting it properly and only classes it as a CD-ROM.
So I just moved to a new school which uses the Acer TravelMate 6293 as their laptop. When they first gave it to me they put a brand new battery in it, and they gave me the wrong charger (great), but I went back and got the correct one.

Now, the laptop DOES charge. When I turn it on, it displays the battery icon with the charging icon over it, if that makes sense, but it has a red cross over it, and if you hover over that it says "no battery is detected". So you have no idea how long it has left to charge and so on. BUT if you take out the charger, the laptop will not turn on (as if the battery were flat, but surely it can't be, because it has charged for 24 hours).

Usually I would just go to the IT people at school, but seeing as they keep stuffing up and it is school holidays, I am hoping you kind and smart people have a solution. I have tried to get any dust out of the charger port, and have also taken out and re-inserted the battery twice. This is driving me a bit crazy as I have homework to do. So just to make things clear: the laptop works completely fine when connected to the charger, except for what I said before. This happens on both the home and school images, and it runs Windows 7, of course.
I just purchased a Gateway P7902h laptop on the 30th. It came with 64-bit Vista, and I decided to upgrade the OS to Windows 7 (64-bit as well).
Ever since the install, the laptop has been unable to detect my graphics card, a GeForce GTX 260M. Every time I run a driver installation program (the Gateway site only has drivers for Vista 64-bit, so I tried those, then the Windows 7 64-bit ones from the Nvidia website), I get a "could not locate any drivers that are compatible with your hardware" error.
Looking through Device Manager under Display Adapters, it only lists the Standard VGA Graphics Adapter and not the actual card itself. Can anyone help?
My USB keyboard works fine on my friend's PC running Win7 Ultimate 32-bit; I have the same OS, with Service Pack 1 also installed. The keyboard is an Amkette (better than the local brands). The problem is that it worked great for 5 months, but for the past 3 weeks it has been detected as an unknown device. Sometimes it is detected as a HID device and works well for 3-4 hours, but then it suddenly goes off and a malfunction/unknown device popup appears constantly; after that it won't work for the rest of the day, or maybe only for 10 minutes at a time. I unplugged all the power cables, uninstalled the USB controllers, and tried everything on the internet; some techniques work, but there's no permanent solution. I even reinstalled Windows last week, with no luck. It works on my friend's PC; the difference is that he has an Intel motherboard and I have a Biostar A785GE, everything up to date. The only solution left is to use a PS/2 to USB converter (I have to wait 3 weeks for it).
This gives me an error: "Failed to find a supported hardware rendering device. Ensure that your system meets the minimum requirements of Company of Heroes. Verify that DirectX is properly installed and that you have the latest drivers for your system." This annoys me! I know I have DirectX, and it is v11. The minimum requirement of the game is DirectX 9c, or better if you have the latest.
I have 2 DVD drives: a Blu-ray SATA writer and a Pioneer PATA DVD writer. I replaced my motherboard, and my new motherboard doesn't have any PATA port, so I decided to try this: SATA Serial-ATA to 3.5 IDE PATA Converter Adapter Cable | eBay
This thing works: the DVD drive is detected in the BIOS in UDMA mode 4. The drive is bootable with the proper media, and it is also detected and works fine in WinPE booted from my Blu-ray drive.
The problem is with my Windows 7 installation. The DVD drive is not detected after I boot. I have to go to Device Manager and click "Scan for hardware changes"; then the DVD-RW is detected and works fine after that.
I have tried many things: I uninstalled Daemon Tools and VirtualBox, and I tried a registry fix (LowerFilters, UpperFilters), with no luck.
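For reference, the filter fix usually suggested for missing CD/DVD drives removes the UpperFilters and LowerFilters values from the CD-ROM device class key. A sketch of it as a .reg file looks like this (export the key first as a backup; the GUID below is the standard CDROM class ID, and the trailing "=-" is .reg syntax for deleting a value):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E965-E325-11CE-BFC1-08002BE10318}]
"UpperFilters"=-
"LowerFilters"=-
```

A reboot is needed after merging it. If the values were left over from Daemon Tools or another virtual-drive tool, this is the fix that usually applies; if they were never present, the problem lies elsewhere.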
This has been a problem since moving to a Windows 7 64 bit machine.
I plug an HDD in via external USB and... nothing. It is not listed in the Computer "tree"; when I go to Administrative Tools > Disk Management it is not listed, and in the bottom tray there is no icon to safely eject the drive, as if it isn't even there. It is spinning, though, so I am afraid to unplug it lest I corrupt data on the drive.
I already posted this in the general discussion forum, but since then I think it has evolved from a Windows 7 problem to a hardware problem. When I turn on my laptop I'm brought to a Windows Error Recovery screen, from which I can choose either to launch Startup Repair or to start Windows normally. Eventually both options lead to Windows loading files and bringing me to a screen with 2 more options. In one I can choose the HDD/operating system, but there is nothing to choose. I can also choose to restore my computer using a system image, but none of the options given have helped me, other than identifying that the HDD is the problem. I can also load drivers, which brings me to the system32 folder on my X: drive.