Windows 7 Professional is installed and working great, except that I cannot get my second monitor to work. It worked fine in XP using a Tritton UV100 USB 2.0 adapter.
I downloaded the new Windows 7 drivers from Tritton and they installed fine. The second monitor is listed on the Devices page as a "USB2 VGA device", but under the Unspecified Devices column, and it does not show up in the Control Panel display settings where the screen is normally extended to two monitors.
I have tried every combination: connecting the monitor first, then the drivers; or the drivers first, then the monitor. It has to be something really simple that I am missing.
I'm running an ASUS P8H67-V REV 3.0 motherboard, which has a VGA and a DVI-D output. When I connect a second monitor with a VGA-to-DVI-D adapter, the monitor is not detected at all. I've tested the DVI-D port with a DVI-to-HDMI cable to my TV, and it works as soon as I hook that up. Below is the onboard graphics info.
OK, as I've been looking around the site I've noticed that there are many issues like this one, and I'm sure there are plenty of people out there who have this problem.
My Windows Experience Index is at 1.0 only because my Graphics and Gaming Graphics ratings are at 1.0. Everything else is at 5.9, which I can live with.
When I look at the advanced settings for my monitor, it shows my Nvidia GPU but with only 14 MB of total available memory, and I know that's not right. Unfortunately, when I went to Nvidia's site and ran the auto-detect, it said that I didn't even have their product and that I had the Standard VGA Graphics Adapter.
I can't seem to find the right drivers to fix all this, and I would love to be able to "Experience" Windows 7 at its fullest. Any help would be greatly appreciated. Hopefully the attached photos will give you an idea of what I'm working with. Also, when I troubleshoot Aero, I receive this message:
"The current video card may support Aero with a driver that is compliant with the Windows Display Driver Model (WDDM). Contact the manufacturer of your computer or video card for a WDDM-compatible driver."
And when I try to change my screensaver to a 3D one it states that I either need a newer graphics card or one that is compatible with Direct3D.
For some reason my USB 3.0 4-port hub is being seen as a USB 2.0 hub. I've tried unplugging the USB 3.0 cable, rebooting, and reattaching the cable to my USB 3.0 card, but it is still detected as a USB 2.0 hub.
I have an asrock 775dual-VSTA and I recently upgraded to Windows 7 Ultimate, 32bit.
Things seemed fine, but I realized my USB was performing very slowly.
For example, it takes hours and hours just to upload a few albums to my iPod.
I presume it's a motherboard driver problem. Does anyone know of a fix for this? I don't see any Windows 7 drivers from ASRock for this mobo.
I'm *really* hoping I don't have to buy a new one.
Here are some screenshots from my Device Manager. As you can see, USB seems to be fine, but there are some unknown devices that I can't identify. As far as I can tell, everything is working fine except the USB speed, so I presume these unknown devices are related?
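For scale, a quick back-of-the-envelope sketch of why a USB 1.1 fallback would explain the symptom (the transfer size and the 80% efficiency factor are assumptions, not measured figures):

```python
# Rough throughput arithmetic: if the controller has fallen back to
# USB 1.1 full speed, transfers take roughly 40x longer than USB 2.0.
USB11_FULL_SPEED_MBIT = 12    # USB 1.1 full speed, Mbit/s (spec maximum)
USB20_HIGH_SPEED_MBIT = 480   # USB 2.0 high speed, Mbit/s (spec maximum)

def transfer_minutes(size_mb: float, link_mbit: float,
                     efficiency: float = 0.8) -> float:
    """Estimated minutes to move size_mb megabytes over a link,
    assuming `efficiency` fraction of the raw bit rate is usable."""
    mb_per_s = link_mbit / 8 * efficiency
    return size_mb / mb_per_s / 60

# ~2 GB of music (a few albums, assumed size):
print(round(transfer_minutes(2000, USB11_FULL_SPEED_MBIT)))  # USB 1.1
print(round(transfer_minutes(2000, USB20_HIGH_SPEED_MBIT)))  # USB 2.0
```

If a sync that should take about a minute at USB 2.0 speed is taking the better part of an hour, the port really is running at USB 1.1 rates.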
I am running a VIA chipset which was working fine on Vista. I have now installed Windows 7, and all of my front-panel and motherboard-mounted USB 2.0 ports now work at USB 1.1 speeds (seriously making my iPhone-syncing life a misery).
If needed I will post my motherboard etc when at home, as can't remember off top of head.
I have tried flashing the BIOS and installing chipset drivers (those that would work on 7, anyway), but VIA USB drivers seem to have come through Windows Update since XP, so not much joy there, as there is nothing for 7 on Windows Update.
Any ideas? I thought there would be more about this issue out there.
I recently bought a 2.5" 1 TB Toshiba USB 3.0 drive, and it works fine on my USB 3.0 ports but not on my USB 2.0 ports. None of the people I have asked seem to know why. I have a 3.5" 2 TB USB 3.0 drive that works on both USB 3.0 and USB 2.0 ports, so the only thing I can think of is that the 2.5" drive draws more power, which the USB 3.0 ports can supply, allowing it to run.
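The power theory is plausible: the USB specs guarantee 500 mA per USB 2.0 port but 900 mA per USB 3.0 port, both at 5 V. A quick sketch of the budget (the drive's spin-up figure below is an assumption; the real number is usually printed on the drive's label):

```python
# Per-port current guaranteed by the USB specs (5 V bus voltage).
USB2_MAX_MA = 500   # USB 2.0: 500 mA per port
USB3_MAX_MA = 900   # USB 3.0: 900 mA per port
BUS_VOLTS = 5.0

def port_watts(max_ma: float) -> float:
    """Maximum power (watts) a compliant port must supply."""
    return BUS_VOLTS * max_ma / 1000

# A 2.5" drive needing ~3.5 W at spin-up (assumed figure) fits a
# USB 3.0 port's 4.5 W budget but not a USB 2.0 port's 2.5 W.
drive_spinup_w = 3.5
print(port_watts(USB2_MAX_MA) >= drive_spinup_w)  # USB 2.0 port: not enough
print(port_watts(USB3_MAX_MA) >= drive_spinup_w)  # USB 3.0 port: enough
```

The 3.5" drive works everywhere because it has its own power brick and draws almost nothing from the port. A powered hub or a dual-plug "Y" USB cable would be the usual workaround for the 2.5" drive on USB 2.0 ports.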
I connect an external device that supports USB 3.0 to a USB 3.0 port on the motherboard using a USB 3.0 cable. I can see in Device Manager that there are Intel drivers that support USB 3.0, and the external device (a WD My Passport) advertises USB 3.0 support. Is there a way in Windows 7 Home Premium to find out whether the connection is actually running at USB 3.0 or falling back to USB 2.0? Is the transfer speed the only hint about what is happening, i.e. does a higher speed mean that USB 3.0 is really being used?
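Two common checks: in Device Manager (View, Devices by connection), see whether the drive enumerates under the USB 3.0 (xHCI) host controller or under a USB 2.0 Enhanced (EHCI) controller; and measure a sustained large-file copy. Throughput is a strong one-way tell, sketched below (the ~40 MB/s cutoff is a rough real-world assumption, not a spec number):

```python
# USB 2.0 high speed is 480 Mbit/s on the wire, which works out to
# roughly 35-40 MB/s of real-world file throughput after overhead.
USB2_REALWORLD_CEILING_MBPS = 40  # MB/s; assumed practical USB 2.0 ceiling

def likely_link(measured_mb_per_s: float) -> str:
    """Classify a sustained large-file copy rate. A rate well above
    what USB 2.0 can physically deliver must be a USB 3.0 link."""
    if measured_mb_per_s > USB2_REALWORLD_CEILING_MBPS:
        return "USB 3.0 (SuperSpeed)"
    return "USB 2.0 or slower (inconclusive - the drive may be the limit)"

print(likely_link(90))  # typical USB 3.0 portable-drive rate
print(likely_link(33))
```

Note the asymmetry: a high rate proves USB 3.0, but a low rate is inconclusive, since the drive itself (or small files) can be the bottleneck even on a genuine USB 3.0 link.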
I have a Windows 7 installer DVD, but I would like to transfer the files to a USB 2.0 flash stick and install from there instead of from the DVD. Can someone post a link to instructions for this, and to the software I need to put the files on the stick and make it bootable? There seem to be a number of applications that do this, but I would like to know that I am using something that is safe and proven to work well.
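Besides Microsoft's own Windows 7 USB/DVD Download Tool, the standard manual method is diskpart + bootsect + a straight file copy. As a sketch, here is the usual diskpart sequence generated as a script (the disk number is an assumption; always confirm it against `list disk` first, because `clean` wipes whichever disk is selected):

```python
# Sketch: generate the diskpart script for the standard manual method.
DISK_NUMBER = 1  # ASSUMPTION: the flash stick; verify with `list disk`!

def diskpart_script(disk: int) -> str:
    """The usual sequence: wipe the stick, create one active NTFS
    partition, and assign it a drive letter."""
    return "\n".join([
        f"select disk {disk}",
        "clean",
        "create partition primary",
        "select partition 1",
        "active",
        "format fs=ntfs quick",
        "assign",
        "exit",
    ])

print(diskpart_script(DISK_NUMBER))
# Afterwards, from the DVD's boot folder:  bootsect /nt60 X:
# (X: = the stick's new drive letter), then copy the entire DVD
# contents onto the stick.
```

Save the output to a file and run `diskpart /s script.txt` from an elevated prompt, or just type the commands interactively.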
I know there are several threads about this, but they involve different cards and they didn't solve my issue...
I bought a new MEDION AKOYA P5334E and the system arrived installed normally with Windows 7 home edition.
I installed Windows 7 Enterprise (x64) on the computer, and many drivers could not be found. I've installed most of them, except the display adapter: it's stuck on "Standard VGA Graphics Adapter". I have an AMD Radeon HD 7570 card, but no matter what I try, nothing works. I tried uninstalling the current driver and downloading the driver from the AMD website (when I execute it, it runs but does nothing). When I ran the auto-detect from the AMD website, I got the message "we were unable to find your product or OS".
I have a Western Digital 1 TB USB 2.0 drive connected to a computer that has a Gigabit connection to my NAS, which can handle Gigabit speeds. I am only getting 30.1 MB/s. Are there any settings I can change in Windows so that it sends faster?
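Probably not: 30.1 MB/s is about what USB 2.0 can actually deliver, so the drive's USB 2.0 link, not the gigabit network, is the bottleneck. The raw-rate arithmetic:

```python
# Compare the raw link rates: USB 2.0 vs gigabit Ethernet, in MB/s.
def mbit_to_mb_per_s(mbit: float) -> float:
    """Convert a link rate in megabits/s to megabytes/s (8 bits/byte)."""
    return mbit / 8

usb2_raw = mbit_to_mb_per_s(480)    # USB 2.0 high speed: 60 MB/s theoretical
gige_raw = mbit_to_mb_per_s(1000)   # gigabit Ethernet: 125 MB/s theoretical
print(usb2_raw, gige_raw)

# Real-world USB 2.0 tops out around 30-40 MB/s after protocol
# overhead, so a measured 30.1 MB/s is the USB 2.0 drive saturating,
# with plenty of network headroom left over.
```

To go faster you would need the drive on a faster bus (USB 3.0, eSATA, or internal SATA), not a Windows setting.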
I have a three-monitor setup. Whenever I open a browser, either Internet Explorer or Google Chrome, on monitor 2 or 3, after a few seconds it automatically drags my browser onto my main monitor (monitor 1, where my Start button and taskbar are located).
I have a setup with six monitors. Today my main monitor broke, and I couldn't find a way to shut the PC down since I couldn't access the main desktop to get to the Start menu. Is there a way to change which monitor is the main monitor without having access to the current main monitor?
I am running a dual monitor setup with my TV hooked up via HDMI to my laptop.
Everything was working fine before: I always had the laptop as my main display, yet I could play a video or a game fullscreen on the TV.
For the past couple of days, as soon as I put something in fullscreen, it immediately goes to my laptop, regardless of whether it's open on my TV or my laptop. I don't remember changing or installing anything that could have caused that...
I checked around in the AMD VISION software and the Windows Control Panel, but I can't seem to solve the problem without switching my TV to the main display. I also did a quick search on Google as well as here, but the problems were mainly with Flash, which isn't the issue here.
Here are my system specs:
Toshiba Satellite L75D Windows 7 home premium 64-bit AMD Radeon 6520G
I have just done a clean install of Windows 7 Ultimate from XP, and I am unable to get my dual monitors to work like they did in XP.
I have a DVI Radeon graphics card plugged into the AGP slot, and an Nvidia GeForce VGA adapter plugged into a PCI slot of my Dell OptiPlex 755. When I am in the screen resolution settings, the second monitor cannot be detected.
In the BIOS I have two options under Primary Video Adapter: Auto and Onboard Card. When set to Auto, the Nvidia card gets a signal but the Radeon does not. There are no conflicts in Device Manager, and all of the drivers are up to date.
When I change the BIOS option to Onboard Card, the Radeon adapter gets a signal but the other monitor cannot be detected, and in Device Manager there is a yellow exclamation mark next to Standard VGA Adapter with a Code 10 error stating the device could not be started.
I have powered down and unplugged every cable. I also tried to use the integrated VGA adapter (the Intel G31/G33/Q33/Q35 Graphics Controller), but the computer will not even boot. I get:
"System Halted
Attention Unsupported Video Configuration Detected"
I have two monitors, both work fine as standalone but Windows will not detect either as a secondary.
Please help, someone. I am so used to having my helpdesk email open on one monitor and all of my other work on the other.
This computer is running an ATI Radeon HD 2400 Pro, which does support dual monitors (I've had dual monitors active on this card before, but never on Windows 7). But since installing Windows 7, I can't even get it to detect the second monitor. I want to run the setup with the CRT as the primary monitor and the HD as the secondary.
I recently had an older HP Pavilion Media Center m7760n desktop PC rebuilt. The old power supply fried the motherboard, so I needed to get a new power supply and motherboard. Here are my current specs:
Mainboard: Asus P5QPL-VM EPU
Chipset: Intel G41
Processor: Intel Core 2 Duo E6420 @ 2133 MHz
Physical Memory: 2048 MB (2 x 1024 MB DDR2-SDRAM)
Video Card: Intel(R) G41 Express Chipset (Microsoft Corporation - WDDM 1.1)
Hard Disk: WDC (1000 GB)
As you can see from above, the "video card" is actually integrated into the motherboard. The computer works perfectly except for one major problem. I have two monitors: one is a 22" with 1680 x 1050 resolution and the other is a 15" with 1024 x 768 resolution. At the back of my computer I have a VGA port and a DVI port. The 15" is connected to the VGA port and the 22" is connected to the DVI port.

When first starting the computer, the 15" monitor was recognized as the primary monitor and the 22" as the secondary. No problem. I simply went to the display settings and set the 22" to be the primary monitor and the 15" to be the secondary. Unfortunately, this setting seems to reset as soon as I reboot the computer. The 15" is always set as the primary monitor on startup, forcing me to apply the proper settings all over again. What's worse is that even after I have set the proper settings, they sometimes revert when using Media Center or other programs. Worse yet, sometimes both monitors go completely black, as if the monitor settings were about to switch but got locked up somehow.

I'm assuming that perhaps the onboard video has a primary port (VGA) and a secondary port (DVI), but even so, shouldn't Windows 7 be able to override this and save these settings so that the monitor configuration remains the same during startup and regular usage?
I'm using a 32" television as a second monitor in extended mode so that I can watch a movie on the TV while playing a game on the monitor (this was my main goal). The monitor and TV are in two different rooms, both connected to the same PC, one by a normal VGA cable and the other by HDMI. I managed to split the audio output so that VLC sends its audio to the HDMI output (so only the TV plays it), while system sounds, other media players, and games go to the speakers (basically, only the VLC audio is directed to another output device). I reached my goal: I can watch a movie fullscreen on the TV and play a game fullscreen on the monitor without any interference between the two (neither audio nor video).
The thing is, because the TV is in another room, I can't actually see what's going on on it; I just "throw" the VLC window to the TV from my main monitor. And here's the question: is there a way to see the TV's desktop on my monitor, without having to set it as the main monitor (so as not to actually switch desktops)? The perfect thing would be if I could see the TV's desktop in a window, like in remote desktop applications.
I have AT&T DSL and it just drops out. I have a 2Wire router, and the DSL and Internet lights flash red when it drops. I have had this problem for almost a year, and AT&T will run a useless test and tell me everything is fine. I have searched for three days trying to find a broadband monitor to let me know when it drops and for how long, and also how many times while I am at work or just not on the PC.
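If no packaged tool turns up, a tiny script left running can produce the evidence for AT&T. A minimal sketch (the probe host, 30-second interval, and log filename are all assumptions you can change): it tries a TCP connection to a public DNS server and appends a timestamped UP/DOWN line each time.

```python
# Minimal connection logger: probe a public DNS server periodically
# and append timestamped UP/DOWN lines to a log file for later review.
import socket
import time
from datetime import datetime

PROBE_HOST = ("8.8.8.8", 53)   # Google public DNS; any reliable host works
INTERVAL_S = 30                # assumed polling interval
LOG_FILE = "dsl_drops.log"     # assumed log filename

def link_is_up(timeout: float = 3.0) -> bool:
    """True if a TCP connection to the probe host succeeds."""
    try:
        with socket.create_connection(PROBE_HOST, timeout=timeout):
            return True
    except OSError:
        return False

def log_line(up: bool, now: datetime) -> str:
    """Format one log entry, e.g. '2014-05-01 09:30:00 DOWN'."""
    return f"{now:%Y-%m-%d %H:%M:%S} {'UP' if up else 'DOWN'}"

def monitor() -> None:
    """Run until interrupted, appending one line per probe."""
    while True:
        with open(LOG_FILE, "a") as f:
            f.write(log_line(link_is_up(), datetime.now()) + "\n")
        time.sleep(INTERVAL_S)

# monitor()  # uncomment to start logging; leave the window open all day
```

Counting consecutive DOWN lines in the log gives the number and duration of drops while you were away, which is exactly the record to hand to AT&T.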
I got a Bluetooth USB adapter (model number is ES-388, no brand) so that I could use my Bluetooth A2DP headset with my Windows 7 PC, but I can't seem to get it to work. Windows does not recognize the device (the adapter), and won't activate it.
Is there a driver that I need to install? The adapter itself came with nothing...
I'm trying to find a working driver for it on Windows 7. My little 3-year-old loving beast snapped my CD in half, and I don't know where I got the product. On the adapter it says "4-ch USB DVR".
I just installed Windows 7 RC1 32-bit onto my laptop. Without thinking I needed any drivers, I went to play World of Warcraft. It told me that it could not locate a display device and wouldn't work. I figured I needed to install the graphics drivers for my Nvidia 9-series card. When I went to install them, an error message appeared saying that it could not locate drivers compatible with my device.
I went to dxdiag, and under Display it shows that I have a Standard VGA Adapter; it won't recognize my actual graphics card. I'm not sure if I'm missing drivers; I haven't installed anything else except for this attempt at the Nvidia driver.
I am someone who needs to upgrade my video adapter to get Windows 7 onto my new 320 GB hard drive. I currently have a VIA/S3G UniChrome Pro IGP video adapter, and it says I need one compatible with Windows Aero support. Can anyone tell me what I need to do to make this happen? I just installed the second 320 GB hard drive; my first 80 GB drive has Windows XP installed. This is a machine with a 2.53 GHz Celeron processor, an Enpower machine built by PC Club when they were in business. I just maxed my memory out at 2 GB. It has been a good machine.