Graphic Cards :: VGA Connector Works On Motherboard But Not On GPU
Aug 1, 2014
I recently built my first gaming PC (Intel Core i7-3770, EVGA GeForce GTX 770, NZXT Phantom 240 - System Build - PCPartPicker Canada), and had a couple of problems with the display while the monitor was plugged into the motherboard (MSI Z77A-G41), which I then fixed... or so I thought. With the display connected through the motherboard, the monitors work, the computer boots, and I can install drivers for the GPU (GeForce GTX 770) and see that the computer recognizes it in Device Manager. But as soon as I plug the VGA connector into the GPU itself, it shows no display once again... literally nothing; the light on the monitor just starts flashing after a while.
I have tested two different monitors with the same result, and I tried playing around in the BIOS but did not find anything that fixed the problem. So my question is: does my GPU work even though it shows no display when I connect the VGA cable to the graphics card? If I were to install a game, would I be able to run it at high settings, or would it run on the integrated graphics (Intel HD 4000)?
I made a post before about the problem I had prior to this one, which was that the graphics card showed no display at all when plugged into the motherboard... [URL]
I have three monitors set up in Windows 8. Two of them connect to a Radeon 4670 and the third to the onboard HD Graphics 4000. They worked quite well in Windows 8. When I upgraded to 8.1, the HD 4000 monitor no longer comes up. The adapter is recognized, and Device Manager says it is working and has the most recent driver.
This is important to my workflow and is just one of a number of things not to love about 8.1. I would like to get this functioning before I tackle less important issues. Win 8.1 is installed on a test drive.
My computer is an HP Envy dv6 7280sf notebook with two GPUs, Intel HD Graphics 4000 and an NVIDIA GeForce 630M, running Windows 8 64-bit.
After making a free online upgrade to Windows 8.1, which ran successfully, both GPUs were OK.
But after receiving a Windows update with new drivers for the Intel HD Graphics 4000 GPU, and after installing it and restarting, the computer freezes on a blank black screen at Windows startup, with no mouse or anything, only the backlight.
Safe Mode worked, and I figured the Windows update with the new graphics drivers was the cause, so I removed the Intel graphics drivers and rebooted. The restart succeeded, but the Intel graphics card was now uninstalled and no drivers were installed automatically. I then tried to download and install the Windows 8.1 drivers from the HP and Intel websites, but the install fails.
Automatic install from Device Manager failed, both online and from the computer, and even with manual driver selection.
The install also failed in Safe Mode, and the High Definition Audio device (possibly the display audio) failed as well; that device is not recognized (no drivers installed). So I'm stuck without any drivers for the integrated GPU, and games no longer run normally even with the NVIDIA card on.
I tried downloading and installing the Windows 8 drivers, but they failed to install as well. In fact, any new driver install seems to fail for me, even for an external USB hard drive.
Here is the problem: everything boots just fine and runs with two of the GPUs installed. I have the latest AMD 13.12 drivers and the AMD APP SDK installed. When I add the third GPU, though, Windows auto-detects it as a Radeon 7900 with a caution sign, and/or an R9 200 series card, also with a caution sign; most times both. Then it randomly freezes the system. A couple of times I got it to run long enough, just by luck I guess, to install the 13.12 drivers on the third GPU, but within a few minutes the system would freeze again.
Two GPUs run just fine; the problem only shows up when the third card is installed, and it happens in any of the four 16x slots, or either of the two 1x slots on the motherboard. It doesn't matter whether the GPUs are directly on the motherboard or on 1x-to-16x risers. There are plenty of folks running this setup with up to six GPUs, so I know it can be done, but I must be missing something somewhere in the Windows setup.
I've got a bit of a strange issue that I haven't had with previous OSes on this PC. I'm running a Dell OptiPlex 380 with Win 8 x64 at the office and I have to run 2 video cards for all of my screens. I have a GeForce 8400GS in the PCI-E slot and a GeForce 6200 in the PCI slot.
Before the Win 8 install, it worked fine and booted exactly as I had shut it off. Now, with 8, it's killing my PCI-E card, presumably when it puts the screens to sleep when I leave for the day. So I come in to find all of my windows moved to the PCI card's monitor (the only one hooked to it), and I have to reboot to get the PCI-E card to come back.
If I check the screen properties, I see only the screen hooked to the PCI card. I've changed the BIOS setting for the first video card to Auto and also tried the PCI-E selection; it's the same either way.
Why on earth would it default to the PCI card rather than the PCI-E card?
I am on Windows 8, using an HP laptop. This is my CCC version.
I have two problems. The first is that I can't update to Windows 8.1: the update always hangs at 82%, and after that my laptop automatically shuts down. I don't know what the reason is, and that's why I'm stuck on Windows 8 until now. I do have the 8.1 ISO, but I don't want to do a clean install because I would have to erase the Windows 8 that came pre-installed, and I would lose my genuine Windows 8. So I want to know: if I mount the 8.1 installer with a virtual drive tool like Daemon Tools and run it (it does give an option to keep files, apps and settings), would it conflict with 8 and force me to format my laptop?

Now to the major problem. I recently purchased Watch Dogs and installed it. I played the first mission and got busted, so the mission failed and began to reload, and then an error popped up saying Watch Dogs has stopped working. Upon googling, I found out that this is due to Dual AMD Graphics, and the advice was to disable it, but I can't find any option for disabling Dual Graphics.
I've got an old 750 GB SATA HDD which will be fine for file serving (multimedia). It has the standard SATA connectors (both the small data connector and the power connector), but it ALSO has the 4-pin power connector that you get with older IDE drives (Molex?).
Do I need to connect BOTH power sockets, or can I just ignore the older 4-pin IDE-type connector?
I'm not sure why they included BOTH power connectors; maybe this drive was also used for making external HDDs at one time.
I'm minded just to connect the two SATA cables and ignore the other one.
I play a browser game on Chrome named BeGone (nplay). With my charger plugged in I get 100-130 FPS, but whenever I unplug it, it drops to something like 20-40 FPS. I checked the power settings and it's always on High Performance.
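In case it's useful, a hedged way to double-check which power plan is actually active from a command prompt (these are the stock Windows `powercfg` commands; the scheme GUIDs vary per machine):

```shell
REM List all power schemes; the active one is marked with an asterisk
powercfg /list

REM Activate the built-in High performance scheme via its alias
powercfg /setactive SCHEME_MIN
```

Note that even on High performance, many laptops apply a separate on-battery cap in the GPU vendor's own control panel (e.g. NVIDIA PowerMizer or AMD PowerPlay settings), which the Windows plan does not override.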
I just decided to turn loose Windows 8 RP on "bare metal" here as opposed to a VM. I didn't even bother to mess with video under a VM, but with the bare metal install, I've run into some trouble.
At first, Media Player didn't seem to play anything, including .wmv files. Messing around, I unchecked the "DirectX acceleration" box, and that allowed it to play .wmv files. However, it still won't play .mp4 files: no video, just sound, and Media Player will hang trying to exit.
It did this with a clean install using only the Windows-supplied video driver. My video card (HD 4200 chipset) has now been obsoleted by AMD, and I couldn't use the new Windows 8 preview driver package, but I did try installing the AMD 12.4 driver in Win7 compatibility mode. That worked fine, it seems, but it still won't play .mp4 files. Turning DirectX acceleration back on still kills .wmv playback as well.
Win7, both x64 and 32-bit played .mp4 files fine right out of the box on this machine.
I have the following graphics card, a Gigabyte GeForce GTX 650 Ti Windforce 2X (2GB), installed on the system in my sig. What is the correct BIOS setting for the GPU? The options are: PCI Slot, PEG, PEG1, PEG2 and PEG3.
I'm having issues with one of my GPU's fans (it features dual-fan cooling) and will have to fix it soon. The problem is that it's making an annoying noise, and I was wondering if I could just block it for the time being, however brutal that sounds.
So, the question is: provided the GPU doesn't reach critical temperatures (I'm going to monitor this, obviously), would this create any additional risks, like overcurrent or something? (I don't play games, but it's an HD 6950, so it does emit a lot of heat.)
I actually think it's integrated; how do I do it? According to running a random game check on systemrequirementslab.com (Dead Island), my dedicated video RAM is 32 MB. It's not custom-built; I bought it from the store.
Here are the specs according to dxdiag:
Operating System: Windows 8 64-bit (6.2, Build 9200)
System Manufacturer: Dell Inc.
System Model: Inspiron 5323
BIOS: A08
Processor: Intel(R) Core(TM) i3-3217U CPU @ 1.80GHz (4 CPUs), ~1.8GHz
Memory: 6144MB RAM
Page file: 3252MB used, 5310MB available
DirectX Version: DirectX 11
I am on my work computer, a DELL desktop with Windows 8.1, and I've been using a dual monitor setup for months, both with Windows 8 and 8.1. One monitor is connected via the VGA input, and the other is through a VGA to HDMI adapter. Again, they've been working flawlessly for months. It was a plug 'n play setup that took a minute and never had an issue.
I came in today and the second screen was blank. In the screen setup menu it recognized it and allowed me to adjust between extending, mirroring, etc. But after I unplugged it and plugged it back in, it no longer recognizes it. It simply cannot find a second monitor plugged in. Both screens work when plugged directly into the VGA port separately but not when plugged in the way they've worked for months.
I just purchased a new Windows 8 PC. I was attempting to hook the computer up to my TV through HDMI (the PC has no VGA output), and when I went through the normal paces to put the display on my TV (show desktop only on 2), the signal did not transfer and there is no display on either my TV or my PC! How do I revert the display back to my PC monitor without being able to see anything? Acer support was less than useful. It's an Acer Aspire Z3-605 all-in-one PC.
I've noticed that whenever I unplug my charger, my brightness dims down to a point where I know the screen is on, but I can't use it. Usually I have to put my computer to sleep and wake it back up to fix it. Also, the brightness won't adjust in any way when it's in that state. I've tried the function keys and the setting in the sidebar. I have a Toshiba laptop running Windows 8.1. I also recently completely wiped it, thinking it was a virus of some sort, with no change.
My Sapphire AMD HD 7870 just bit the dust, so it has been sent back with an RMA. I had hoped to replace it with an ASUS AMD HD 3850, but Windows 8.1 will not work with it. There is an error message on a blue screen, something about a header (I don't have the exact error message today).
I suspected it was a driver problem, so I uninstalled the existing driver and tried to install the latest AMD one. I can uninstall OK, but as soon as I reboot, the Microsoft driver installs itself, giving me no opportunity to install the AMD one.
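For what it's worth, when Windows keeps re-installing a driver from its local driver store, one hedged approach is to delete the stored package with the built-in pnputil tool before rebooting (the oem12.inf name below is a placeholder; enumerate first to find the real entry). This only helps if the offending driver is a third-party package in the store; the inbox Microsoft Basic Display Adapter cannot be removed this way.

```shell
REM Enumerate third-party driver packages in the driver store
pnputil -e

REM Delete and force-remove the package identified above
REM (oem12.inf is a placeholder for the entry pnputil -e reported)
pnputil -f -d oem12.inf
```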
Today, I got a new graphics card for my computer that is about 5 years old. Here are my specs:
Dell Inspiron 530
Intel Core 2 Duo e2500 2.33 GHz processor
NEW GPU: MSI GeForce GT 440 (N440GT)
OLD GPU: NVIDIA GeForce 8600GT
3GB RAM
300 watt power supply (unsure of 12V ratings)
Windows 8 Pro 64-bit
I installed my GT 440 into my computer today, and the card worked fine when there were no drivers installed. I uninstalled the older drivers for my 8600 and installed the drivers for the 440. When I restarted, I got a BSOD with an error message saying VIDEO_TDR_FAILURE. What does this message mean, and what can I do to fix this BSOD problem? This didn't happen on my old card, but it does on my new one. Is there a problem with my power supply, or is it an issue with the drivers?
I just bought an NVIDIA 620 card today. I run both Linux and Windows on my system. In Linux, everything works fine, but in Windows 8 the GPU fan starts to scream. It's too noisy! I even updated the drivers.
I recently updated to Windows 8.1 (64-bit), and I immediately noticed that my resolution had taken a strange turn. Everything except the taskbar seems to be a lower resolution than before. This suspicion was only strengthened when I launched a game that I had been running in 1440x900 windowed, and it didn't fit properly on the screen. I checked my display settings and used Print Screen to check the resolution, and both said 1920x1080, but when I used Puush to take a desktop screenshot, it came out as 1536x864, whereas before installing 8.1, desktop screenshots came out as 1920x1080. I tried changing the resolution and changing it back, but I'm still having the issue.
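The 1536x864 screenshot size is consistent with display (DPI) scaling rather than a real resolution change: Windows 8.1 enables per-display scaling by default, and a DPI-unaware capture tool sees the desktop at its scaled-down virtual size. A quick sanity check of that arithmetic, assuming the common 125% scale factor (an assumption, not something stated in the post):

```python
# If the desktop is 1920x1080 physical and Windows applies 125% DPI scaling,
# a DPI-unaware app sees the virtual size physical / scale.
physical = (1920, 1080)
scale = 1.25  # assumed 125% scaling factor

virtual = tuple(round(dim / scale) for dim in physical)
print(virtual)  # (1536, 864): exactly the screenshot size reported above
```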
I have 64MB of video memory reported in the BIOS, but my adapter shows only 32MB. Is it normal that I'm looking at two different things, or is something not right? I have the latest Intel HD 4000 drivers installed.
So I've been trying to update my graphics card driver. Before I ever got the notification for the update and tried to install it, I could play Minecraft and StarCraft and watch Netflix video. After the install failed with an I/O error box, I couldn't play any games or watch Netflix. I don't know what to do to fix this. Since then, I've gone to my manufacturer's website and re-installed my driver, but I think the real problem is that I can't connect it. It always says it failed due to the I/O error, so I can't connect it to the video card or something.
My computer is a Toshiba. Other info:
Model: Satellite L875D
Processor: AMD A6-4400M APU with Radeon(TM) HD Graphics, 2.70 GHz
Plenty of RAM (the computer is only 5 months old)
In Device Manager, under Display Adapters, it says Microsoft Basic Display Adapter. I just need to know how to change it. I've gone into Device Manager, pressed Update Driver, chosen to do it manually, and picked a signed driver. When I choose my AMD Radeon driver and it attempts to install, it always says it failed due to an I/O error. This has been going on for a week or so, and I thought I could fix it, but nothing works.
I had opened GeForce Experience and automatically updated my graphics drivers. The updater crashed while updating. I tried to update again, and it gave me a message saying my driver was up to date, no need to update. I opened Steam and tried to play a few games, and they all crashed. I also noticed games outside Steam were very sluggish. I checked Device Manager, and only my onboard video card was shown; my high-performance card was no longer listed under Display Adapters.
So I uninstalled GeForce Experience and the driver, then reinstalled an earlier driver version (I figured I could just install the previous version and it would effectively "roll back"). Same problem: my games would crash and things were slow. I removed those drivers and reinstalled earlier ones. Rinse and repeat, removing and installing different driver versions three times. After being advised not to use GeForce Experience to update my drivers and to go to the manufacturer's website instead, I did so: I removed my current video driver and installed the one I downloaded from the manufacturer.
Throughout this whole event, certain drivers make my video card not appear in Device Manager and require me to install a different driver to get the card to show up. Everything worked 100% fine until GeForce Experience crashed on me.
Currently, my video card is shown in Device Manager, the driver seems to be installed, and my games do in fact work. The problem is that the FPS is extremely low: before this issue I was getting around 240 FPS in certain Steam games where I now get 80. Before I updated the graphics driver, my entire laptop worked extremely well, and I'd had no issues. With that said, I believe that if I could completely uninstall the graphics driver and reinstall the one I had before all this mess occurred, things would be back to normal.
How do I figure out what version I used to have? I'm also not sure I'm uninstalling the driver completely (in Device Manager I'm just selecting Uninstall). I'm willing to roll my laptop back a few days, but I don't have a restore point created. Maybe the driver is not installed correctly?
Is there a different way to uninstall the driver, or maybe a way to freshly reinstall the video card? I don't know. If there is any simple way to restore my laptop to the way it was when I got it, I'm willing to do that as a last resort. Everything I have is backed up.
System Specs: OS: Win 8.1, CPU: i7-4700MQ, GPU: GeForce GTX 780M, RAM: 16GB, Make/Model: Dell Alienware 17.