Dual Monitor With Integrated Intel Card And Nvidia 8500GT
Feb 11, 2012
I just got my new PC, and I also got a monitor with it. The computer has an integrated Intel video card and an Nvidia 8500GT. Is it possible to plug the secondary monitor into the integrated Intel card and use the Nvidia for my main display?
I recently purchased a Samsung RF511 notebook and have been trying to play HD video files (MKV). They do run, but playback is very laggy and stutters often. So I looked into the matter a bit, and what I found is that the VLC player I am using runs off the onboard graphics card and not the Nvidia 540M card that is also installed in this notebook.
I have an EVGA motherboard with an integrated Nvidia 8200 graphics card. I am having issues with the latest Nvidia 8-series drivers (my monitor will go blank randomly for a second or two).
Should I have installed the nForce series drivers (the driver package that runs the entire motherboard), which are months older, because my graphics are integrated, or should I stick with the latest 8-series drivers?
I recently installed an Nvidia Quadro CX card under Windows 7 (risky, I know). Although I'm using the latest drivers from Nvidia, I can't get the second monitor to display through the DisplayPort-to-DVI adapter.
I can swap cables all day long and each monitor works fine with the DVI connector; it's just that nothing is detected when plugged into the second output (DisplayPort).
Has anyone else run into this problem with Windows 7 build 7000?
The original install of Windows 7 did not support dual monitors: the primary display was blue and the right display black, with no way out. I went back to a single monitor and installed the same driver I had gotten to work in Vista, and it worked great - 158.24_forceware_winvista_32bit_english
I'm on a Sony Vaio laptop with a second monitor connected with a VGA cable. I just switched from Vista to Windows 7, and my second 22" monitor is not being recognized under the resolution screen. It worked perfectly fine on Vista just hours prior. I tried hitting the Detect button, reconnecting the monitor, and restarting. What am I doing wrong?
It might be my graphics card; I'm pretty sure I have an Nvidia GeForce 7200M GT, but I can't seem to find drivers for it anywhere. My computer keeps telling me all my drivers are up to date, yet my second monitor is still not recognized. This doesn't make sense. Should I just go back to Vista?
OK, I have no idea on this one, and the good news is I can't fix it, LOL.
I have an HP m9340f and purchased a couple of new Nvidia graphics cards when I started getting the noisy fan issue.
I installed Windows 7 and everything worked great, but I was having problems with Windows Media Center after I went to a 56k connection, so I decided to format and start over.
Every time it installs the drivers for my Nvidia graphics card, my monitor goes black with no signal. I tried both of the different cards; same problem. I am guessing it is the recommended display settings, but since I only have one monitor I can't change them - I can't see to do it, and I have no other monitor to hook up to try.
I can start up in Safe Mode, so any ideas on how to fix this issue, or how to change everything to 800 x 600 so I can see when it starts up with the new drivers?
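Since Safe Mode works, one possible workaround (a sketch, assuming the Windows 7 boot configuration store is intact) is to force the next normal boot into base-video mode from an elevated command prompt, so the desktop stays visible even after the Nvidia driver loads:

```shell
:: Run from an elevated Command Prompt in Safe Mode.
:: Forces Windows to boot with the standard VGA driver at low resolution,
:: so you can reach the display settings after the Nvidia driver installs.
bcdedit /set {current} vga on

:: Once the display settings are corrected, remove the override:
bcdedit /deletevalue {current} vga
```

Alternatively, tapping F8 during boot and choosing "Enable low-resolution video (640x480)" does the same thing for a single boot without changing the boot entry.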
Looked around but nothing specific. If there's another thread with the answer feel free to post the link and smack me...
I have a Dell GX620 (a mid-size desktop that requires a low-profile PCIe graphics card) running Windows 7 Pro 32-bit. I'm looking for a low-profile card that will support dual DVI (expecting a Y cable) with dual monitors and an extended desktop across two different displays: a Dell 22" widescreen and a Dell 19" digital.
Can go VGA if needed but would like to go all DVI.
Another thing I'd like is certified Windows 7 drivers, versus just "compatible" ones or something I have to hack to make work. I tend to build out a box and let it run a few years with as little tweaking as possible...
I hope to get some advice on updating my video card before I install Windows 7. I ran Microsoft's Upgrade Advisor and was informed that my video card (ATI X1650) was not up to running Aero. I'd like to have that functionality, but it is essential that I have a card that runs dual DVI monitors, one of which, an Eizo ColorEdge, is color-calibrated.
I'm not a gamer so that stuff is moot for me but the ability to maintain high resolution color-perfect monitors is critical for Photoshop apps (my main app).
With that background, what video cards do you think I should be considering?
I'm running Windows 7 on an eMachines T2862, which has an Intel integrated chipset (845G, "Intel Extreme Graphics", specifically).
I've tried installing the XP drivers using the installer, as well as the tutorial floating around the forums for installing XP graphics drivers. Both caused Windows to crash shortly after booting: I had plenty of time to open a few programs, but about 45 seconds after booting, Windows simply freezes (mouse included).
I had to boot into Safe Mode and roll the driver back to the standard VGA driver to fix it.
Satellite A10 (model PSA10A-3V1KK) graphics driver. I'm trying to run Win 7 on this relic, and so far so good, except the graphics card is a freaking nightmare to sort out, and Toshiba are useless in most areas of support. Does anyone know where I can find non-proprietary drivers for the Intel 82852/855GM integrated graphics chipset?
I'm interested in setting up a multi-monitor setup. My main monitor will be for gaming, and the other will be for web browsing/server admin clients. Could I run one monitor off my GPU and the other off my Core i5's integrated GPU via the built-in HDMI connector, or would this confuse Windows? The reason I ask is that I can't see my ATI 6950 handling both BF3 and another monitor.
SAPPHIRE 100323L Radeon HD 6570 1GB DDR3 PCI Express 2.1 x16 HDCP Ready Video Card? I don't know much about computers internally. I've been researching and unfortunately could not find the answers I've been looking for. I currently have an Acer Aspire X1420G running an AMD Athlon II X4 645 processor at 3.10 GHz with 4GB RAM. The power supply is 220W, which is very weak from what I have read. So the question is: would I be able to upgrade to the Sapphire Radeon HD 6570 without having to swap out the PSU? I'm on a budget, hence the $450 computer, so I'm trying to avoid buying a PSU, but I will if I have to. Because if I'm going to spend the money to upgrade my performance, I want to do it right.
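For what it's worth, a rough back-of-the-envelope power budget shows why a 220W supply is marginal here. The CPU and GPU figures are published maximum TDPs; the "rest of system" number is my own loose assumption for the motherboard, RAM, hard drive, optical drive, and fans:

```python
# Rough worst-case DC power budget for the Acer Aspire X1420G upgrade.
cpu_tdp_w = 95        # AMD Athlon II X4 645 (published TDP)
gpu_tdp_w = 60        # Radeon HD 6570, approximate maximum board power
rest_of_system_w = 60 # assumption: board, 4GB RAM, HDD, optical, fans

total_w = cpu_tdp_w + gpu_tdp_w + rest_of_system_w
psu_w = 220

print(f"Estimated worst-case draw: {total_w} W on a {psu_w} W supply")
print(f"Headroom: {psu_w - total_w} W")
```

With only a few watts of worst-case headroom (and AMD recommending a 400W supply for this card class), a PSU upgrade is the safer path if the budget allows it.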
I am running Windows 7 on my old laptop (HP DV4305US, Intel Celeron with 1.25 GB RAM). Everything is great besides the sound. The integrated sound card installed and updated automatically, and everything looks normal, but the computer refuses to produce any sound.
I like Windows 7. It is fast even with older machines.
I hope someone can fix the SoundMAX issue. Unfortunately, HP does not have a Vista driver or the like, because the DV4305US was designed for XP.
I was having a problem with dual monitors using build 7127, but only when I use the newest NVIDIA drivers. My TV just flashes a few times (I can see that fish a couple of times), but then it just goes to no signal. If I use the default drivers installed with Windows, it works fine, but then I don't have the NVIDIA Control Panel and some other features.
My video card is a GF 8800GTS 650M 512MB DDR3 dual-DVI TV PCIe. I would like to know if anyone else is having this problem, and if it has been fixed in the newest build, so I can decide whether to bother with a clean install.
I am running a dual monitor setup with my TV hooked up via HDMI to my laptop.
Everything was working fine before: I always had the laptop as my main display, yet I could play a video or a game fullscreen on the TV.
For the past couple of days, as soon as I put something in fullscreen, it immediately goes to my laptop, regardless of whether it's open on my TV or my laptop. I don't remember changing or installing anything that could've caused that...
I checked a bit in the AMD Vision software and the Windows Control Panel, but I can't seem to solve my problem without switching my TV to be the main display. I also made a quick search on Google as well as here, but the problems were mainly with Flash, which isn't the issue here.
Here are my system specs:
Toshiba Satellite L75D, Windows 7 Home Premium 64-bit, AMD Radeon 6520G
I have just done a clean install of Windows 7 Ultimate from XP, and I am unable to get my dual monitors to work like they did in XP.
I have a DVI Radeon graphics card plugged into the AGP slot and an Nvidia GeForce VGA adapter plugged into a PCI slot of my Dell Optiplex 755. When I am in the screen resolution settings, the second monitor cannot be detected.
In the BIOS I have two options under primary video adapter: auto, and onboard card. When set to auto, the Nvidia card gets a signal but the Radeon does not. There are no conflicts in Device Manager, and all of the drivers are up to date.
When I change the BIOS option to onboard card, the Radeon adapter gets a signal, but the other monitor cannot be detected, and in Device Manager there is a yellow exclamation mark next to Standard VGA Adapter and a Code 10 error that states the device could not be started.
I have powered down and unplugged every cable. I also tried using the integrated VGA adapter with the Intel G31/G33/Q33/Q35 Graphics Controller, but the computer will not even boot; I get
"Attention: Unsupported Video Configuration Detected"
I have two monitors, both work fine as standalone but Windows will not detect either as a secondary.
Please help me, someone. I am so used to having my helpdesk email open in one monitor and all of my other work in the other monitor.
I sent my computer in to get repaired and got it back yesterday. Since then I've been trying to get it to work properly again. I can't play any of the games I used to play before I sent it in (for example, Battlefield 3). I can start Dota 2, but it looks pretty messed up. My computer shows that I have an Nvidia GeForce 7025 / nForce 630a graphics card, although the computer's description says it SHOULD have an AMD HD 6870 1GB GDDR5. My screen isn't plugged into the graphics card, since it doesn't receive anything from there; it is plugged directly into the motherboard. So, from what I understand, the Nvidia card is the integrated graphics card, or am I totally wrong? Anyway, Device Manager can only find the Nvidia card, and I cannot run any game smoothly on this computer. Before I realized this possibility, I tried downloading newer drivers for the Nvidia card, but it didn't help.
This computer is running an ATI Radeon HD 2400 Pro. It does support dual monitors (I've had dual monitors active on this card before, but never on Windows 7), but since installing Windows 7, I can't even get it to detect the second monitor. I want to run the setup with the CRT as the primary monitor and the HD as the secondary.
I purchased Diablo 3 for my husband, but my video card is not supported, so he's unable to play. My computer is an HP P624 f-b desktop with Intel GMA integrated graphics. I am aware I also need to upgrade my power supply. I have no idea what video card can be used to upgrade so the game is playable; I'm not looking to spend a fortune.
I see on my C: drive folders for Intel and NVIDIA at the same level as Program Files, Program Files (x86), ProgramData, Users, and Windows. Can I move the Intel and NVIDIA folders into one of the Program Files folders without messing anything up? I didn't see anything about it on Google.
I have heard that some versions of Windows 7, such as Starter or Home Basic, will not permit automatic switching to a standalone graphics card (Nvidia or ATI Radeon).
I wish to find out if this is correct and if so, which versions are restricted? I use Photoshop and play HD video files up to 2GB in size, so I want to ensure switching is performed for improved graphical performance. Based on this finding, I can then decide what O/S to buy with a new laptop I'm about to purchase.