I'm currently running dual monitors off my one graphics card. My motherboard has an HDMI port. Would it be possible to use that HDMI port on the motherboard and get a 3rd monitor? I would like to hook it up to my TV just to watch video files off my computer and maybe Blu-rays off my Blu-ray drive.
Once I turn my monitor off and on again, my TFT monitor has trouble getting an input signal again from the (still running) computer. It stays dark with a "no input signal" indicator, and I have to reboot the whole computer to get everything working again. I think that whenever a monitor is turned on, it normally sends a "hello, I am alive again" signal to the graphics card, and the graphics card then knows it has to send output to the monitor again. In my case there seems to be no such "hello" signal from the monitor to the graphics card. On the other hand, I can simulate such a signal with manual keystrokes and force the graphics card to resume sending output.
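If the keystroke workaround does the trick, one way to automate it is to ask Windows to re-apply the display mode stored in the registry, which makes the driver redo the mode set and renegotiate with the monitor. This is only a sketch of that idea in Python via the Win32 API (assuming Windows; whether it actually wakes this particular monitor isn't guaranteed):
[code]
import ctypes

# Re-apply the current display settings from the registry. Per the Win32 docs,
# passing NULL for the DEVMODE and 0 for the flags tells Windows to redo the
# mode set, which can make the graphics card resend a signal to the monitor.
ctypes.windll.user32.ChangeDisplaySettingsW(None, 0)
[/code]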
I just finished my build, and I have no idea why I went for a motherboard with only 2 slots for RAM (it can hold up to 16GB). Luckily, ebuyer sent me out an extra 4GB of Kingston HyperX RAM, so I have 8GB altogether, but there's a problem: that's 4 sticks! Obviously I wouldn't want to send it back, so what do you think: upgrade my motherboard and get the extra 4GB in there, or keep my original one? I don't have money to blow at all, I'm really struggling. If it's something that will really make a difference I'll go with it; if not, I'll just keep my original, and when I come to buy a graphics card I'll get a 2GB card to help with gaming.
I have a three monitor setup. Whenever I open a browser, either Internet Explorer or Google Chrome, on monitor 2 or 3, after a few seconds it automatically drags my browser onto my main monitor (monitor 1, where my Start button and taskbar are located).
I have a setup with 6 monitors. Today my main monitor broke and I couldn't find a way to shut the PC down, since I couldn't access the main desktop to get to the Start menu. Is there a way to change which monitor is the main monitor without having access to the current main monitor?
I am running a dual monitor setup with my TV hooked up via HDMI to my laptop.
Everything was working fine before, I always had the laptop as my main display yet I could play a video or a game fullscreen on the TV.
For the past couple of days, as soon as I put something in fullscreen, it immediately jumps to my laptop screen, regardless of whether it was opened on my TV or my laptop. I don't remember changing/installing anything that could've caused that...
I checked a bit in the AMD Vision software and the Windows Control Panel, but I can't seem to solve my problem without switching my TV to be the main display. I also made a quick search on Google as well as here, but the problems were mainly with Flash, which isn't the issue here.
Here are my system specs:
Toshiba Satellite L75D Windows 7 home premium 64-bit AMD Radeon 6520G
I have just done a clean install of Windows 7 Ultimate, coming from XP, and I am unable to get my dual monitors to work like they did in XP.
I have a DVI Radeon graphics card plugged into the AGP slot, and an Nvidia GeForce VGA adapter plugged into a PCI slot of my Dell Optiplex 755. When I am in the screen resolution settings, the second monitor cannot be detected.
In the BIOS I have 2 options under primary video adapter: auto and onboard card. When set to auto, the Nvidia card gets a signal but the Radeon does not. There are no conflicts in Device Manager, and all of the drivers are up to date.
When I change the BIOS option to onboard card, the Radeon adapter gets a signal, but the other monitor cannot be detected, and in Device Manager there is a yellow exclamation mark next to Standard VGA adapter and a code 10 error stating that the device could not be started.
I have powered down and unplugged every cable. I also tried to use the integrated VGA adapter of the Intel G31/G33/Q33/Q35 Graphics Controller, but then the computer will not even boot. I get
"System Halted
Attention Unsupported Video Configuration Detected"
I have two monitors, both work fine as standalone but Windows will not detect either as a secondary.
Please help me someone, I am so used to having my helpdesk email open in one monitor and all of my other work in the other monitor.
This computer is running an ATI Radeon HD 2400 Pro. It does support dual monitors... (I've had dual monitors active on this card before, but never on Windows 7.) But since installing Windows 7, I can't even get it to detect the second monitor. I want to run the setup with the CRT as the primary monitor and the HD as the secondary.
I recently had an older HP Pavilion Media Center m7760n desktop PC rebuilt. The old power supply fried the motherboard, so I needed to get a new power supply and motherboard. Here are my current specs:
Mainboard: Asus P5QPL-VM EPU
Chipset: Intel G41
Processor: Intel Core 2 Duo E6420 @ 2133 MHz
Physical Memory: 2048 MB (2 x 1024 MB DDR2-SDRAM)
Video Card: Intel(R) G41 Express Chipset (Microsoft Corporation - WDDM 1.1)
Hard Disk: WDC (1000 GB)
[code]....
As you can see from above, the "video card" is actually integrated into the motherboard. The computer works perfectly except for one major problem. I have 2 monitors: one is a 22" with 1680 x 1050 resolution and the other is a 15" monitor with 1024 x 768 resolution. At the back of my computer I have a VGA port and a DVI port. The 15" is connected to the VGA port and the 22" is connected to the DVI port.

When first starting the computer, the 15" monitor was recognized as the primary monitor while the 22" was recognized as the secondary monitor. No problem. I simply went to the display settings and set the 22" to be the primary monitor and the 15" to be the secondary monitor. Unfortunately, this setting seems to reset as soon as I reboot the computer. The 15" is always set as the primary monitor on startup, forcing me to apply the proper settings all over again. What's worse is that even after I have set the proper settings, they sometimes revert back when using Media Center or other programs. Worse yet, sometimes the monitors both go completely black... as if the monitor settings were about to switch but got locked up somehow.

I'm assuming that perhaps the onboard video has a primary port (VGA) and a secondary port (DVI), but even still, shouldn't Windows 7 be able to override this and save these settings so that the monitor configuration remains the same during startup and regular usage?
I'm using a television (32") as a second monitor in extended mode so that I can watch a movie on the TV and play a game on the monitor (this was my main goal). The monitor and TV sit in two different rooms, both connected to the same PC, one by a normal VGA cable and the other by HDMI. I managed to split the audio output so that VLC sends its audio to the HDMI (so that only the TV plays it) while the rest of the system sounds, media players and games output to the speakers (basically only the VLC audio is directed to another output device). I reached my goal: I can watch a movie fullscreen on the TV and play a game fullscreen on the monitor without any interference from one another (neither audio nor video).
The thing is, because I have the TV in another room, I can't actually see what's going on on it, as I just "throw" the VLC window to the TV from my main monitor. And here's the question: is there a way to see the TV's desktop on my monitor, without having to set it as the main monitor and actually switch between desktops? The perfect thing would be if I could see the TV's desktop in a window, like in remote desktop applications.
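One rough way to get that kind of preview is to periodically screenshot the region of the extended desktop where the TV lives and show it, scaled down, in a small window on the main monitor. Below is only a sketch of that idea in Python (it assumes Windows and the Pillow library; the TV_BBOX coordinates are placeholders for a TV sitting to the right of a 1920x1080 primary monitor, and hardware-accelerated fullscreen video may not always be capturable this way):
[code]
import tkinter as tk
from PIL import ImageGrab, ImageTk   # Pillow 6.2+ for all_screens support

TV_BBOX = (1920, 0, 1920 + 1280, 720)   # left, top, right, bottom of the TV area (placeholder values)
PREVIEW_SIZE = (640, 360)               # size of the preview window
REFRESH_MS = 1000                       # refresh once per second

root = tk.Tk()
root.title("TV preview")
label = tk.Label(root)
label.pack()

def refresh():
    # Grab the TV's slice of the extended desktop and shrink it for display.
    shot = ImageGrab.grab(bbox=TV_BBOX, all_screens=True).resize(PREVIEW_SIZE)
    photo = ImageTk.PhotoImage(shot)
    label.configure(image=photo)
    label.image = photo                 # keep a reference so it isn't garbage-collected
    root.after(REFRESH_MS, refresh)

refresh()
root.mainloop()
[/code]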
I have AT&T DSL and it just drops out. I have a 2Wire router, and the DSL and Internet lights flash red when it drops. I have had this problem for almost a year, and AT&T will run a useless test and tell me everything is fine. I have searched for 3 days trying to find a broadband monitor to let me know when the connection drops and for how long, and also how many times while I am at work or just not on the PC.
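If you can't find a ready-made tool you like, a small script can do the logging. Here is only a minimal sketch in Python (the host, port, check interval and log file name are my own placeholder choices): it checks connectivity once a minute and appends a timestamped line whenever the connection goes down or comes back, including how long the outage lasted.
[code]
import socket
import time
from datetime import datetime

HOST, PORT = "8.8.8.8", 53        # any reliable public host works (placeholder: Google DNS)
INTERVAL = 60                     # seconds between checks
LOGFILE = "connection_log.txt"    # placeholder log file name

def online(timeout=5):
    """Return True if a TCP connection to HOST:PORT succeeds."""
    try:
        with socket.create_connection((HOST, PORT), timeout=timeout):
            return True
    except OSError:
        return False

was_up = True
down_since = None
while True:
    up = online()
    now = datetime.now()
    if was_up and not up:
        down_since = now
        with open(LOGFILE, "a") as log:
            log.write(f"{now}  connection DOWN\n")
    elif not was_up and up:
        with open(LOGFILE, "a") as log:
            log.write(f"{now}  connection UP again after {now - down_since}\n")
    was_up = up
    time.sleep(INTERVAL)
[/code]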
I just bought a new case for my PC (CM Storm Trooper), and in the top front there are 2 USB 2.0 ports and 2 USB 3.0 ports built in (plus some other stuff). So I was wondering if there is a way I can get the 3.0 ports working on my ASUS M4A77T motherboard, which I don't think supports USB 3.0. Is there any adapter/PCI interface that can make these ports usable?
I have recently purchased an HP computer that had Windows Starter Edition. I reformatted the computer, installed Windows Vista Ultimate and then upgraded to Windows 7. I have everything working fine and managed to download most of the drivers. My main concern is that the card reader in my computer doesn't show up when I open Computer in normal mode, but it does show up in safe mode, which is weird. It is also using generic Windows 7 drivers, as I was unable to locate any drivers for the card reader. Maybe you understand what my concern is; waiting eagerly for a reply. My motherboard model no. is Foxconn MCP73M01H1.
CPU: i5 or AMD 1090T or whatever. I can pick out a motherboard that will support USB 3.0 and SATA 3, and of course I will have the 2 hard drives for them. I don't know crap about motherboards, so when I pick one out to build my new computer, what should I be looking at: bus speed and so forth? I don't want to pick out a motherboard that supports them but is slow as mud, so why bother. I want to do this right, and another hundred here or there doesn't matter, but I don't need a dragster, just a Corvette. What might I want to take a hard look at on a motherboard?
I'd like to know how to go about moving hard drives to a new mobo. My current setup is running on an LGA 775 socket, and it's got two drives, one of which holds games while the other has Windows 7 Ultimate, which I installed. I need to know if there would be any issues if I got a better 775 socket motherboard and transferred the hard drives over. Would everything on the secondary drive stay the same as it was and be a simple swap? And would I have to reinstall Windows on the primary drive, or would I just be able to type in the product key again?
Until a couple of weeks ago I was quite happily running my Nvidia 9400 GT GPU with no issues. Windows Update wanted to update, as it does every now and then, and afterwards I got a black screen. I switched to my onboard graphics and everything was fine. I tried the 9400 again and got nothing... it gets weirder... a couple of days later I turned the computer on again and the 9400 was working again... it did so for a few days, then black screen again. I've been out today and bought an Nvidia GT 620 and plugged it into the motherboard, but even that isn't showing up in Device Manager.
My PC is not working. It turns on for five seconds, turns off for five seconds, then reboots, but the screen displays nothing. I have checked the video card on a different PC, checked the HDDs, and the power supply; those are all working fine. I have assumed that the motherboard or CPU has died and needs replacing (is there a way to find out for sure?). My main point for this thread: could anyone please recommend me a really kickass motherboard, CPU, and RAM? I am also hoping to have one HDD dedicated to Mac OS X so I can use Final Cut Pro and avoid spending $2000+ on a Mac Pro, so a processor/motherboard compatible with both Windows and Mac would be excellent (is this even possible?).
I was recommended this processor: Newegg.com - Intel Core i7-2600K Sandy Bridge 3.4GHz (3.8GHz Turbo Boost) 4 x 256KB L2 Cache 8MB L3 Cache LGA 1155 95W Quad-Core Desktop Processor BX80623I72600K
About half a year ago my computer began to not always start. When I power it on, the fans get power (they are directly connected to the PSU) and spin up. When it does start, LEDs on the motherboard light up, the fans drop in RPM and the system boots. When it does NOT start, the LEDs won't light up, the fans stay at max RPM and the screen stays black. Keep in mind that when the system is up and running, everything works perfectly. It never crashes. It performs like it should in demanding applications such as games etc. Temperatures on CPU and GPU are normal. It should not be a problem with the PSU, as I have had the problem with different PSUs. One thing to note is that if I unplug the power to my additional HDDs, the computer starts pretty much every time, but not always (9 out of 10 or something). With both my HDDs connected it becomes more like 50/50 whether it's going to start or not. I'm thinking there might be a problem with the motherboard, but I'm not sure, since the system runs perfectly once it's up and running. If it were anything major, I assume the system would either not boot at all or it would have a lot of hardware-related crashes.
I have recently updated to Windows 7. My config is: Intel E4400 CPU, GA945GCMX2 motherboard, 2 GB RAM. When I multitask, it always gives a dump error and the display driver also stops working. Gigabyte only has LAN and audio drivers for Windows 7.