I recently had an older HP Pavilion Media Center m7760n Desktop PC rebuilt. The old power supply fried the motherboard, so I needed to get a new power supply and motherboard. Here are my current specs:
Mainboard : Asus P5QPL-VM EPU
Chipset : Intel G41
Processor : Intel Core 2 Duo E6420 @ 2133 MHz
Physical Memory : 2048 MB (2 x 1024 MB DDR2 SDRAM)
Video Card : Intel(R) G41 Express Chipset (Microsoft Corporation - WDDM 1.1)
Hard Disk : WDC (1000 GB)
[code]....
As you can see from the above, the "video card" is actually integrated into the motherboard. The computer works perfectly except for one major problem. I have two monitors: a 22" at 1680 x 1050 and a 15" at 1024 x 768. At the back of the computer I have a VGA port and a DVI port; the 15" is connected to the VGA port and the 22" to the DVI port.

When I first started the computer, the 15" monitor was recognized as the primary display and the 22" as the secondary. No problem; I simply went into the display settings and made the 22" the primary and the 15" the secondary. Unfortunately, this setting resets as soon as I reboot. The 15" is always the primary monitor at startup, forcing me to redo the settings all over again. What's worse, even after I've set things correctly, they sometimes revert while using Media Center or other programs. Worse yet, sometimes both monitors go completely black, as if the monitor settings were about to switch but locked up somehow.

I'm assuming the onboard video has a primary port (VGA) and a secondary port (DVI), but even so, shouldn't Windows 7 be able to override this and save the settings so that they stay the same during startup and regular use?
I have just done a clean install of Windows 7 Ultimate (coming from XP), and I am unable to get my dual monitors to work the way they did in XP.
I have a DVI Radeon graphics card plugged into the AGP slot and an Nvidia GeForce VGA adapter plugged into a PCI slot of my Dell OptiPlex 755. When I am in the Screen Resolution settings, the second monitor cannot be detected.
In the BIOS I have two options under Primary Video Adapter: Auto and Onboard Card. When set to Auto, the Nvidia card gets a signal but the Radeon does not. There are no conflicts in Device Manager, and all of the drivers are up to date.
When I change the BIOS option to Onboard Card, the Radeon gets a signal, but the other monitor cannot be detected, and in Device Manager there is a yellow exclamation mark next to Standard VGA Adapter with a Code 10 error stating the device could not be started.
I have powered down and unplugged every cable. I also tried using the integrated VGA adapter (the Intel G31/G33/Q33/Q35 Graphics Controller), but then the computer will not even boot. I get:
"System Halted
Attention Unsupported Video Configuration Detected"
I have two monitors; both work fine standalone, but Windows will not detect either one as a secondary.
Please help me, someone. I am so used to having my helpdesk email open on one monitor and all of my other work on the other.
First off, I'm using Windows 7 Professional 64-bit, an AMD Athlon II X4 620 at 2.6 GHz, a two-year-old PowerColor ATI Radeon HD 4870, an eight-year-old Philips 107T CRT monitor, 4 GB of RAM, and a 600 W power supply. I'm using a VGA-to-DVI adapter to connect the monitor to the 4870's DVI port. What's been happening is that sometimes my monitor goes off randomly; I can't reproduce the problem if I try. Another problem, which I'm not sure is related, is that sometimes when I boot the computer, eight beeps are heard and nothing displays on the monitor. I can get it to display by switching the system off and on, sometimes a few times. I just reinstalled Windows and got the latest graphics drivers, and I thought everything would be fine, but my monitor froze while I was surfing, and when I reset the system the problem happened again.
I am currently using a laptop connected to my TV. The TV is my primary display and the laptop screen is my secondary; the desktop is extended across them. I am planning to add a third monitor and was thinking of making it a touch screen. Would this work? Everything I have read says that if you use a touch screen it has to be the primary display.
I have an HP dv5215us that I have been running off a power cable with no battery for quite some time now, hooked up to a secondary display, a Dell E151FPp. Every now and then I will unhook the secondary display and take the laptop to work without issues. Last night, while running XP, I popped in my Win 7 install disc, clicked what I needed to, and let the installer run. Now that Win 7 is installed, I am running into the following two issues:
1) When the secondary monitor is unplugged, the computer will not power on, let alone boot. No blue lights, nothing.
2) When the computer is on and I unplug the secondary display (I learned the hard way), the laptop powers off instantly, as if I unplugged it from the power source.
I looked in Device Manager under Monitors and it only shows one. I am getting mirrored screens now, and under the display options, the resolution choices available to me look like those for the secondary monitor, not the laptop's screen. I installed Windows 7 with the secondary plugged in.
I bought a new battery for the laptop online last night. I know it sounds like a hardware issue rather than an OS issue, but the laptop and secondary display didn't move anywhere since it was running without issues in XP, so unless there was some kind of short or something while the installation was taking place, I can't think of why this would be happening.
I have seen a couple of dual monitor problems on this forum, but no solutions for my issue. I have Windows 7 32-bit with dual 4850 cards and CCC installed, with CrossFire configured. I am using two 20" LCD monitors; my primary works fine. Windows detects my second 20" monitor, and I have set the resolution and told it to extend my desktop, but there is no display.
When Windows is loading, both monitors display the Windows 7 splash screen, so I know the signal is getting there. I have also tried resetting the drivers, uninstalling them, etc. At no point have I been able to get Windows to display on both monitors. Any ideas?
When I start Windows after the machine has been off for a while, maybe a day or two, the primary monitor setting changes to my secondary display. Since I actually have two monitors on the video desk, it is fairly easy (for me) to change the setting back to what it needs to be. But I am not the only operator who uses this computer, and most of the volunteers know little about computers of any kind. Now you would think that someone with my technical background would have been able to fix this issue, and I can... using a third-party solution that costs money.
Edit: to clarify, I have been having this issue for a few years, but I have been the only person using the system, and it was happening on previous systems as well.
Specs of the system in question: Custom Built Desktop; Windows 7 Professional x64;
Is there any way in Windows 7 to force a specific full-screen program (like a game) to open on a secondary monitor? Or does full-screen stuff only work on the primary monitor?
I just finished upgrading my CPU, RAM, and mobo, and everything went off without a hitch, except that now Rage won't play on my default monitor. Other than the hardware, nothing has changed with the setup. All my other Steam games play on the default monitor except Rage. I'm at a loss here. As Windows sees them, monitor #1 is my secondary monitor and #2 is my primary. For a video card I have an XFX 6870 Black Edition. I updated my drivers and Catalyst Control Center to the latest versions after experiencing crash issues with Rage, so my drivers are all current.
Of course, it is easy to get a print screen of my primary monitor, but how do I get one of my secondary? Simply clicking on its background doesn't work.
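On Windows 7, pressing Print Screen copies the entire virtual desktop (all monitors laid side by side) to the clipboard, so the secondary screen can simply be cropped out of that image. If a scripted capture is preferred, here is a minimal sketch assuming Python 3 with the Pillow library installed (its all_screens option needs a reasonably recent Pillow release); the output filename is only an example.

[code]
# Sketch: capture the whole virtual desktop (every attached monitor) on Windows.
# Assumes: Python 3 with Pillow installed (pip install pillow).
from PIL import ImageGrab

# all_screens=True grabs the full virtual screen instead of only the primary
# monitor, so the secondary display ends up in the saved image and can be
# cropped afterwards with shot.crop((left, top, right, bottom)).
shot = ImageGrab.grab(all_screens=True)
shot.save("all_monitors.png")  # example filename
[/code]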
I'm using a Sager 3790 notebook with 2 GB of memory, a 1.7 GHz Centrino, and a Mobility Radeon 9700. Installation went perfectly. The card was detected by Windows Update and drivers were installed for it; they're from December 2008. Okay, that's fine and dandy. As soon as the drivers install, the highest resolution available on the primary display is 640x480. Why? Because Windows won't/can't detect my primary monitor; it shows up as an "Unknown display device." I can attach an external monitor and adjust its resolution all day. I can also remote into the laptop, and that works fine. I am nearly 100% confident that I cannot change the display resolution because the monitor won't detect correctly. I can't manually force an install of a monitor or change the "Unknown display device" driver, because that section is grayed out in the Advanced Settings under the Display Resolution tab.
Some background first. I have three 1920x1080 monitors. The central monitor is landscape; the two side monitors are portrait, used for working with web browsers, code, and communication. I find it easier to deal with large amounts of text in portrait, where I can track a lot of text continuously without scanning back and forth across the entire width of a widescreen monitor.

So, I'm trying to set a background, and it works fine. It's abstract-ish, so I set the 'Stretch' option. That works fine on the central monitor. On the two side monitors, the image is compressed between the left and right edges of the screen but does not stretch to reach the top or bottom. The extent to which it extends up and down seems to exactly match the dimensions and alignment of the central screen. If that is so, it would mean that on the portrait monitors the background is filling a 1080x1080 block in the center of a 1080x1920 display. The only option that functions properly is 'Tile'; none of the others seem to produce any change in the displayed image.
I run a dual monitor setup, and I use two user profiles and switch between them throughout the day. I have set up both user profiles identically, as follows: in each profile I run multiple instances of Internet Explorer 9, so I have four or five IE windows open with multiple tabs in each. I keep two IE9 windows open on my left monitor and three on my right. I also have MS Word and MS Excel open on my left monitor, and two Notepad windows open on my right.
When I click Start and switch user accounts, then go back into my first user account, my windows will have either moved off screen somewhere or moved onto the first monitor. It is inconsistent which windows get shifted around, but they end up moving nonetheless, and it ruins my workflow. I spend all my time putting the windows back in their 'correct' positions.
Switching to the second user account causes the same thing to happen to the windows set up on that account, thus defeating the purpose of using multiple profiles to make work more efficient. I recently switched to Windows 7 on a new computer. My "old faithful," which runs Vista 64, does not exhibit the window-shifting issue.
I am running a new second-gen Sandy Bridge system with an Asus H67 motherboard and an i7-2600. I have 16 GB of DDR3 and Win 7 64 Home Premium. An ATI Radeon HD 6570 runs both monitors, with the latest video drivers installed. I've encountered this issue with and without the ATI Catalyst suite installed, and with/without UltraMon installed.
I don't want to revert to Vista 64, since that would mean a lot of downtime to reinstall and transfer my profiles on the new machine, and I'd like to stay on the most current OS regardless, but I may have to switch back to Vista if this is a known issue on 7. We use dual-monitor Windows 7 setups at the company I work for (my day job), and I've verified that three of my coworkers also have this issue. I also replicated it myself on one of the work Win 7 machines.
I recently updated my nVidia drivers to the latest version and also my SiSandra Lite software (for quick access to accurate specs of my PC). After updating SiSandra Lite, I looked at the GPU section. As usual, it loaded the primary monitor first, where everything was fine. I then switched to look at my secondary, and my primary monitor turned off and my desktop shifted to the second one. Now when I boot the computer, my primary only comes on after the pre-boot and BIOS screens. Only when GRUB comes up does something appear on my primary monitor, and at a very low resolution (compared to what it used to be). The Windows 7 loading bar appears on my primary, but before the logo can appear and glow, it swaps to my secondary monitor, where it stays.
Windows 7 cannot detect my primary monitor: detection from the Screen Resolution settings doesn't find it, and it's missing from the Devices menu. I'm really confused about this whole business. I realise this may have been posted as a problem before; I did find little snippets of information about it, but nothing sufficient.
I have a three-monitor setup. Whenever I open a browser, either Internet Explorer or Google Chrome, on monitor 2 or 3, after a few seconds it automatically drags the browser onto my main monitor (monitor 1, where my Start button and taskbar are located).
I have a setup with six monitors. Today my main monitor broke, and I couldn't find a way to shut the PC down since I couldn't access the main desktop to get to the Start menu. Is there a way to change which monitor is the main monitor without having access to the current main monitor?
I just got a new SSD to replace a secondary regular HD that died. When I plug in the two SATA cables the previous drive was using and boot up the computer, the new SSD shows up as the main drive and the primary one is not shown in the BIOS, so I can't load Windows.
When I unplug the cables, the computer works normally again.
The SATA power connectors are all linked together (I think it's some kind of four-headed power connector; it's a prebuilt), if that makes any difference.
I am running a dual monitor setup with my TV hooked up via HDMI to my laptop.
Everything was working fine before: I always had the laptop as my main display, yet I could play a video or a game fullscreen on the TV.
For the past couple of days, as soon as I put something in fullscreen, it immediately goes to my laptop, regardless of whether it is open on my TV or my laptop. I don't remember changing or installing anything that could have caused that...
I checked a bit in the AMD VISION software and the Windows Control Panel, but I can't seem to solve the problem without making my TV the main display. I also did a quick search on Google as well as here, but the problems reported were mainly with Flash, which isn't the cause here.
Here are my system specs:
Toshiba Satellite L75D, Windows 7 Home Premium 64-bit, AMD Radeon 6520G
So earlier this year I built my first computer, which I sank about 2k into and which turned out fantastic. But, having no experience, I installed a 32-bit Windows 7 OS, thereby limiting my available RAM. Since I am a gamer, this is a bit of an issue. Now I have a new SSD, and I was wondering if I could simply load a 64-bit OS onto that drive and boot from there, thereby avoiding reformatting my current drive.
The primary drive is a 1 TB regular hard drive; the secondary is a 64 GB Samsung 830 SSD.
I currently have a standard computer setup with Windows 7 installed. What I want to do is dual boot, with Windows 7 as my primary OS, but connect my laptop's hard drive to the computer with an eSATA cable and have it as a secondary boot option. The laptop hard drive has a full install of Vista on it. Is this possible? Every time I think about doing it, I worry about the drivers on the laptop HDD and how they will react to my main computer's hardware, since obviously the laptop hard drive is set up for the laptop's hardware.
I recently reformatted my Windows 7 32-bit system on my primary HDD (C). When I open My Computer, my secondary HDD (D) appears, but when I try to open it, it shows as empty. However, if I look at the properties of D, it says that only 200 GB out of 500 GB of free space is available. I booted into a live version of Ubuntu and the secondary HDD appears with all of its data intact and accessible, so I know there is nothing physically wrong with the drive. I know I could brute-force this problem by making a copy of the drive in Ubuntu and then reformatting it in Win 7, but I'm hoping there's an easier solution that I'm just not aware of.
On my new build I have a 120 GB SSD and a 3 TB HDD. I would like to know how I can install my games so that most of the files sit on the HDD while the games still launch from the 120 GB SSD that has the OS installed on it.
This computer is running an ATI Radeon HD 2400 Pro. It does support dual monitors (I've had dual monitors active on this card before, but never on Windows 7). But since installing Windows 7, I can't even get it to detect the second monitor. I want to run the setup with the CRT as the primary monitor and the HD as the secondary.
I'm using a television (32p) as a second monitor in extended mode so that I can watch a movie on the TV and play a game on the monitor (this was my main goal). The monitor and TV are in two different rooms, both connected to the same PC, one by a normal VGA cable and the other by HDMI. I managed to separate the audio outputs so that VLC player sends its audio to the HDMI (so only the TV plays it) while the rest of the system sounds, media players, and games go to the speakers (basically only the VLC audio is directed to another output device). I reached my goal: I can watch a movie fullscreen on the TV and play a game fullscreen on the monitor without any interference between the two (neither audio nor video).
The thing is, because the TV is in another room, I can't actually see what's going on on it; I just "throw" the VLC window over to the TV from my main monitor. And here's the question: is there a way to see the TV's desktop on my monitor, without having to set it as the main monitor, so I don't really have to switch between desktops? The perfect thing would be to see the TV's desktop in a window, like in remote desktop applications.
I have AT&T DSL and it just drops out. I have a 2Wire router, and the DSL and Internet lights flash red when it drops. I have had this problem for almost a year, and AT&T will run a useless test and tell me everything is fine. I have spent three days searching for a broadband monitor that would tell me when the connection drops, for how long, and how many times, while I am at work or just not on the PC.
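If no off-the-shelf tool turns up, a small script can log the drops instead. Below is a minimal sketch in Python 3 (assuming Python is installed on the PC): it pings an external host at a fixed interval and appends a timestamped line to a log file whenever the connection goes down or comes back, so the number and length of outages can be reviewed later. The host, interval, and log file name are just placeholders.

[code]
# Minimal connection-drop logger (a sketch, not a polished tool).
# Pings an external host periodically and logs every change in connectivity.
import subprocess
import time
from datetime import datetime

HOST = "8.8.8.8"        # any reliable external host will do
INTERVAL = 30           # seconds between checks
LOGFILE = "dsl_drops.log"

def is_up(host):
    # Windows ping: -n 1 sends one echo request, -w 2000 waits up to 2 seconds.
    result = subprocess.run(
        ["ping", "-n", "1", "-w", "2000", host],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    return result.returncode == 0

was_up = True
while True:
    up = is_up(HOST)
    if up != was_up:
        with open(LOGFILE, "a") as f:
            state = "connection restored" if up else "connection dropped"
            f.write(f"{datetime.now():%Y-%m-%d %H:%M:%S}  {state}\n")
        was_up = up
    time.sleep(INTERVAL)
[/code]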
So I just reinstalled Windows on my new SSD. I'm trying to load all my music into iTunes and redirect my folders/Downloads/Documents locations to files from my main storage HDD (a 1 TB Seagate Barracuda), and it keeps telling me that I don't have permission to access these files/folders!
I can keep going in and individually enabling sharing on each folder, but is there some type of universal sharing option?
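A likely cause is that the files on the old drive are still owned by accounts from the previous installation. One common remedy (offered here only as a hedged sketch, not a universal sharing switch) is to take ownership of the old folders and grant the current account full control using Windows' takeown and icacls tools from an elevated prompt. The D:\OldData path below is purely a placeholder.

[code]
# Sketch: reclaim access to folders left over from the previous Windows install.
# Assumes it is run from an elevated (administrator) Python session on Windows.
import getpass
import subprocess

target = r"D:\OldData"      # placeholder - point this at the folder that denies access
user = getpass.getuser()    # the account that should receive full control

# takeown: /F = target, /R = recurse into subfolders, /D Y = answer yes to prompts.
subprocess.run(["takeown", "/F", target, "/R", "/D", "Y"], check=True)

# icacls: grant this user Full control (:F) over the whole tree (/T = recursive).
subprocess.run(["icacls", target, "/grant", f"{user}:F", "/T"], check=True)
[/code]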
I just got a Dell OptiPlex 755 running Windows 7 and tried hooking it up to my TV to use as a monitor. It starts up and gets as far as "Starting Windows" (where it's supposed to take me to the login screen), and then my TV just says "not supported." If I start in safe mode, it goes all the way through. I just don't understand why it doesn't work. I even tried lowering the resolution and then hooked it up again, and the same thing happens. It's driving me crazy. I have a 32" HCT TV.
Oh, also, I'm using VGA to hook it up, since the computer doesn't have HDMI out.
The computer was slow and seemed to have too many things running all the time, freezing, etc. I ran Malwarebytes and Webroot SecureAnywhere; they didn't find any problems. So I restored it to a week earlier, and then everything ran just fine for a few days. Now it seems to be doing it again. I went to Task Manager and found several items I cannot clearly identify even after searching online for them: Monitor.exe *32?
This may seem like an elementary question, but is it possible to run two PCs on one monitor (separately, of course)? If so, how would it be done, since there is only one hookup on the monitor? Some sort of splitter?