I need help because my monitor shows up as an Acer AL1714 and the resolution can only go up to 1280x1024, but I actually have a V7 22" monitor, so everything looks horrible.
Right now I have two monitors, both 22", and I also have two EVGA 9600 GSO cards running in SLI. I don't know what's wrong; I just turned the PC on and everything was resized, and this was right after I updated my NVIDIA drivers.
Today I installed Windows 7 Professional 64-bit on my Acer 5920G laptop. My problem is that in Device Manager the monitor driver shows as Generic PnP Monitor, and I can't install the right driver for my laptop.
I visited the Acer website and didn't find any monitor driver there. Is there anything else I could do?
I wanted a break from the HDMI sound through the monitor so I could use headphones, and after trying several other approaches, I disabled the HDMI sound. That worked great, but now that playback device option is completely gone! How do I get it back so I can use the (admittedly weak, but still functional) monitor sound?
I have a three-monitor setup. Whenever I open a browser, either Internet Explorer or Google Chrome, on monitor 2 or 3, after a few seconds Windows automatically drags the browser onto my main monitor (monitor 1, where my Start button and taskbar are located).
I have a setup with six monitors. Today my main monitor broke, and I couldn't find a way to shut the PC down since I couldn't access the main desktop to get to the Start menu. Is there a way to change which monitor is the main monitor without having access to the current main monitor?
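For what it's worth, here is a minimal sketch of one possible workaround, assuming Python with the pywin32 package is installed and that you can run a script blind (or over remote access): the Win32 ChangeDisplaySettingsEx API can promote another display to primary without ever opening the Screen Resolution dialog. The 0-based monitor index passed at the bottom is an assumption; adjust it to your arrangement.

[code]
import win32api, win32con

def set_primary_monitor(target_index):
    """Make attached display number `target_index` (0-based) the primary one."""
    # Collect every display device that is currently part of the desktop.
    devices, i = [], 0
    while True:
        try:
            dev = win32api.EnumDisplayDevices(None, i)
        except win32api.error:
            break  # no more adapters
        if dev.StateFlags & win32con.DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
            devices.append(dev)
        i += 1

    # The primary display must sit at the virtual-desktop origin (0, 0),
    # so shift every monitor by the target's current offset.
    target = devices[target_index]
    tdm = win32api.EnumDisplaySettings(target.DeviceName,
                                       win32con.ENUM_CURRENT_SETTINGS)
    dx, dy = tdm.Position_x, tdm.Position_y

    for dev in devices:
        dm = win32api.EnumDisplaySettings(dev.DeviceName,
                                          win32con.ENUM_CURRENT_SETTINGS)
        dm.Position_x -= dx
        dm.Position_y -= dy
        flags = win32con.CDS_UPDATEREGISTRY | win32con.CDS_NORESET
        if dev.DeviceName == target.DeviceName:
            flags |= win32con.CDS_SET_PRIMARY
        win32api.ChangeDisplaySettingsEx(dev.DeviceName, dm, flags)

    win32api.ChangeDisplaySettingsEx()  # commit all queued changes at once

set_primary_monitor(1)  # e.g. promote the second attached display
[/code]

For the narrower problem of just shutting down without a screen, the physical power button (if set to "Shut down" in Power Options) works even when the main monitor is dead.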
I am running a dual monitor setup with my TV hooked up via HDMI to my laptop.
Everything was working fine before: I always had the laptop as my main display, yet I could play a video or a game fullscreen on the TV.
For the past couple of days, as soon as I put something in fullscreen, it immediately jumps to my laptop, regardless of whether it was open on my TV or my laptop. I don't remember changing or installing anything that could have caused that...
I checked a bit in the AMD VISION software and the Windows Control Panel, but I can't seem to solve the problem without making my TV the main display. I also did a quick search on Google as well as here, but the problems reported were mainly with Flash, which isn't involved here.
Here are my system specs:
Toshiba Satellite L75D, Windows 7 Home Premium 64-bit, AMD Radeon 6520G
I have just done a clean install of Windows 7 Ultimate coming from XP, and I am unable to get my dual monitors to work like they did in XP.
I have a DVI Radeon graphics card plugged into the AGP slot and an NVIDIA GeForce VGA adapter plugged into a PCI slot of my Dell OptiPlex 755. When I am in the screen resolution settings, the second monitor cannot be detected.
In the BIOS I have two options under primary video adapter: Auto and Onboard Card. When set to Auto, the NVIDIA card gets a signal but the Radeon does not. There are no conflicts in Device Manager, and all of the drivers are up to date.
When I change the BIOS option to Onboard Card, the Radeon adapter gets a signal, but the other monitor cannot be detected, and in Device Manager there is a yellow exclamation mark next to Standard VGA Adapter with a Code 10 error stating the device could not be started.
I have powered down and unplugged every cable. I also tried using the integrated VGA output of the Intel G31/G33/Q33/Q35 Graphics Controller, but the computer will not even boot. I get:
"System Halted
Attention Unsupported Video Configuration Detected"
I have two monitors; both work fine as standalone displays, but Windows will not detect either as a secondary.
Please help me, someone. I am so used to having my helpdesk email open on one monitor and all of my other work on the other.
I have a Compaq Presario CQ56 64-bit (Windows 7 Home Premium) laptop that I bought before the summer. Today Windows Update offered me an Acer display update published on 28/12/2011. I thought Windows Update might have become confused because I had my VGA cable plugged into the TV, but despite checking for updates again, it still appears there as optional.
The AMD/ATI hardware is recognized by Windows and Windows Update, and there had already been a display driver update not long ago, so I cannot understand why I'm receiving an update for the wrong machine.
When I turn the computer on I get this screen: "Setup is starting services. Install Windows. The computer restarted unexpectedly or encountered an unexpected error. Windows installation cannot proceed. To install Windows, click OK to restart the computer, and then restart the installation." I contacted Acer, and they tried to help with Alt and F12, but nothing I tried works. They told me to order the recovery discs for $20.00. I did, and they don't work either. I contacted them back, and now they say to send the machine in and, for $199.00 plus shipping, they will repair it. I do not have the money for that. I have the system disc, the three recovery discs, and the language disc that I just ordered from Acer for $20.00.
When I hold down F10 and Alt at the same time at the splash screen while booting from the factory recovery disc, nothing happens and the Acer laptop just boots from the regular hard drive as usual.
This computer is running an ATI Radeon HD 2400 Pro. It does support dual monitors (I've had dual monitors active on this card before, but never on Windows 7). Since installing Windows 7, though, I can't even get it to detect the second monitor. I want to run the setup with the CRT as the primary monitor and the HD display as the secondary.
I recently had an older HP Pavilion Media Center m7760n desktop PC rebuilt. The old power supply fried the motherboard, so I had to get a new power supply and motherboard. Here are my current specs:
Mainboard: Asus P5QPL-VM EPU; Chipset: Intel G41; Processor: Intel Core 2 Duo E6420 @ 2133 MHz; Physical Memory: 2048 MB (2 x 1024 MB DDR2 SDRAM); Video Card: Intel(R) G41 Express Chipset (Microsoft Corporation - WDDM 1.1); Hard Disk: WDC (1000 GB)
As you can see from the above, the "video card" is actually integrated into the motherboard. The computer works perfectly except for one major problem.

I have two monitors: one is a 22" at 1680 x 1050 resolution and the other is a 15" at 1024 x 768 resolution. At the back of my computer I have a VGA port and a DVI port; the 15" is connected to the VGA port and the 22" to the DVI port. When first starting the computer, the 15" monitor was recognized as the primary monitor and the 22" as the secondary. No problem: I simply went to the display settings and set the 22" as primary and the 15" as secondary. Unfortunately, this setting resets as soon as I reboot the computer. The 15" is always set as the primary monitor on startup, forcing me to apply the proper settings all over again. What's worse, even after I have set the proper settings, they sometimes revert while using Media Center or other programs. Worse yet, sometimes both monitors go completely black, as if the monitor settings were about to switch but got locked up somehow.

I'm assuming the onboard video has a primary port (VGA) and a secondary port (DVI), but even so, shouldn't Windows 7 be able to override this and save my settings so that the monitor arrangement stays the same through startup and regular usage?
I'm using a 32" television as a second monitor in extended mode so that I can watch a movie on the TV and play a game on the monitor (this was my main goal). The monitor and TV sit in two different rooms, both connected to the same PC, one by a normal VGA cable and the other by HDMI. I managed to split the audio output so that VLC sends its audio to the HDMI (so only the TV plays it) while the rest of the system sounds, media players, and games go to the speakers (basically, only the VLC audio is directed to a different output device). I reached my goal: I can watch a movie fullscreen on the TV and play a game fullscreen on the monitor without any interference between the two (neither audio nor video).
The thing is, because the TV is in another room, I can't actually see what's going on on it; I just "throw" the VLC window over to the TV from my main monitor. And here's the question: is there a way to see the TV's desktop on my monitor, without having to set the TV as the main monitor (so I don't actually switch desktops)? The perfect thing would be to see the TV's desktop in a window, like in remote desktop applications.
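For what it's worth, here is a minimal sketch of that remote-desktop-style window idea, assuming Python with the mss and opencv-python packages: it grabs the TV's region of the virtual desktop in a loop and shows it in a small preview window on the main monitor. The monitor index is an assumption; mss numbers the physical displays starting at 1, so index 2 is taken to be the TV here.

[code]
import cv2
import numpy as np
import mss

with mss.mss() as sct:
    # sct.monitors[0] is the whole virtual desktop; entries 1, 2, ...
    # are the physical displays. Index 2 is assumed to be the TV.
    tv = sct.monitors[2]
    while True:
        frame = np.array(sct.grab(tv))           # BGRA capture of the TV
        preview = cv2.resize(frame, (960, 540))  # shrink for the preview
        cv2.imshow("TV preview", preview)
        # ~5 fps is plenty for checking on a movie; press q to quit
        if cv2.waitKey(200) & 0xFF == ord("q"):
            break

cv2.destroyAllWindows()
[/code]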
I have AT&T DSL and it just drops out; the DSL and Internet lights on my 2Wire router flash red when it does. I have had this problem for almost a year, and AT&T just runs a useless test and tells me everything is fine. I have spent three days searching for a broadband monitor that would tell me when the connection drops, for how long, and how many times it happens while I am at work or just away from the PC.
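In the absence of a ready-made tool, a small script can do this kind of logging. Below is a minimal sketch in Python, assuming it runs on a PC that stays powered on: it pings a public host at a fixed interval and writes a timestamped line whenever the connection state changes, so the log shows when the DSL dropped and for how long. The probe host and interval are assumptions; adjust them as needed.

[code]
import subprocess
import time
from datetime import datetime

HOST = "8.8.8.8"   # assumed probe target; any reliably-up host works
INTERVAL = 30      # seconds between probes

def is_up():
    """Send one ping, using Windows flags (-n count, -w timeout in ms)."""
    result = subprocess.run(
        ["ping", "-n", "1", "-w", "2000", HOST],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

last_state = None
down_since = None
with open("connection_log.txt", "a") as log:
    while True:
        up = is_up()
        now = datetime.now()
        if up != last_state:
            if not up:
                down_since = now
                log.write(f"{now:%Y-%m-%d %H:%M:%S} DOWN\n")
            elif down_since is not None:
                log.write(f"{now:%Y-%m-%d %H:%M:%S} UP "
                          f"(was down {now - down_since})\n")
            log.flush()
            last_state = up
        time.sleep(INTERVAL)
[/code]

Left running while you are at work, the log file gives the drop count and each outage's duration, which is also useful evidence to hand to AT&T.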
I just got a Dell OptiPlex 755 running Windows 7, and I tried hooking it up to my TV to use the TV as a monitor. It starts up and gets as far as "Starting Windows" (where it's supposed to take me to the login screen), and then my TV just says "Not Supported". If I start in Safe Mode, though, it goes all the way through. I just don't understand why it doesn't work; I even tried lowering the resolution and hooking it up again, with the same result. It's driving me crazy. The TV is a 32" HCT.
Oh, also, I'm using VGA to hook it up, since the computer doesn't have HDMI out.
The computer was slow and seemed to have too many things running all the time, freezing, etc. I ran Malwarebytes and Webroot SecureAnywhere; they didn't find any problems. So I restored it to a week earlier, and everything ran just fine for a few days. Now it seems to be doing it again. I went to Task Manager and found several items I cannot clearly identify even after searching online for them: Monitor.exe *32?
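One way to chase down an unfamiliar entry like Monitor.exe is to check where the executable actually lives on disk, since malware often borrows a legitimate-sounding name but runs from an odd folder. A minimal sketch, assuming Python with the psutil package:

[code]
import psutil

# Print the on-disk path of every running process whose name contains
# "monitor", to judge whether Monitor.exe comes from a vendor folder
# (e.g. Program Files) or somewhere suspicious like a temp directory.
for proc in psutil.process_iter(["pid", "name", "exe"]):
    name = proc.info["name"] or ""
    if "monitor" in name.lower():
        print(proc.info["pid"], name, proc.info["exe"])
[/code]

Without any scripting, right-clicking the process in Task Manager and choosing "Open File Location" gives the same information.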
This may seem like an elementary question, but is it possible to run two PCs on one monitor (separately, of course)? If so, how would it be done, since there is only one hookup to the monitor? Some sort of splitter?
I'm using my GTX with my Samsung LED monitor, and sometimes I use it with my 32" LED TV as an HTPC or to play games. My question is: will the card heat up from this arrangement? I only use one display at a time, and the other one is turned off, but the cables stay plugged in.
I use my TV as my desktop through an HDMI cable from the TV to my tower, but there is lag. Is there any better way to set it up? I've tried VGA, and that didn't make a difference. I know it's the TV, as I tried a laptop-to-TV connection too. Are there any settings on the TV that would stop this?
I really love native resolution, but my VGA monitor only goes up to 1024x768. Does anyone know where I could get a DVI monitor for, say, under $150 that's 15" or larger?
When I connect my laptop to my LCD TV, I can see Windows 7 starting to load on the TV screen. But as soon as Windows has finished loading, the TV gets no picture and Windows can't detect it. Also, some icons disappear from my laptop screen.
It seems like this BSOD occurs whenever I use this specific monitor, or maybe it's because I have internet. I started getting the issue after I built my computer, borrowed my friend's monitor, and got internet; whenever I don't use this monitor or this specific internet connection, I don't run into the problem.
I just completed the build of a new system and have the Windows 7 RTM software loaded on it. Since the system is new, I just have it connected to an old tube monitor, which isn't so great. I was looking at getting a new 23-inch LCD that can do 1920 x 1080, but I'm wondering whether I'll need a specific driver for the monitor or whether Windows 7 will see it automatically as plug and play.
So far, one monitor I'm looking at only has a driver for 32-bit Vista, and my machine is 64-bit. The other monitors didn't have any drivers at all, so I'm worried about buying something now. Should I be concerned?