I have an HP w1907 19-inch color monitor. It comes on, stays on for a few seconds, then pops and goes off. After a few seconds it comes back on, and it keeps cycling like this, taking longer to come back each time. I have hooked another monitor up to the PC and it works fine, although it doesn't have the built-in speakers; I can use some other speakers and they work fine. I have Windows 7 on that PC. I have gone to HP and downloaded the most up-to-date drivers, but that didn't make any difference either.
I have a new problem. My monitor has begun to blink repeatedly after coming out of power saving mode (when the screen turns off automatically due to inactivity).
I had no problems with this in the past; it just started out of the blue a few days ago. Initially it would blink only once or twice coming out of power saving mode; now it might blink 10 or more times before the monitor stays on. Once the monitor is on, it functions fine. The blinking only occurs after coming out of power saving mode.
I just installed Windows 7 Ultimate on my new PC (it was XP before) and I noticed that the HDD light blinks every second even when the PC is idle. Is that normal behavior for Windows 7?
I cannot seem to add a printer. The menu comes up with two options: Add a device and Add a printer. Add a device works, but Add a printer just blinks and returns to the same screen.
I used to have a program that makes this change, but I can't figure out what its name was. I reinstalled Windows 7 on my machine and forgot to save its installation file. It's a plain, all-gray, squarish program, with a menu on the left and the corresponding options in the right pane. It has a lot of cool, intuitive menus for everything one would want to change in Windows 7. I thought it was TweakUI from Microsoft, but it looks like there's no such thing for Windows 7. I just tried a program called Ultimate Windows Tweaker, but that's a different one that doesn't have the option mentioned in the title, and it's also less organized, IMO.
Since yesterday I've noticed that whenever I access a page with gray and white colors (Hotmail, Messenger, and the Latest News, Sport, Music, Movies, Money and Cars pages from MSN UK; that site is the only one so far), some lines appear and blink. This does not happen on white or black backgrounds. I've tried removing the video drivers (while they were uninstalled there were no lines, but when I installed them again the lines came back too), doing a clean install of Flash Player, and restarting in safe mode (in safe mode the lines did not appear). My GPU is a Radeon HD 6850, and I've tried both an old driver and a new one, but no luck. I don't have any other PCs, so unfortunately I cannot try the monitor on another one.
The only way I can get my computer to start is to cut, then restore, power. Then I get the beep, and the Windows introduction graphics start showing up. Then the machine abruptly shuts down and the green LED blinks 3 seconds on, 3 seconds off. The next time I cut and restore power, I may get a screen that offers to "restore". That option may or may not work. If it does work, the computer finally boots and acts normally. Or I might get a screen giving me the option to repair or start normally. Choosing to start normally usually works. What do I need to do?
I have a Lenovo PC 9210. All diagnostics pass and all hardware is good. It was running XP fine; I loaded Vista OK, then loaded Windows 7 32-bit and it ran OK. It needs Windows 7 64-bit, so I loaded Windows 7 64-bit. It reboots after the load begins, runs first-time setup, then completes as normal and reboots. The hard drive light blinks, then it starts to load Windows 7 but gets to the four colored dots and just sits there: no more HDD light, nothing.
I have a three-monitor setup. Whenever I open a browser, either Internet Explorer or Google Chrome, on monitor 2 or 3, after a few seconds it automatically drags my browser onto my main monitor (monitor 1, where my Start button and taskbar are located).
I have a setup with 6 monitors. Today my main monitor broke and I couldn't find a way to shut the PC down, since I couldn't access the main desktop to get to the Start menu. Is there a way to change which monitor is the main monitor without having access to the current main monitor?
I am running a dual monitor setup with my TV hooked up via HDMI to my laptop.
Everything was working fine before: I always had the laptop as my main display, yet I could play a video or a game fullscreen on the TV.
For the past couple of days, as soon as I put something in fullscreen, it immediately goes to my laptop, regardless of whether it was open on the TV or the laptop. I don't remember changing or installing anything that could have caused that...
I checked the AMD Vision software and the Windows Control Panel a bit, but I can't seem to solve my problem without switching my TV to be the main display. I also made a quick search on Google as well as here, but the problems reported were mainly with Flash, which isn't the issue here.
Here are my system specs:
Toshiba Satellite L75D Windows 7 home premium 64-bit AMD Radeon 6520G
I just did a clean install of Windows 7 Ultimate from XP, and I am unable to get my dual monitors to work like they did in XP.
I have a DVI Radeon graphics card plugged into the AGP slot, and an Nvidia GeForce VGA adapter plugged into a PCI slot of my Dell OptiPlex 755. When I am in the screen resolution settings, the second monitor cannot be detected.
In the BIOS I have two options under primary video adapter: auto and onboard card. When set to auto, the Nvidia card gets a signal but the Radeon does not. There are no conflicts in Device Manager, and all of the drivers are up to date.
When I change the BIOS option to onboard card, the Radeon adapter gets a signal but the other monitor cannot be detected, and in Device Manager there is a yellow exclamation mark next to Standard VGA Adapter with a Code 10 error stating that the device could not be started.
I have powered down and unplugged every cable. I also tried using the integrated VGA output of the Intel G31/G33/Q33/Q35 Graphics Controller, but the computer will not even boot. I get:
"System Halted
Attention Unsupported Video Configuration Detected"
I have two monitors, both work fine as standalone but Windows will not detect either as a secondary.
Please, someone help me; I am so used to having my helpdesk email open on one monitor and all of my other work on the other monitor.
This computer is running an ATI Radeon HD 2400 Pro. It does support dual monitors (I've had dual monitors active on this card before, but never on Windows 7). But since installing Windows 7, I can't even get it to detect the second monitor. I want to run the setup with the CRT as the primary monitor and the HD as the secondary.
I recently had an older HP Pavilion Media Center m7760n desktop PC rebuilt. The old power supply fried the motherboard, so I needed to get a new power supply and motherboard. Here are my current specs.
Mainboard: Asus P5QPL-VM EPU
Chipset: Intel G41
Processor: Intel Core 2 Duo E6420 @ 2133 MHz
Physical Memory: 2048 MB (2 x 1024 MB DDR2 SDRAM)
Video Card: Intel(R) G41 Express Chipset (Microsoft Corporation - WDDM 1.1)
Hard Disk: WDC (1000 GB)
As you can see from above, the "video card" is actually integrated into the motherboard. The computer works perfectly except for one major problem. I have 2 monitors: one is a 22" with 1680 x 1050 resolution, and the other is a 15" with 1024 x 768 resolution. At the back of my computer I have a VGA port and a DVI port. The 15" is connected to the VGA port and the 22" is connected to the DVI port.

When first starting the computer, the 15" monitor was recognized as the primary monitor and the 22" as the secondary. No problem; I simply went to the display settings and set the 22" to be the primary monitor and the 15" to be the secondary. Unfortunately, this setting seems to reset as soon as I reboot the computer. The 15" is always set as the primary monitor on startup, forcing me to apply the proper settings all over again. What's worse, even after I have set the proper settings, they sometimes revert when using Media Center or other programs. Worse yet, sometimes both monitors go completely black, as if the monitor settings were about to switch but got locked up somehow.

I'm assuming the onboard video has a primary port (VGA) and a secondary port (DVI), but even so, shouldn't Windows 7 be able to override this and save these settings so that the monitor configuration stays the same during startup and regular usage?
I'm using a television (32") as a second monitor in extended mode so that I can watch a movie on the TV and play a game on the monitor (this was my main goal). The monitor and TV are in two different rooms, both connected to the same PC, one by a normal VGA cable and the other by HDMI. I managed to separate the audio output so that VLC player sends its audio to the HDMI output (so only the TV plays it), while the rest of the system sounds, media players, and games output to the speakers (basically, only the VLC audio is directed to another output device). I reached my goal: I can watch a movie fullscreen on the TV and play a game fullscreen on the monitor without any interference between the two (neither audio nor video).
The thing is, because the TV is in another room, I can't actually see what's going on on it; I just "throw" the VLC window to the TV from my main monitor. And here's the question: is there a way to see the TV's desktop on my monitor, without having to set it as the main monitor (so as not to actually switch desktops)? The perfect thing would be if I could see the TV's desktop in a window, like in remote desktop applications.
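One possible approach (an assumption on my part, not something from the post above): since VLC is already in use here, its screen-capture input can show a chosen region of the extended desktop in a plain window on the main monitor, which is roughly the remote-desktop-style preview being asked for. The offsets below assume the TV is positioned to the right of a 1920x1080 main monitor in the Screen Resolution layout; adjust them to match the actual arrangement.

```shell
# Open a second VLC instance showing the TV's region of the extended
# desktop in a window on the main monitor.
# :screen-left/:screen-top = the region's offset in the virtual desktop
# :screen-width/:screen-height = the TV's resolution
vlc screen:// :screen-fps=15 :screen-left=1920 :screen-top=0 :screen-width=1920 :screen-height=1080
```

Capturing and re-rendering the desktop costs some CPU/GPU, so a modest `screen-fps` is kinder to the game running on the main monitor.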
I have AT&T DSL and it just drops out. I have a 2Wire router, and the DSL and Internet lights flash red when it drops. I have had this problem for almost a year, and AT&T will run a useless test and tell me everything is fine. I have searched for 3 days trying to find a broadband monitor that will tell me when the connection drops, for how long, and how many times while I am at work or just not on the PC.
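In case no ready-made tool turns up, a small script can do this kind of logging. The sketch below is a minimal example, not a polished product: the ping target (8.8.8.8), the 30-second interval, and the log file name are all assumptions, and the ping flags shown are the Windows ones (`-n`, `-w`).

```python
# Minimal connectivity logger: ping a host at a fixed interval and
# record every time the link goes down or comes back up.
import subprocess
import time
from datetime import datetime

def is_up(host="8.8.8.8", timeout_s=2):
    """Return True if a single ping to `host` succeeds.
    Note: -n (count) and -w (timeout in ms) are the Windows ping flags."""
    result = subprocess.run(
        ["ping", "-n", "1", "-w", str(timeout_s * 1000), host],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def summarize_drops(samples):
    """Turn a list of (timestamp, up) samples into a list of
    (drop_start, drop_end, duration) tuples, one per outage."""
    drops, down_since = [], None
    for ts, up in samples:
        if not up and down_since is None:
            down_since = ts                      # connection just went down
        elif up and down_since is not None:
            drops.append((down_since, ts, ts - down_since))
            down_since = None                    # connection came back
    return drops

def run_monitor(logfile="dsl_drops.log", interval_s=30):
    """Loop forever, appending a timestamped line whenever the link
    changes state, so the drops can be reviewed after work."""
    was_up = True
    while True:
        up = is_up()
        if up != was_up:
            with open(logfile, "a") as f:
                state = "UP" if up else "DOWN"
                f.write(f"{datetime.now().isoformat()} {state}\n")
            was_up = up
        time.sleep(interval_s)
```

Calling `run_monitor()` and leaving it running writes one line per state change; counting the DOWN lines in the log gives the number of drops, and the gap to the next UP line gives each outage's length.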
I just got a Dell OptiPlex 755 running Windows 7, and I tried hooking it up to my TV to use as a monitor. It starts up and goes along until it says "Starting Windows" (where it's supposed to take me to the login screen), and then my TV just says "not supported". But if I start in safe mode, it goes through fine. I just don't understand why it doesn't work. I even tried lowering the resolution and hooking it up again, and the same thing happened. It's driving me crazy. I have a 32" HCT TV.
Oh, also, I'm using VGA to hook it up, since the computer doesn't have HDMI out.
The computer was slow and seemed to have too many things running all the time, freezing, etc. I ran Malwarebytes and Webroot SecureAnywhere; they didn't find any problems. So I restored it to a week earlier, and then everything ran just fine for a few days. Now it seems to be doing it again. I went to Task Manager and found several items I cannot clearly identify even after searching online for them: Monitor.exe *32?
This may seem like an elementary question, but is it possible to run 2 PCs on 1 monitor (separately, of course)? If so, how would it be done, since there is only one hookup on the monitor? Some sort of splitter?
I'm using my GTX with my Samsung LED monitor, and sometimes I use it with my 32" LED TV as an HTPC, or sometimes play games on it. My question is: will the card heat up from this setup? I only use one display at a time and the other one is turned off, but the cables remain plugged in.
I use my TV as my desktop through an HDMI cable from the TV to my tower, but there is lag. Is there any better way to set it up? I've tried VGA and that didn't make a difference, so please help me get rid of the lag. I know it's the TV, as I tried a laptop with the TV too. Would there be any settings on the TV to stop this?
I really love native resolution, but you can only go up to 1024x768 with my VGA monitor. Does anyone know where I could get a DVI monitor for, say, under $150 that's 15" or above?
When I connect my laptop to my LCD TV, I can see Windows 7 starting to load on the TV screen. As soon as Windows has finished loading, the TV is no longer visible and Windows can't detect it. Also, some icons disappear off my laptop screen.
It seems like whenever I use this specific monitor this BSOD occurs, or maybe it's because I have internet. I got this issue after I built my computer, borrowed my friend's monitor, and got internet; whenever I don't use this monitor or this specific internet connection, I do not come across the problem.
I just completed the build of a new system and have the Windows 7 RTM software loaded on the system. Since the system is new I just have it connected to an old tube monitor which isn't so great. I was looking at getting a new 23 inch LCD that can do 1920 x 1080 but I'm wondering if I'm going to need a specific driver for the monitor or whether Windows 7 will see it automatically as plug and play?
So far one monitor I'm looking at only has a driver for 32-bit Vista and my machine is 64 bit. The other monitors didn't have any drivers so I'm worried about buying something now. Should I be concerned?
When I upgraded from Vista Premium 64-bit to Windows 7 (also 64-bit), my 23" monitor developed a 1"-wide black border, thereby reducing the usable area of the screen. How do I remove it? I hope I am in the right place with my question.