I have my power settings set to turn off my monitor after 5 minutes. It has always worked before, but now my monitor just goes black (still backlit) instead of off like it used to. This is on a Dell M6400 notebook.
I have a white box tower with an ASUS P5KPL-CM motherboard. When I turn it on I get a black screen and no signal to the monitor, so I plugged in my old Sony tower and the monitor works. So it's not the monitor or the cable (I've switched them a few times by now); I believe it is the new tower. I think it might be the VGA card, but I know little about these things. The manual says it is an Intel Graphics Media Accelerator (Intel GMA 3100). Or could it be something else? By the way, I have Windows XP, if that makes a difference.
The thread title about sums it up! When I try to wake my computer from sleep mode, my monitor gets no signal/is black. The "rest" of the computer seems to wake up just fine.
The monitor was working fine with Windows Server 2008 after sleep mode, but I just recently upgraded to Windows 7.
I have already tried running "displayswitch /extend" and "displayswitch /internal" from a .bat file,
using "Hibernate Trigger", "Power Triggers" and the Windows Task Scheduler.
Doesn't work at all for me (I've got Windows 7 Professional 32-bit, Service Pack 1).
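For reference, here is a minimal sketch of that .bat approach. It assumes DisplaySwitch.exe is in System32 (standard on Windows 7) and that forcing the projection mode to change and then change back is enough to make the monitor re-detect after resume; treat it as a starting point rather than a guaranteed fix.

[code]
@echo off
rem Sketch of the resume workaround: toggle the projection mode and back
rem so Windows re-initializes the display output after waking from sleep.
rem Assumes DisplaySwitch.exe is present in System32 (stock Windows 7).
%windir%\System32\DisplaySwitch.exe /extend
rem Give the graphics driver a few seconds to settle before switching back
timeout /t 5 /nobreak >nul
%windir%\System32\DisplaySwitch.exe /internal
[/code]

The .bat is usually hooked to a Task Scheduler task triggered on the resume-from-sleep event (the Power-Troubleshooter entry in the System event log), which is presumably what tools like "Power Triggers" automate.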
Basically, I have two user accounts on the system. When User1 hits "Windows Key+L" to lock the machine and User2 comes along and clicks "switch user", the monitor will flicker to black and say "Entering power save", while the monitor's power light switches from green to amber. After 5-10 seconds, the monitor will come back on (light goes back to green) and present the main login screen that lists the user accounts.
User2 then selects their account, types in their password and attempts to log into Windows, and again, monitor flickers to black, says "Entering power save" and stays black for a few seconds, only to re-appear and show the user's desktop. Nothing is affected, but it's just a real nuisance during what's supposed to be "fast" user switching.
This monitor flickering happens on every user switch, regardless of who initiates it. Both users also have identical graphical setups.
Graphics card is an ATI X1900XTX and monitor is a Dell 2007FP. I've tried running the built-in Windows WDM ATI driver as well as the ATI Catalyst 9.1 driver but it didn't make a difference.
I have a home-built rig running Windows 7 RC 7100. Very happy with it. I just had a DVI/HDMI adapter, a 15 ft HDMI 1.3 cable, and a 20 ft coax cable delivered. MONOprice.com for the win on those (less than $20, delivered in less than a week).
Anyway, I set up the cables and all to my Toshiba 37" LCD 37HL67 and the coax to my receiver. Audio is working, and even the dual-monitor setup through Windows 7 works great. However, I am having a problem getting the desktop to fit the FULL screen.
It is showing black bars on the top, bottom, left, and right, so it's not filling the complete screen. My TV's resolution is 1366x768, which I have Windows 7 set to; that seems to fill the most of the screen, but not completely. It does not matter whether I make it the main monitor or not, it won't change. My GPU is an ASUS 4830 from Newegg (another winner).
I do have the latest drivers, and I even tried to play with the CCC, but I started seeing problems because I was doing a trial-and-error session and I stopped quickly. I am not PC inept, but I am not the most savvy creature either. This one is beyond me at this point.
My monitor turns black like it's going to sleep, and then when I check my computer it's still working but freezes after the blackout. I've tried going to Control Panel and changing the power options, but it still crashes. It happens when I am searching the net, playing a game, or starting my computer. Any suggestions? My system is Windows 7 Home Basic, Processor: Intel(R) Core(TM)2 Quad CPU Q8400 @ 2.66 GHz 2.67 GHz, Installed memory (RAM): 2.00 GB, and it is a 32-bit operating system.
I loaded Windows 7 on three computers, 2 desktops and 1 laptop. My Dell XPS is the one that keeps flashing a black screen from time to time. It seems like it is thinking for a few seconds, but when this happens it is like my monitor turns off.
I have a three-monitor setup. Whenever I open a browser, either Internet Explorer or Google Chrome, on monitors 2 and 3, after a few seconds it automatically drags my browser onto my main monitor (monitor 1, where my Start button and taskbar are located).
I have a setup with 6 monitors. Today my main monitor broke and I couldn't find a way to shut the PC down, since I couldn't access the main desktop to get to the Start menu. Is there a way to change which monitor is the main monitor without having access to the current main monitor?
I am running a dual monitor setup with my TV hooked up via HDMI to my laptop.
Everything was working fine before, I always had the laptop as my main display yet I could play a video or a game fullscreen on the TV.
For the past couple of days, as soon as I put something in fullscreen, it immediately goes to my laptop, regardless of whether it's open on my TV or my laptop. I don't remember changing/installing anything that could've changed that...
I checked a bit in the AMD Vision software and the Windows Control Panel, but I can't seem to solve my problem without switching my TV to be the main display. I also did a quick search on Google as well as here, but the problems were mainly with Flash, which isn't the issue here.
Here are my system specs:
Toshiba Satellite L75D
Windows 7 Home Premium 64-bit
AMD Radeon 6520G
I just did a clean install of Windows 7 Ultimate, coming from XP, and I am unable to get my dual monitors to work like they did in XP.
I have a DVI Radeon graphics card plugged into the AGP slot, and an Nvidia GeForce VGA adapter plugged into a PCI slot of my Dell OptiPlex 755. When I am in the screen resolution settings, the second monitor cannot be detected.
In the BIOS I have 2 options under primary video adapter: auto and onboard card. When set to auto, the Nvidia card gets a signal but the Radeon does not. There are no conflicts in Device Manager, and all of the drivers are up to date.
When I change the BIOS option to onboard card, the Radeon adapter gets a signal but the other monitor cannot be detected, and in Device Manager there is a yellow exclamation mark next to Standard VGA adapter with a Code 10 error that states the device could not be started.
I have powered down and unplugged every cable. I also tried to use the integrated VGA adapter (the Intel G31/G33/Q33/Q35 Graphics Controller), but then the computer will not even boot. I get:
"System Halted
Attention Unsupported Video Configuration Detected"
I have two monitors, both work fine as standalone but Windows will not detect either as a secondary.
Please help me someone, I am so used to having my helpdesk email open in one monitor and all of my other work in the other monitor.
This computer is running an ATI Radeon HD 2400 Pro. It does support dual monitors... (I've had dual monitors active on this card before, but never on Windows 7.) But since installing Windows 7, I can't even get it to detect the second monitor. I want to run the setup with the CRT as the primary monitor and the HD as the secondary.
I recently had an older HP Pavilion Media Center m7760n desktop PC rebuilt. The old power supply fried the motherboard, so I had to get a new power supply and motherboard. Here are my current specs:
Mainboard: Asus P5QPL-VM EPU
Chipset: Intel G41
Processor: Intel Core 2 Duo E6420 @ 2133 MHz
Physical Memory: 2048 MB (2 x 1024 DDR2-SDRAM)
Video Card: Intel(R) G41 Express Chipset (Microsoft Corporation - WDDM 1.1)
Hard Disk: WDC (1000 GB)
[code]....
As you can see from above, the "video card" is actually integrated into the motherboard. The computer works perfectly except for one major problem.

I have 2 monitors: one is a 22" with 1680 x 1050 resolution and the other is a 15" with 1024 x 768 resolution. At the back of my computer I have a VGA port and a DVI port. The 15" is connected to the VGA port and the 22" is connected to the DVI port. When first starting the computer, the 15" monitor was recognized as the primary monitor while the 22" was recognized as the secondary monitor. No problem. I simply went to the display settings and set the 22" to be the primary monitor and the 15" to be the secondary monitor.

Unfortunately, this setting seems to reset as soon as I reboot the computer. The 15" is always set as the primary monitor on startup, forcing me to set the proper settings all over again. What's worse is that even after I have set the proper settings, they sometimes revert when using Media Center or other programs. Worse yet, sometimes the monitors both go completely black, as if the monitor settings were about to switch but got locked up somehow.

I'm assuming that perhaps the onboard video has a primary port (VGA) and a secondary port (DVI), but even so, shouldn't Windows 7 be able to override this and save these settings so that the monitor settings remain the same during startup and regular usage?
I'm using a television (32") as a second monitor in extended mode so that I can watch a movie on the TV and play a game on the monitor (this was my main goal). The monitor and TV are in two different rooms, both connected to the same PC, one by a normal VGA cable and the other by HDMI. I managed to separate the audio output so that VLC player sends its audio to the HDMI (so that only the TV plays it) while the rest of the system sounds, media players and games output to the speakers (basically only the VLC audio is directed to another output device). I reached my goal: I can watch a movie fullscreen on the TV and play a game fullscreen on the monitor without any interference from one another (neither audio nor video).
The thing is, because I have the TV in another room, I can't actually see what's going on on it, as I just "throw" the VLC window to the TV from my main monitor. And here's the question: is there a way to see the TV's desktop on my monitor, without having to set it as the main monitor, so I don't actually have to switch between desktops? The perfect thing would be if I could see the TV's desktop in a window, like in remote desktop applications.
I have AT&T DSL and it just drops out. I have a 2Wire router, and the DSL and Internet lights flash red when it drops. I have had this problem for almost a year, and AT&T will run a useless test and tell me everything is fine. I have searched for 3 days trying to find a broadband monitor to let me know when it drops and for how long, and also how many times it drops while I am at work or just not on the PC.
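In case it helps to have something in place while looking for a proper tool, here is a minimal sketch of a home-made drop logger as a .bat file. It assumes that a failed ping to 8.8.8.8 is a good-enough sign that the DSL link is down (an assumption, not a given), and it only writes a line when the state changes, so the log ends up showing when the connection dropped and when it came back.

[code]
@echo off
rem Minimal connection-drop logger (a sketch, not a polished tool).
rem Assumption: if a single ping to 8.8.8.8 fails, the link is down.
set "LOG=%USERPROFILE%\dsl_drops.txt"
set "LAST=UNKNOWN"
:loop
ping -n 1 8.8.8.8 >nul 2>&1
if errorlevel 1 (set "NOW=DOWN") else (set "NOW=UP")
rem Only log state changes, so each line marks the start of an outage
rem or the moment the line came back.
if not "%NOW%"=="%LAST%" (
    echo %date% %time%  connection %NOW%>>"%LOG%"
    set "LAST=%NOW%"
)
rem Check once a minute
timeout /t 60 /nobreak >nul
goto loop
[/code]

Leave it running in a console window and check dsl_drops.txt after a day away from the PC; each DOWN/UP pair gives the time and rough duration of a drop.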
I just got a Dell OptiPlex 755 running Windows 7 and I tried hooking it up to my TV to use as a monitor. It starts up and gets as far as "Starting Windows" (where it's supposed to take me to the login screen), and then my TV just says "not supported". But if I start in safe mode it goes through. I just don't understand why it doesn't work. I even tried lowering the resolution and then hooked it up again, and the same thing happened. It's driving me crazy why it won't work. I have a 32" HCT TV.
Oh, also, I'm using VGA to hook it up, since the computer doesn't have HDMI out.
The computer was slow and seemed to have too many things running all the time, freezing, etc. I ran Malwarebytes and Webroot Secure Anywhere; they didn't find any problems. So I restored it to a week earlier and everything ran just fine for a few days. Now it seems to be doing it again. I went to Task Manager and found several items I cannot clearly identify after searching online for them: Monitor.exe *32?
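One quick way to pin down an entry like that is to ask Windows where the process's executable actually lives on disk and whether it hosts any services, then look at that folder and the file's properties. A sketch using stock Windows 7 commands from a Command Prompt (swap in the exact name shown in Task Manager; "Monitor.exe" here is just the example from above):

[code]
@echo off
rem Show where the mystery process is running from.
rem Replace Monitor.exe with the exact name shown in Task Manager.
wmic process where "name='Monitor.exe'" get ProcessId,ExecutablePath
rem List any services hosted by that image name.
tasklist /svc /fi "imagename eq Monitor.exe"
[/code]

The folder path usually makes it clear whether the file belongs to a program you recognize (e.g. something under Program Files) or whether it is worth a closer look with the scanners you already ran.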
This may seem like an elementary question, but is it possible to run 2 PCs on 1 monitor (separately, of course)? If so, how would it be done, since there is only one hookup to the monitor? Some sort of splitter?
I'm using my GTX on my Samsung LED monitor, and sometimes I use it on my 32" LED as an HTPC or sometimes play games on it. My question is: will the card heat up from this setup? I only use one monitor and the other one is turned off, but the cables remain plugged in.
I use my TV as my desktop through an HDMI cable from the TV to my tower, but there is lag. Is there any better way to set it up? I've tried VGA and that didn't make a difference. I know it's the TV, as I tried a laptop to the TV as well. Would there be any settings on the TV to stop this?
I really love native resolution, but you can only go up to 1024x768 with a VGA monitor. Does anyone know where I could get a DVI monitor for, say, under $150 that's 15" or above?
When I connect my laptop to my LCD TV, I can see Windows 7 starting to load on the TV screen. As soon as Windows has loaded, the TV is no longer visible and Windows can't detect it. Also, some icons disappear off my laptop screen.
It seems like whenever I use this specific monitor this BSOD occurs, or maybe it's because I have internet. Well, I got this issue after I built my computer, borrowed my friend's monitor and got internet, but whenever I don't use this monitor or this specific internet connection, I don't come across this problem.
I just completed the build of a new system and have the Windows 7 RTM software loaded on the system. Since the system is new I just have it connected to an old tube monitor which isn't so great. I was looking at getting a new 23 inch LCD that can do 1920 x 1080 but I'm wondering if I'm going to need a specific driver for the monitor or whether Windows 7 will see it automatically as plug and play?
So far, one monitor I'm looking at only has a driver for 32-bit Vista and my machine is 64-bit. The other monitors didn't have any drivers, so I'm worried about buying something now. Should I be concerned?