I set the monitor resolution to 1680 x 1050 (recommended) with smaller text and it works fine. But when the PC is woken from standby, the resolution is now 800 x 600 with larger text, which is not what I want. How do I stop this happening, please? This is a new machine running Windows 7 Home Premium 64-bit with a 22" Edge TFT monitor.
While playing games like Red Faction: Guerrilla or Far Cry 2, the screen suddenly goes blank, then a new resolution takes effect, and after some time the game hangs.
Win 7 64-bit on a Gateway FX6860 i7 system, 8 GB RAM, tons of HD space. I got a "Page Failed to Load" error attempting to change my Windows 7 64-bit display resolution. This used to work, but no more (I don't know what might have caused it). I ran "sfc /scannow"; SFC reported that there were corrupted files that could not be repaired or replaced and said to look at CBS.log. I did, but can make no sense out of it.
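If it helps, CBS.log can be narrowed down to just the SFC entries. This is the filter Microsoft documents for reading SFC results (run from an elevated Command Prompt; the output path on the Desktop is just an example):

```shell
rem Extract only the System File Checker ([SR]) lines from CBS.log
findstr /c:"[SR]" %windir%\Logs\CBS\CBS.log > "%userprofile%\Desktop\sfcdetails.txt"
```

The resulting sfcdetails.txt lists which files SFC could and could not repair, which is far easier to scan than the raw log.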
Sometimes when UAC pops up my screen resolution changes from the native 1680x1050 resolution to 1024x768. There doesn't seem to be any reason for the change. Is anybody else having this problem or is it just me?
I have my screen resolution set to 1920x1080 and it is the right size for me, but it keeps changing on me. I right-click on my desktop and then click to keep it at this setting, but it still changes to 800x600. I took my computer in to get fixed, but they couldn't, so they put in a new hard drive for free, if that makes any difference. I have Windows 7 and Internet Explorer 8.
My LG W2243S worked fine on my computer before I upgraded to Windows 7 from XP Professional. Now it doesn't recognise it and won't let me adjust the resolution to the monitor's settings. I tried updating the driver but was told I already have the best driver available. My monitor is not listed in Device Manager, and when I download the driver from LG, nothing happens when I click it.
I have an ATI HD 6950 with 2 dvi monitors and 1 hdmi tv. Whenever I press the Win Key + P and choose "Computer Only" the picture goes to the hdmi output which is not what I want. I've tried messing with Catalyst trying to change the identities for the monitors but Computer Only option always goes to HDMI because it's identified as number 1. When I choose "Projector Only" the picture goes back to my 2 dvi monitors extended. How do I change this?
What I want is for "Computer Only" to put the picture on one of my DVI monitors, and "Projector Only" to put it on the TV.
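For what it's worth, the Win+P flyout is just a front end for DisplaySwitch.exe, so the same modes can be driven from a shortcut or script. This doesn't remap which output Windows treats as "computer" versus "projector" (that comes from how the driver enumerates the outputs), but it may help to see the correspondence; the switch names below are as shipped in Windows 7:

```shell
rem Win+P "Computer only"  -> main display only
%windir%\System32\DisplaySwitch.exe /internal
rem Win+P "Projector only" -> secondary output only
%windir%\System32\DisplaySwitch.exe /external
rem Win+P "Duplicate" and "Extend"
%windir%\System32\DisplaySwitch.exe /clone
%windir%\System32\DisplaySwitch.exe /extend
```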
Just installed Windows 7 Home Premium onto a new machine. The monitor reports the VGA signal as out of range (i.e. the resolution is too high). I have tried leaving the monitor disconnected until the machine has been on for 10 minutes (same result). I have tried to get it into Safe Mode, and it says "setup cannot continue in safe mode, computer will restart now", so basically I am stuck. I have gone into the CMOS setup on the motherboard and reduced the onboard graphics memory to 64 MB in an attempt to drop the resolution. I do not have any other monitors available (within ten miles).
I have a Core i3 processor on an Intel DH55TC motherboard with HD Graphics. The system is new, and I installed Windows 7 with the drivers provided with the Intel motherboard. But once I installed the graphics driver, my monitor shows a lot of dots on screen. I have an LG 16" LCD monitor. If I uninstall the graphics driver the dots go away, but then the correct resolution for my monitor is not available.
I really hope that you can help me. I installed Windows 7 RC and after install my monitor resolution is set to 800x600. If I try to raise it, I get just one more option and when selecting it, I get a black vertical bar on the right of the monitor with still very bad resolution.
My display adapter is listed as Intel(R) 82865G Graphics Controller (Microsoft Corporation XDDM)
and my monitor is: SyncMaster 943NW/943NWX/NW1943/NWX1943.
I hope you can help me find a solution to this problem.
Grandma's computer - trying to troubleshoot from Chicago for a computer located in California. About to kill myself. Please help.
Graphics card on computer is an integrated Intel G33/G31 Express Chipset Family adapter. I purchased a VGA splitter so that she can connect it to her LCD monitor and Plasma TV.
The monitor's native resolution is 1680 x 1050. HDTV is 1280 x 768.
My grandma complained that the resolution was messed up (I later learned that it was set to 800 x 600). I asked her to set the resolution to 1680 x 1050, but when she does, she gets the message "the current input timing is not supported by the monitor display." I checked the refresh rate -- it was set to 60 Hz. None of the other 16:10 resolutions would even show up in the list of available resolutions.
I had her plug the LCD monitor directly into the computer (sans splitter). No error message. The resolution stuck right away.
So, how do I use this splitter and still get 1680 x 1050 on her LCD? I don't even want to know what will happen to the TV's resolution when we do this, but I can pray.
So I recently hooked up a 2nd monitor to my computer just to play with it. Unfortunately, it will not go above 1024x768; I would like to set it to 1280x1024. The monitor is a Digimate DGL20, an LCD made back in 2005. The screen's max resolution might be 1024x768, but if that's so, is there any way I can set it to 1280x1024 anyway?
I did a clean install of Windows 7 from XP. Now I cannot get the resolution to change on my monitor. I have tried updating drivers but it still won't work.
My computer is a Dell Optiplex 160L and the monitor is a Proview Pro 458.
I recently updated my nVidia drivers to the latest, and also my SiSandra Lite software (for quick access to accurate specs of my PC). After updating SiSandra Lite, I looked at the GPU section. As usual it loaded the primary monitor first, where everything was fine. I then switched to look at my secondary, and my primary monitor turned off and my desktop was shifted to my second. Now when I boot my computer, my primary comes on, but only after the pre- and post-BIOS screens. Only when GRUB comes up does something appear on my primary monitor, and at a very low resolution (compared to what it used to be). The Windows 7 loading bar appears on my primary, but before the logo can appear and glow, it swaps to my secondary monitor, where it stays.
Windows 7 cannot detect my primary monitor: I've tried the Screen Resolution detection, and it's also missing from the Devices menu. Really confused about this whole business. I realise this may have been posted as a problem before; I did find little snippets of information about it, but nothing sufficient.
Do you remember Windows 95? You could set your desktop resolution larger than your monitor's resolution (you could have a 1024x768 desktop on a VGA monitor at 640x480). Now we all have monitors with a resolution of around 1280x1024 (mine is 1680 x 1050). We have twice as much space, but almost five times as many open programs. Many Windows programs (like the ones I use, and I think many others) are of course useful and contextual, but they reduce the space available for the actual job. I have already tried some programs that let you manage multiple virtual desktops, but they only separate programs; they don't give you more space.
I have also tried some programs (like GiMeSpace) that simulate an oversized desktop by shifting all visible windows in one direction, but if I switch context away from a window, pan the desktop, and then switch back to that window, it reappears in the same physical location. What I would like is exactly for Windows 7 to do what Win95 did.
Anyway, this is my issue. Whenever I turn on my monitor, I find my resolution gets changed to 1280x1024 instead of my native 1440x900. This happened with the latest nVidia driver, the older nVidia drivers, and the built-in Microsoft driver. I couldn't get my native resolution back unless I rebooted the machine entirely.
This bug has gotten under my skin to the point where I went back to using Vista, which I didn't want to do, since apart from that issue Windows 7 was a treat to use. Does anybody know why this keeps happening? My video card is an eVGA GeForce 9800 GTX+ and my monitor is a Gateway 19-inch FPD1975WH, if that helps.
I don't know whether this is a Windows 7 problem or a Catalyst problem, but here goes...
I have installed Windows 7 (clean) and the latest Catalyst drivers for my ATI HD4850 card. My monitor is a Samsung SyncMaster 2243NW, a 22" panel with a native res of 1680 x 1050. I have installed the monitor driver from Samsung's website and the monitor is now showing correctly in device manager.
When I right-click on the desktop and choose "Screen resolution", the "Change the appearance of your display" screen offers only "display device on DVI" in the drop-down menu, not the Samsung monitor, and does not offer me the native resolution; the nearest is 1440 x 900. If I click "Advanced" I'm taken to the Catalyst driver screen, which correctly identifies the monitor but bizarrely still offers only the same choices of resolution as the Windows 7 screen. 1680 x 1050 just isn't there.
Can anyone help me get my monitor to run at native res in Windows 7?
I saw someone had posted earlier about what I think is the same issue, but the page disappeared. My TV is not going to the resolution that I set on the computer: it is going to 1920x1080, which looks awful since that is not its native setting. I want it set to 1360x768, which looks best. Does anyone know why my TV is doing this? I used to be able to set it, but ran into trouble today when showing my friend.
what a big ass step up in performance going from vista 32 bit to 7 64 bit, wow!
So anyway, I've managed to get everything else working on my computer, but am having a problem setting the native resolution of my LG Flatron L204WS monitor. I've had a search in this part of the forums and couldn't find anyone with a similar problem, so I hope I'm not posting another unnecessary post!
Graphic card: 8800 GTX
Driver version: Windows update, the newest Nvidia Drivers for windows 7
Monitor cable: VGA, as this is all it supports
The problem: when trying to set my monitor's native resolution of 1680x1050, I get an out-of-range message from the monitor. I think the refresh rate is set at 59 Hz and won't change to 60. I located some LG drivers for the monitor and installed them, and now Windows recognises my monitor in the advanced settings, but not in the Screen Resolution window itself. Could this be why? Or could it be related to the fact that the current resolution, 1400x1050, works at 59 Hz but 1680x1050 does not?
I've seen a few other posts from people stuck on 59 Hz and unable to change to 60, but no real solution has been found as of yet. I think I have listed all the relevant information needed for a bit of troubleshooting, if you would be so kind as to have a look and a ponder.
Oh yes, there was one other very strange oddity: when I run a game such as Team Fortress 2, I can set my resolution to 1680x1050 and it runs fine. How very odd! That's where this problem stumps me even more!
Can't get monitor to display same resolution between 2 workspaces - Microsoft Community (posted Nov 21, 2012): Ok, so here's my setup: HP EliteBook 6930p with an ATI Radeon HD 3450 graphics card, Win 7 Pro 32-bit installed. I have 2 workspaces, both with docking stations. Workspace 1 (my primary work site) had 24" wide (primary monitor) and 19" wide monitors; Windows let me set their resolutions to 1920x1080 and 1440x900, respectively. These monitors were hooked to the docking station via DVI and VGA cables, respectively. Workspace 2 had a 19" standard monitor (1280x1024, VGA, set as primary), and I would open the laptop beside it for the dual monitor (1280x800). Today I pulled the 19" wide from workspace 1 to replace the 19" standard at workspace 2. Windows, however, will only let me set its resolution to 1280x720 (still using VGA). Even turning off the "hide modes" option or going into the ATI Catalyst Control Center to try and force it does not work. I've also tried switching which one is primary, and even making it a single-monitor display (on the 19" wide). 1440x900 is simply not available.
Suddenly I can't use the old monitor resolution. I was using 1440 x 900; now it doesn't let me any more. When I 'forced it' from the Nvidia Control Panel, it appeared with two black bars on the sides and foggy text. I think it is a problem with monitor recognition, because the monitor doesn't appear by name: it appears as Generic Non-PnP Monitor. I should mention that I have already reinstalled Windows and the graphics driver.
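One hedged suggestion for the Generic Non-PnP Monitor symptom: a stale entry for a previously connected monitor can linger in Device Manager and shadow the real one. Windows documents an environment variable that makes such "non-present" devices visible so they can be uninstalled (run from an elevated Command Prompt; the variable only affects the devmgmt.msc instance started from that window):

```shell
rem Show devices that are not currently connected in Device Manager
set devmgr_show_nonpresent_devices=1
start devmgmt.msc
rem In Device Manager: View > Show hidden devices, expand Monitors,
rem uninstall the greyed-out entries, then scan for hardware changes
```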
Windows 7 installed without any problems and found all the hardware devices, but the one problem I have is that the monitor will only display at 1920x1080. If you lower it, the monitor will not display a full screen. I've tried all resolutions and have the latest 191 drivers. Any ideas?
I've got Windows 7 Ultimate (x64), my graphics card is an ATi 4670HD and my widescreen monitor is a 22inch Samsung SyncMaster 225MW.
Everything looks and works perfectly. When I go to Display and click Detect, it of course finds my monitor and sets it to its native resolution (1680x1050).
But when I restart my computer, it will be set back to 1024*768 and it will be listed as the generic monitor.
The latest ATi drivers are installed on a fresh copy of Windows 7, and there were no hitches at all. I should also add that I've used 2 separate DVI cables to make sure the cable wasn't at fault, and they both give the same result.
What is the problem? This is one I can't quite get my head around.
I'm having trouble getting my monitor to display at its maximum resolution and I cannot for the life of me figure out what's wrong.
I have an ASUS VH242H 23.6" monitor with a native resolution of 1920 x 1080. I originally had Windows XP, and it could display this resolution just fine. Now, after installing Windows 7, when I hook the monitor up to my laptop I can only get it to display up to 1400x1200. Going past this point causes the monitor to flash an out-of-range message. Furthermore, my computer won't let me change the refresh rate, as far as I can tell. I updated the driver for my graphics card and that didn't seem to make a difference.
I have an HP nw8440 Mobile Workstation with an ATI FireGL V5200 GPU.
I have a new computer and monitor. It worked perfectly, and the monitor was set up immediately without me doing anything. So it all worked perfectly - until one day my SSD broke. I had to send the SSD back and got a new one. Great, I thought, now it will all work perfectly again. But once I had finished everything and was happily back in my account, my monitor won't let me set the 1920x1080 setting... it's not even listed. OK, I thought, then I will install the driver. So I installed it from the CD delivered with the monitor, but there's still no 1920x1080 setting listed. I tried nearly every driver download for my monitor and searched nearly the whole internet for solutions, but it was all the same. The highest two offered are 1856x1392 and 1920x1440, which look horrible. So the best usable resolution is 1400x1050, which is just too wide. My monitor is the Samsung SyncMaster S22A300 LED monitor with, as said, a native 1920x1080. Should I contact the manufacturer?
When I start Windows after the machine has been off for a while, maybe a day or two, the primary monitor setting changes to my secondary display. As I actually have two monitors on the video desk, it is fairly easy (for me) to change the setting back to what it needs to be. But I am not the only operator who uses this computer, and most of the volunteers know little about computers of any kind. Now, you would think that someone with my technical background would have been able to fix this issue, and I can... using a 3rd-party solution that costs money.
Edit: to clarify, I have been having this issue for a few years, but I have been the only person using the system, and it was happening on previous systems as well.
Specs of the system in question: Custom Built Desktop; Windows 7 Professional x64;