I just purchased this monitor, and to keep this simple, Windows 7 doesn't recognise my monitor as a Samsung. It just says Generic PnP, and it won't let me go above a resolution of 1280 x 720. The native resolution on this monitor is 2048 x 1152. I have a BFG GTX 280 video card, so that's no issue there. All my drivers are up to date, and I also tried forcing a driver that's preloaded with Windows, but nothing works. Anyone have any ideas?
EDIT: I also tried forcing a custom resolution through the NVIDIA Control Panel; no go.
My monitor, a Samsung SyncMaster 933SN, won't display at its native resolution, which is 1360 x 768. It will only display at 1024 x 768, which only takes up two-thirds of the screen real estate!
I have the most up-to-date driver for the monitor, 3.0.0.0 from Samsung, which says it works on XP and Vista 32/64, but for some reason Windows 7 isn't recognizing the native resolution. It was recognized easily on Windows XP when I bought it a few months ago, so what can I do?
I have a HIS ATI Radeon HD 4770 graphics card and a Samsung SyncMaster 2243NW monitor (native resolution of 1680x1050 @ 60 Hz) and I can't choose resolutions higher than 1600x1200.
I've installed the latest ATI driver but I can't find a Windows 7 driver for my monitor... Any solutions?
I really hope that you can help me. I installed Windows 7 RC, and after the install my monitor resolution is set to 800x600. If I try to raise it, I get just one more option, and when I select it I get a black vertical bar on the right of the monitor, still with very bad resolution.
My display adapter is listed as Intel(R) 82865G Graphics Controller (Microsoft Corporation XDDM)
and my monitor is a SyncMaster 943NW/943NWX/1943NW/1943NWX.
I don't know whether this is a Windows 7 problem or a Catalyst problem, but here goes...
I have installed Windows 7 (clean) and the latest Catalyst drivers for my ATI HD4850 card. My monitor is a Samsung SyncMaster 2243NW, a 22" panel with a native res of 1680 x 1050. I have installed the monitor driver from Samsung's website and the monitor is now showing correctly in device manager.
When I right-click on the desktop and choose "Screen resolution", the "Change the appearance of your display" screen offers only "Display Device on: DVI" in the drop-down menu, not the Samsung monitor, and does not offer me the native resolution. The nearest is 1440 x 900. If I click "Advanced" I'm taken to the Catalyst driver screen, which correctly identifies the monitor but bizarrely still offers only the same resolution choices as the Windows 7 screen. 1680 x 1050 just isn't there.
Can anyone help me get my monitor to run at native res in Windows 7?
Does anyone know where I can download the drivers and the MagicTune software for this monitor, the Samsung SyncMaster T240 LCD? Or does anyone know whether Samsung is going to write drivers for Windows 7?
I own a Samsung SyncMaster 913v, and I am missing its driver for Windows 7. When I check the website for a download, it's only available for Windows XP. Does that mean I won't be able to find a suitable driver for it?
I have an old HP Pavilion t730m desktop PC. It has an onboard Intel 82865G graphics controller, a 19" Samsung SyncMaster 943NW LCD monitor, 2.0 GB of RAM, and a Pentium 4. Currently it is running Windows Vista Ultimate without Aero effects, but everything else is working perfectly.
I would like to upgrade to Windows 7; as a matter of fact, I've done it twice now, but whenever I do I have monitor resolution problems: either I'm stuck at 800x600, or I get other options but not the current 1440x900 resolution that looks great. So I have restored my backup of the Vista installation.
Sometimes when I change the monitor resolution, I get an incomplete screen display and a black bar on the right end of the monitor.
I would really like to upgrade to Windows 7, since my computer seems to run better on it than on Vista, but the screen resolution issue is very annoying.
I recently upgraded my OS from WinXP SP3 to Win7 Pro. Before the upgrade my native screen resolution was 1360x768, and after the upgrade it changed to 1900x1080. I don't like the change: although it gives me more space on my desktop, everything on my screen is smaller than before. I often design in Photoshop, and when I make banners they're really small and I can't see the details. I tried to change the resolution back to 1360x768, but when I do, it's not sharp; it's kind of blurry.
I have a Westinghouse LCM-22w3 monitor. My native resolution worked fine on my old computer. I got a new computer with Windows 7 64-bit, and it won't display my native resolution. It offers pretty much all resolutions except mine, and some very close to the correct one. It only goes up to 1600x1200 (what I'm using right now).
How do I get the right resolution? I have integrated ATI Radeon HD 4250 graphics.
It's bugging me SO badly that I'm so close to the right resolution, but everything is off ever so slightly. The person I bought the computer from told me I might get the correct resolution if I switch from my current VGA cable to a DVI cable. He said I'd get my native resolution, and it'd look better.
If there are any drivers I need to update, I don't know which ones they are or where to get them. I'm used to XP; this is my first night with Windows 7, and I'm unfamiliar with everything on this computer, so bear with me.
Edit: I'd like to add that after I right-click, go to Screen Resolution, and click Update Driver, it says the driver is up to date. Also, the display is being detected as "Generic Non-PnP Monitor", which is odd, because my monitor IS a PnP monitor.
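For background on that "Generic Non-PnP Monitor" reading: Windows gets a monitor's model name and native resolution from the EDID data the monitor sends over the cable, and VGA connections are the classic case where EDID doesn't arrive intact, leaving Windows to fall back to a generic driver and a guessed mode list (which is also why the DVI cable suggestion is plausible). As a rough illustration only (a sketch of the published EDID 1.3 byte layout, not something you'd run against real hardware here), the native mode comes from the first 18-byte detailed timing descriptor:

```python
def parse_dtd(dtd: bytes):
    """Unpack active resolution and pixel clock from one 18-byte
    EDID detailed timing descriptor (EDID 1.3 layout)."""
    assert len(dtd) == 18
    # Bytes 0-1: pixel clock in units of 10 kHz, little-endian.
    pixel_clock_khz = int.from_bytes(dtd[0:2], "little") * 10
    # Byte 2 holds the low 8 bits of horizontal active pixels;
    # the high nibble of byte 4 holds the upper 4 bits.
    h_active = ((dtd[4] & 0xF0) << 4) | dtd[2]
    # Byte 5 / high nibble of byte 7: same scheme for vertical lines.
    v_active = ((dtd[7] & 0xF0) << 4) | dtd[5]
    return h_active, v_active, pixel_clock_khz

# Hypothetical descriptor for a 1680x1050 mode (146.25 MHz pixel
# clock); the blanking/sync fields are left as zeros for brevity.
example = bytes([0x21, 0x39, 0x90, 0x00, 0x60, 0x1A, 0x00, 0x40,
                 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
                 0x00, 0x00])
print(parse_dtd(example))  # -> (1680, 1050, 146250)
```

When that block never reaches the OS, there is nothing to derive the native mode from, which is why the driver falls back to safe generic resolutions.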
I've had this card for about two months, and it has worked flawlessly up until now. Out of nowhere, my monitor's native resolution of 1920x1080 is not an option in the Control Panel (it says the display device is "Generic PnP..." or something like that). In the NVIDIA Control Panel, 1920x1080 is an option, but when I choose it and click Apply, it reverts back to the incorrect resolution (1024x768, I think it was). The option for the display device is something like "GTX 580 - Generic Non-PnP"...
I'm having trouble getting my monitor to display at its maximum resolution and I cannot for the life of me figure out what's wrong.
I have an ASUS VH242H 23.6" monitor, which has a native resolution of 1920 x 1080. I originally had Windows XP and it could display this resolution just fine. Now, after installing Windows 7, when I hook the monitor up to my laptop, I can only get it to display up to 1400x1200. Going past this point causes the monitor to flash an "out of range" message. Furthermore, my computer won't let me change the refresh rate, as far as I can tell. I updated the driver for my graphics card and that didn't seem to make a difference.
I have an HP nw8440 Mobile Workstation with an ATI FireGL V5200 GPU.
I got my friend's laptop to reinstall Windows, but after doing so I've lost the option to set the resolution to its native 1600x900. I've downloaded all the necessary drivers from the Toshiba website, but to no avail. After installing CCC and running it, it says "There are currently no settings that can be configured using AMD Control Center." I googled a bit, and it seems I'm stuck in default VGA mode. When I check the display adapter properties, it does indeed say VGA adapter. The laptop model is a Toshiba Satellite L775D-S7220.
Recently I got a Samsung SyncMaster 2043NWX monitor. It's a used one, but it did work. My computer is an Acer Aspire M3920 (desktop); a more complete configuration can be found in my profile.
My desktop PC connects to my main monitor, an Acer P196HQV, via an HDMI-to-DVI connector, and to my second monitor, the SyncMaster, via VGA.
The problem is that the second monitor doesn't work. Windows 7 does detect a monitor connected via VGA: in Control Panel > Appearance and Personalisation > Display > Adjust Screen Resolution, the display box says "2. Display Device on: VGA". In Device Manager it only detected my Acer monitor, but after I downloaded the SyncMaster driver from Samsung's official site, it detects the SyncMaster instead of the Acer monitor! This isn't the case in Adjust Screen Resolution.
The SyncMaster shows nothing at all. The blue LED, which is how I can tell the monitor is turned on, flickers all the time, but the screen doesn't show anything.
I have tried reinstalling the driver, restarting my computer, installing the newest updates for Windows 7, making the SyncMaster my main monitor, turning off the Acer monitor, but nothing works.
I have a weird problem with the Windows 7 logon screen: it displays at a resolution lower than my laptop's native 1920x1200, even though I have used both LogonStudio Vista and tweakslogon.exe to change the background file. The file is 1920x1200, but Windows 7 still displays it at a lower resolution. Any fixes? I have done the registry edit as well; still no luck. I have tried naming the file both "backgroundDefault.jpg" and "background1920x1200.jpg", but still no luck.
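For reference, the setting those tools toggle is the documented OEM logon-background feature: a single registry value enables it, and the image has to live in a specific folder and stay under a 256 KB size cap, or Windows silently falls back to the default background. A .reg fragment of the enable switch (the paths and names below are the standard Windows 7 locations; worth double-checking on your build):

```
Windows Registry Editor Version 5.00

; Enable custom logon backgrounds (the value LogonStudio / tweakslogon flip)
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Authentication\LogonUI\Background]
"OEMBackground"=dword:00000001
```

The image itself goes in %windir%\System32\oobe\info\backgrounds as backgroundDefault.jpg (or a resolution-named variant), and the 256 KB limit is the usual culprit when a full 1920x1200 JPEG gets ignored.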
I have an HP w19 monitor, and its native resolution is 1440x900.
It worked fine on Windows XP and Vista on my old rig, but in Windows 7 the nearest resolution I got was 1600x1200, which was too big for my monitor to display.
Now my question is this: I am building a new rig with an Intel i7 860, 4 GB of Corsair RAM, and an ATI Radeon HD 5850. I will be plugging my monitor into the video card via a VGA cable, so will this problem recur? I will also be using the 64-bit version of Win 7, whereas earlier I was using the 32-bit.
The color that is displayed on my screen does not print out the same. The yellow, which should be bright and is bright on the screen, comes out with an orange tint.
I am trying to play a game I used to play nearly ten years ago. It's a free online game called Continuum/Subspace and there are user created "zones" or variations of the game within the game (if that makes sense). I play in a zone called Trench Wars. It's a top down game, meaning you are looking down at your ship as if from above and the screen scrolls as you move to reveal more of the area (map) you are playing in. You are in a spaceship of your choice with varying weapons for each ship. The game requires aiming at your enemies before firing at them and anticipating their moves.
Well I noticed that playing at a higher resolution allows you to see more of your immediate area and expands your view of the map. It works just like your desktop in essence. The higher the resolution, the smaller the icons, but the more space you have. In the game, the higher the resolution, the smaller the ships are, but you can see more space around you. Obviously, this would give an advantage to anyone playing on a higher resolution, as they would be able to see their enemies and fire before the enemy even knew they were there.
The native resolution of my 32" LCD TV is 1360x768. This is fine except when playing Continuum. So I tried changing my resolution to the max, which is 1920x1080. Well, the screen expanded and cut off parts of my picture, everything got blurry, and it had a blue tint. I figured this is probably not good for my TV's health, as I have burnt out old monitors in the past by running too high a resolution. So I went to my NVIDIA display properties and started messing around. I realized that changing to the higher resolution (1920x1080) from there still caused me to lose part of the screen, but the blue tint wasn't there.
It's still vaguely blurry, but not enough to really notice. So I went to the resizing tab and resized my screen to fit at the higher resolution. I ended up with a resolution of 1842x1036 at a 60 Hz refresh rate. I resized my icons and text, and the screen is perfect now. But it is much higher than the native resolution of my monitor. So my question is: will keeping it at this resolution cause my TV to "burn out" or have any other adverse effects?
System specs that might be important: Display: GeForce GTX 550 Ti connected through HDMI to the 32" LCD TV. TV: panel resolution 1366x768; display resolution scan rates: HDMI 1.3; suggested resolutions: 1080p, 1080i, 720p, 480p, 480i.
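As a sanity check on those numbers: 1842x1036 relative to 1920x1080 is about a 4% shrink on each axis, which is in line with the few percent of overscan most HDTVs apply to an HDMI signal, so the resize appears to be compensating for overscan rather than driving the panel out of spec. The arithmetic, sketched out (figures taken from the post):

```python
def underscan_pct(native: int, resized: int) -> float:
    """Percentage of the signal hidden (or compensated) on one axis."""
    return (native - resized) / native * 100

# 1920x1080 signal resized to 1842x1036 to fit the visible area.
print(round(underscan_pct(1920, 1842), 1))  # -> 4.1
print(round(underscan_pct(1080, 1036), 1))  # -> 4.1
```

The horizontal and vertical figures matching each other is what you would expect from uniform overscan compensation.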
Every time I try to run a program fullscreen at a resolution lower than the monitor's (1920x1080), the screen gets stuck waiting, and it only comes back if I press Alt+Tab. How do I fix this? I called tech support, and they said there's no Windows 7 driver yet. Does anyone know what I should get?
I have a problem with the monitor driver. I have a Samsung SyncMaster 940BW and I cannot get the driver to be recognized so I can get the native resolution. For those who don't know this monitor: at any resolution other than the native one, the screen shows A LOT of blur, which makes viewing anything impossible. I have tried using the driver that is for all versions of Windows, but Windows 7 (x64) discards it and uses the PnP one; even with DriverMax the driver won't be accepted.
I'm currently using the DVI input with PowerStrip (on a Radeon 3870), and I got it to force the native 1440 x 900, but in games the resolution won't be forced, because Windows won't let the drivers do their thing, I guess. I have tried using VGA, but the screen gets corrupted. I have the beta Windows 7 ATI drivers as well. I'm very frustrated by this.
Banging my head against this issue... I have an ATI Radeon HD 5850 and a Samsung SyncMaster 2233SW monitor. In games the resolution being displayed is very low, somewhere in the 1280 x 1024 region, though in the game settings it's showing 1920 x 1080. Drivers are the latest: 11.12, AMD Vision Engine Control Center. By the way, I've put my desktop resolution at 1280 x 800; when I set it to 1920 x 1080 on the desktop, the sides of the screen get cut off, and even my monitor's auto-adjustment function doesn't help.
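A side note on the cut-off edges: the modes mentioned in that post don't all share a shape. 1280 x 800 reduces to 8:5 (i.e. 16:10), while 1920 x 1080 is 16:9 and 1280 x 1024 is 5:4, so depending on the GPU's scaling setting, the monitor ends up stretching or cropping one of them. A tiny sketch for reducing a mode to its aspect ratio (the modes below are just the ones from the post):

```python
from math import gcd

def aspect(width: int, height: int) -> tuple[int, int]:
    """Reduce a display mode to its simplest aspect ratio."""
    g = gcd(width, height)
    return (width // g, height // g)

print(aspect(1920, 1080))  # -> (16, 9)
print(aspect(1280, 800))   # -> (8, 5), i.e. 16:10
print(aspect(1280, 1024))  # -> (5, 4)
```

Mixing shapes like these is a common reason an image looks cut off or squashed even when the pixel counts seem sensible.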
I read that Windows 7 has native support for MKV files, yet when I try to open an MKV file in Media Center it doesn't work. Am I missing something? I'm running Windows 7 Pro.
The Adaptec site states that the drivers for this controller are native on the Windows Vista CD. Could this possibly mean that they will be present in the Windows 7 public beta build?
If not, I have Vista x64 drivers ready to install.
I have burned two movie files onto discs, and each time, although the picture is perfect, there is no sound at all. But the files play with sound when played directly by Windows Media Player. Both files are .AVI files.
Why doesn't IE9 have a native spell-check capability? All other browsers seem to have it. Is there a technical reason that prevents IE from having it? Has MS ever explained why spell check is not included as a feature of IE (especially in the 9 Beta and RC)?