I have a 22" widescreen monitor I have set my resolution to 1280x720 and everything is just the way I want it. Except when I open a browser IE8 for example everything is "crunched" not stretched. I'd rather have it stretched, and I was wondering if anyone knows of a way to fix this.
Well, I'm using a 32" LCD TV as my primary monitor. It runs well at 1360x768, but when a game runs at 1024x768 or another 4:3 resolution it's not possible to stretch the image, because Windows 7 RC puts the game in the middle with dark areas on both sides of the centered image. That's really annoying; I never had this problem with the 7000 build. Does anyone have an idea about this issue? If it's a feature, how do I turn it off?
I'm using an Nvidia 9800 GTX with the latest Windows 7 beta drivers from Nvidia.
I recently reinstalled Win 7, and I have a 7600GT, a Pentium D 950 @ 3.5GHz, 4 GB of RAM, and a normal HDD (my motherboard is just a generic Acer board I found at a garage sale). I'm having a problem where Windows is detecting my monitor as widescreen and is only showing resolutions like 1024x768. My monitor's max res is 1280x1024, which is what I'm used to. Currently, everything looks like crap. I've tried new and old drivers for the GPU, and Windows Update is on. I even tried to force the res, but it just won't take; it cuts off everything except the top-left corner.
With widescreen monitors, which are pretty much the standard these days, there is always plenty of horizontal space; it's the vertical space that is at more of a premium. To take advantage of this I've done two things. First, I moved my taskbar to the left side of the screen, which gives me that much more vertical space in all programs.
The second thing is program-specific. I use Firefox, and I certainly spend more time in it than in any other program. I've installed two add-ons, Vertical Bookmarks Toolbar and Compact Menu. They do just what their names imply: the first places my bookmarks toolbar on the left (or right, if you prefer) of the screen, and the second compacts my entire menu toolbar into a small icon that sits in the upper-left corner of the address bar. Getting rid of these two horizontal menu bars gives you a nice little chunk of additional real estate. It takes a bit of getting used to, but once you do, it's great.
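If you ever want to script the taskbar move instead of dragging it, here's a rough Python sketch. It edits Explorer's StuckRects2 registry value; the byte offset and the edge codes are assumptions based on how that binary blob is commonly described, so check it on your own machine before relying on it. Dragging the unlocked taskbar in the UI does the same thing.

[code]
# Rough sketch: pin the Windows 7 taskbar to the left edge by editing the
# StuckRects2 registry blob. ASSUMPTION: the "Settings" binary value keeps the
# taskbar edge in byte 12 (0=left, 1=top, 2=right, 3=bottom) -- verify this
# on your machine first.
import winreg

KEY = r"Software\Microsoft\Windows\CurrentVersion\Explorer\StuckRects2"
EDGE_LEFT = 0  # assumed encoding: 0=left, 1=top, 2=right, 3=bottom

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY, 0,
                    winreg.KEY_READ | winreg.KEY_WRITE) as key:
    settings, value_type = winreg.QueryValueEx(key, "Settings")
    data = bytearray(settings)
    data[12] = EDGE_LEFT          # assumed offset of the edge byte
    winreg.SetValueEx(key, "Settings", 0, value_type, bytes(data))

# Explorer only reads this value at startup, so restart it afterwards,
# e.g. from a command prompt:  taskkill /f /im explorer.exe && start explorer.exe
[/code]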
I have a weird issue with Win 7's widescreen support.
I have a 22-inch screen and run my games at the native res of 1680x1050.
Now, the thing is that older games like Project Snowblind and NFS: Most Wanted, for example, obviously don't support widescreen as such. But XP seems to stretch the image and run them fullscreen without any black edges on screen.
Whereas in Win 7 they run unstretched, with black edges because of the res.
In XP, even at a non-widescreen res, NFS for example runs at 1280x1024 fullscreen and stretched, but in Win 7 it runs with the same res and settings and doesn't stretch the image to fill the screen.
Has anyone else experienced the same problem, or does anyone know of a solution? I didn't do anything strange in XP, change any settings, or tweak the GFX card drivers.
I tried running in XP compatibility mode in Win 7; still the same.
The other day I installed the 32-bit Windows 7 RC on my laptop, which has a 1680x1050 HD LCD screen; of course it's widescreen. Now when I run any 4:3 resolution, such as 1024x768, the image doesn't fill the whole screen, leaving a bar on each side.
This only happens when I'm running Windows 7. I have another partition with Windows XP, and when I run a 4:3 resolution there it goes fullscreen without the bars.
Also, I didn't have my Nvidia drivers installed on that partition; I wasn't at home, so I didn't have internet access and didn't have the drivers either.
I've already deleted that Windows 7 partition. I would change from XP to 7, but I would need to format, and I wanted to keep my data. Also, installing all my games (about 90 GB) on another partition with Windows 7 would be ridiculous.
I would change from XP to 7 if anyone can tell me a fix for this, please.
I've been running Windows 7 for a while. It's fast, reliable, and stable (even the RC has been for me), and everything had been working A+ up until now.
This morning, when I spent some time on the computer before leaving for work, everything worked as it always has. When I got back from work some 8-9 hours later, something was terribly wrong. My 22" LCD isn't recognized anymore, and thus the native resolution of 1680x1050 is no longer an option. I really don't know what happened, and I'm a bit confused about how it, whatever it is, happened.
My graphics card is a GeForce 8800 GTS.
I've tried updating my drivers, rebooting, and fiddling with the NVIDIA Control Panel; nothing.
Does Win 7 require a 'newer' widescreen monitor for the best visuals? Everything is so small on screen, and the alternative configurations offered all look distorted. My flatscreen monitor is a 15" screen (33 cm x 26 cm, or 13" x 10").
For some reason I don't have any widescreen options for screen resolution. The black bars on the sides of my screen are getting very annoying. Am I missing something here?
I just installed Windows 7 along with the latest Nvidia drivers. I can run at standard resolutions, but when I try to switch to the native widescreen resolution, my monitor says "input is not supported".
Widescreen resolutions work fine in Safe Mode or when the Nvidia drivers are not installed. I have tried older versions of the drivers, but nothing seems to be working.
I have a three-monitor setup. Whenever I open a browser, either Internet Explorer or Google Chrome, on monitor 2 or 3, after a few seconds it automatically gets dragged onto my main monitor (monitor 1, where my Start button and taskbar are located).
I have a setup with six monitors. Today my main monitor broke, and I couldn't find a way to shut the PC down since I couldn't access the main desktop to get to the Start menu. Is there a way to change which monitor is the main monitor without having access to the current main monitor?
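One scripted route, which you could run blind from a Run box even with the main screen dead, is to change the primary display through the Win32 ChangeDisplaySettingsEx call. Below is a rough Python sketch using pywin32 (a third-party package, installed with pip install pywin32); the device index is an assumption you would adjust to whichever monitor still works.

[code]
# Sketch: promote a secondary display to primary. The primary display must sit
# at desktop position (0,0), so every monitor is shifted by the target's current
# offset before CDS_SET_PRIMARY is applied.
import win32api
import win32con

TARGET_INDEX = 1  # assumption: the working monitor is device 1; adjust as needed

def set_primary(target_index):
    # Collect the attached display devices and their current modes.
    devices = []
    i = 0
    while True:
        try:
            dev = win32api.EnumDisplayDevices(None, i)
        except win32api.error:
            break
        if dev.StateFlags & win32con.DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
            mode = win32api.EnumDisplaySettings(dev.DeviceName,
                                                win32con.ENUM_CURRENT_SETTINGS)
            devices.append((dev.DeviceName, mode))
        i += 1

    # Offset everything so the target lands at (0,0).
    dx = devices[target_index][1].Position_x
    dy = devices[target_index][1].Position_y
    for idx, (name, mode) in enumerate(devices):
        mode.Position_x -= dx
        mode.Position_y -= dy
        flags = win32con.CDS_UPDATEREGISTRY | win32con.CDS_NORESET
        if idx == target_index:
            flags |= win32con.CDS_SET_PRIMARY
        win32api.ChangeDisplaySettingsEx(name, mode, flags)

    # Apply all the queued changes at once.
    win32api.ChangeDisplaySettingsEx()

set_primary(TARGET_INDEX)
[/code]

You could keep something like this saved as a .py file ahead of time, so it can be launched from Win+R without needing to see the main desktop.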
I am running a dual monitor setup with my TV hooked up via HDMI to my laptop.
Everything was working fine before: I always had the laptop as my main display, yet I could play a video or a game fullscreen on the TV.
For the past couple of days, as soon as I put something in fullscreen, it immediately jumps to my laptop, regardless of whether it's open on my TV or my laptop. I don't remember changing or installing anything that could have caused that...
I poked around in the AMD Vision software and the Windows Control Panel, but I can't seem to solve the problem without making my TV the main display. I also did a quick search on Google as well as here, but the problems I found were mainly with Flash, which isn't the cause here.
Here are my system specs:
Toshiba Satellite L75D, Windows 7 Home Premium 64-bit, AMD Radeon 6520G
I have just done a clean install of Windows 7 Ultimate, coming from XP, and I am unable to get my dual monitors to work like they did in XP.
I have a DVI Radeon graphics card plugged into the AGP slot and an Nvidia GeForce VGA adapter plugged into a PCI slot of my Dell OptiPlex 755. In the screen resolution settings the second monitor cannot be detected.
In the BIOS I have two options under primary video adapter: auto and onboard card. When set to auto, the Nvidia card gets a signal but the Radeon does not. There are no conflicts in Device Manager, and all of the drivers are up to date.
When I change the BIOS option to onboard card, the Radeon adapter gets a signal, but the other monitor cannot be detected, and in Device Manager there is a yellow exclamation mark next to Standard VGA Adapter with a Code 10 error stating the device could not be started.
I have powered down and unplugged every cable. I also tried using the integrated VGA adapter on the Intel G31/G33/Q33/Q35 Graphics Controller, but the computer will not even boot. I get:
"System Halted
Attention Unsupported Video Configuration Detected"
I have two monitors; both work fine standalone, but Windows will not detect either as a secondary.
Please, someone help me. I am so used to having my helpdesk email open on one monitor and all of my other work on the other.
This computer is running an ATI Radeon HD 2400 Pro. It does support dual monitors (I've had dual monitors active on this card before, but never on Windows 7). But since installing Windows 7, I can't even get it to detect the second monitor. I want to run the setup with the CRT as the primary monitor and the HD monitor as the secondary.
I recently had an older HP Pavilion Media Center m7760n desktop PC rebuilt. The old power supply fried the motherboard, so I needed a new power supply and motherboard. Here are my current specs:
Mainboard: Asus P5QPL-VM EPU
Chipset: Intel G41
Processor: Intel Core 2 Duo E6420 @ 2133 MHz
Physical Memory: 2048 MB (2 x 1024 MB DDR2 SDRAM)
Video Card: Intel(R) G41 Express Chipset (Microsoft Corporation - WDDM 1.1)
Hard Disk: WDC (1000 GB)
As you can see from above, the "video card" is actually integrated into the motherboard. The computer works perfectly except for one major problem. I have two monitors: one is a 22" with 1680x1050 resolution and the other is a 15" with 1024x768 resolution. At the back of my computer I have a VGA port and a DVI port; the 15" is connected to the VGA port and the 22" is connected to the DVI port.

When first starting the computer, the 15" monitor was recognized as the primary monitor and the 22" as the secondary. No problem. I simply went to the display settings and set the 22" as the primary monitor and the 15" as the secondary. Unfortunately, this setting seems to reset as soon as I reboot the computer. The 15" is always set as the primary monitor on startup, forcing me to apply the proper settings all over again. What's worse is that even after I have set the proper settings, they sometimes revert while using Media Center or other programs. Worse yet, sometimes both monitors go completely black, as if the monitor settings were about to switch but got locked up somehow.

I'm assuming that perhaps the onboard video has a primary port (VGA) and a secondary port (DVI), but even so, shouldn't Windows 7 be able to override this and save these settings so that the monitor configuration stays the same during startup and regular use?
I'm using a television (32") as a second monitor in extended mode so that I can watch a movie on the TV and play a game on the monitor (this was my main goal). The monitor and TV are in two different rooms, both connected to the same PC, one by a normal VGA cable and the other by HDMI. I managed to split the audio output so that VLC sends its audio to the HDMI output (so only the TV plays it), while the rest of the system sounds, media players, and games go to the speakers (basically only the VLC audio is directed to another output device). I reached my goal: I can watch a movie fullscreen on the TV and play a game fullscreen on the monitor without any interference between the two (neither audio nor video).
The thing is, because the TV is in another room I can't actually see what's going on on it, as I just "throw" the VLC window over to the TV from my main monitor. And here's the question: is there a way to see the TV's desktop on my monitor, without having to set it as the main display and actually switch between desktops? The perfect thing would be if I could see the TV's desktop in a window, like in remote desktop applications.
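One do-it-yourself way to get something like that, without remote desktop software, is to periodically screenshot the TV's slice of the extended desktop and show it scaled down in a small window on the main monitor. Here's a rough Python sketch using Pillow (6.2 or newer for the all_screens option) and tkinter; the TV's coordinates are assumptions you would adjust to match the layout shown in Screen Resolution, and hardware-accelerated video may capture as a black frame depending on VLC's output module.

[code]
# Rough sketch of a "preview window" for the TV's half of the extended desktop.
# ASSUMPTIONS: the primary monitor is 1920x1080 at (0,0) and the TV sits
# directly to its right at (1920, 0) with a 1920x1080 desktop.
import tkinter as tk
from PIL import ImageGrab, ImageTk

TV_BOX = (1920, 0, 3840, 1080)   # left, top, right, bottom in desktop coordinates
PREVIEW_SIZE = (640, 360)        # size of the preview on the main monitor
REFRESH_MS = 500                 # how often to refresh the preview

root = tk.Tk()
root.title("TV preview")
label = tk.Label(root)
label.pack()

def refresh():
    # Grab the TV's region of the virtual desktop and scale it down.
    shot = ImageGrab.grab(bbox=TV_BOX, all_screens=True)
    shot = shot.resize(PREVIEW_SIZE)
    photo = ImageTk.PhotoImage(shot)
    label.configure(image=photo)
    label.image = photo          # keep a reference so tkinter doesn't drop it
    root.after(REFRESH_MS, refresh)

refresh()
root.mainloop()
[/code]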
I have AT&T DSL and it just drops out, I have a 2 wire router and the DSL and Internet lights will flash red when it drops, I have had this problem for almost a year and AT&T will run a useless test and tell me everything is fine. I have searched for 3 days trying to find a Broad Band Monitor to let me know when it drops and for how long, also how many times while I am at work, or just not on the PC.
I just got a Dell OptiPlex 755 running Windows 7 and tried hooking it up to my TV to use as a monitor. It starts up and gets as far as "Starting Windows" (where it's supposed to take me to the login screen), and then my TV just says "not supported". But if I start in Safe Mode it goes all the way through. I just don't understand why it doesn't work. I even tried lowering the resolution and then hooked it up again, and the same thing happened. It's driving me crazy that it won't work. I have a 32" HCT TV.
Oh, and also: I'm using VGA to hook it up, since the computer doesn't have HDMI out.
The computer was slow and seemed to have too many things running all the time, freezing, etc. I ran Malwarebytes and Webroot SecureAnywhere; they didn't find any problems. So I restored it to a week earlier, and everything ran just fine for a few days. Now it seems to be doing it again. I went to Task Manager and found several items I cannot clearly identify even after searching online for them: Monitor.exe *32?
This may seem like an elementary question, but is it possible to run two PCs on one monitor (separately, of course)? If so, how would it be done, since there is only one hookup on the monitor? Some sort of splitter?
I'm using my GTX with my Samsung LED monitor, and sometimes I use it with my 32" LED TV as an HTPC or to play games on. My question is: will the card heat up from this setup? I only use one monitor at a time and the other one is turned off, but the cables remain plugged in.
I use my TV as my desktop through an HDMI cable from the TV to my tower, but there is lag. Is there any better way to set it up? I've tried VGA and that didn't make a difference. I know it's the TV, as I tried laptop to TV as well. Would there be any settings on the TV to stop this?
I really love native resolution, but you can only go up to 1024x768 with a VGA monitor. Does anyone know where I could get a DVI monitor that's 15" or above for, say, under $150?
When I connect my laptop to my LCD TV, I can see Windows 7 starting to load on the TV screen. As soon as Windows has loaded, nothing shows on the TV and Windows can't detect it. Also, some icons disappear from my laptop screen.