I have a Samsung LS23CMZKFVAUZA, also sold as the SyncMaster 2333 or 2333SW (confusing for me as well, though the first one is the model number on the tag), and I am having such difficulty locating drivers. As time has passed, with all the new updates for the computer, the display has become very unstable. I can't open a second window of IE: the first window/tab is OK, but the second one sort of phases in and out, mostly out, and the display becomes mostly greyed out. Games such as WoW also fade in and out, usually out, especially in areas with snow, water, or a lot of sky. I'm running Win 7 with a GeForce GT 610 video card I put in yesterday, thinking the video card may have been going out.
I just completed the build of a new system and have the Windows 7 RTM software loaded on the system. Since the system is new I just have it connected to an old tube monitor which isn't so great. I was looking at getting a new 23 inch LCD that can do 1920 x 1080 but I'm wondering if I'm going to need a specific driver for the monitor or whether Windows 7 will see it automatically as plug and play?
So far, one monitor I'm looking at only has a driver for 32-bit Vista, and my machine is 64-bit. The other monitors didn't list any drivers at all, so I'm worried about buying something now. Should I be concerned?
I have an ATI Radeon 7500 and I can't get Windows 7 Build 7600 to detect it. Under Display it shows up as a Standard VGA Graphics Adapter. I'm getting really annoyed because I can't play Counter-Strike. I've installed many different drivers for it, but it just doesn't work, and I also wanted the Catalyst Control Center but don't know how to get that either. I upgraded to Windows 7 from XP today, and dumb me didn't back up ANYTHING before a full clean wipe. So I'm getting really agitated and impatient about it. Please help!
On top of that, my monitor shows up as a Generic PnP Monitor, and I can't find drivers for it anywhere, even on the Gateway site. I have a 19" Gateway LE HD widescreen.
I have a ViewSonic 19" CRT (Professional Series P95f+)
There are drivers for Vista 64, but Windows 7 won't take them, and my monitor is not listed when I try to choose it from the list. I tried similar models, but the monitor will still only do 85 Hz at the highest, even at low resolutions.
This monitor can do 160 Hz at 800x600, 120 Hz at 1024x768, 100 Hz at 1280x960, and 90 Hz at 1280x1024, I believe.
I tried using a custom driver that I made in RivaTuner, but it only works on Windows XP. I tried the latest version of RivaTuner, but the Create Driver button is not clickable no matter how I set things up, and it doesn't detect the monitor correctly (perhaps because I'm using two monitors at once; I'm not sure).
Any suggestions on how I can force Windows 7 to use higher refresh rates? This monitor looks like absolute crap when it's not at full refresh, especially next to my nice new TRUE 120 Hz 22" LCD (ViewSonic VX2265WM).
I got a new video card at work to install, and nothing shows up when I plug a monitor into it. If I go through the computer's onboard VGA it works fine, but through the video card I get only a black screen. I can't download drivers either, since there's only a black screen from boot onward. Any help?
I accidentally deleted all the display drivers from the computer. It restarted, and now it seems to have screwed up completely: the computer turns on, but I get no signal to the monitor. The monitor is on, but nothing shows on the screen, so I can't even get into the BIOS.
I've tried:
- Using different cables
- Taking out and re-inserting the CMOS battery
- Using different monitors
This computer is running an ATI Radeon HD 2400 Pro. It does support dual monitors (I've had dual monitors active on this card before, but never on Windows 7), yet since installing Windows 7 I can't even get it to detect the second monitor. I want to run the setup with the CRT as the primary monitor and the HD display as the secondary.
I have a three-monitor setup. Whenever I open a browser, either Internet Explorer or Google Chrome, on monitor 2 or 3, after a few seconds it automatically drags the browser onto my main monitor (monitor 1, where my Start button and taskbar are located).
I have a setup with six monitors. Today my main monitor broke, and I couldn't find a way to shut the PC down, since I couldn't access the main desktop to get to the Start menu. Is there a way to change which monitor is the main one without having access to the current main monitor?
I am running a dual monitor setup with my TV hooked up via HDMI to my laptop.
Everything was working fine before: I always had the laptop as my main display, yet I could play a video or a game fullscreen on the TV.
For the past couple of days, though, as soon as I put something in fullscreen it immediately jumps to my laptop screen, regardless of whether it was open on the TV or the laptop. I don't remember changing or installing anything that could have caused this.
I checked a bit in the AMD VISION software and the Windows Control Panel, but I can't seem to solve the problem without making the TV my main display. I also did a quick search on Google as well as here, but the problems found were mainly with Flash, which isn't the cause here.
Here are my system specs:
Toshiba Satellite L75D, Windows 7 Home Premium 64-bit, AMD Radeon 6520G
I have just done a clean install of Windows 7 Ultimate from XP, and I am unable to get my dual monitors to work like they did in XP.
I have a DVI Radeon graphics card plugged into the AGP slot and an NVIDIA GeForce VGA adapter plugged into a PCI slot of my Dell OptiPlex 755. In the screen resolution settings, the second monitor cannot be detected.
In the BIOS I have two options under Primary Video Adapter: Auto and Onboard Card. When set to Auto, the NVIDIA card gets a signal but the Radeon does not. There are no conflicts in Device Manager, and all of the drivers are up to date.
When I change the BIOS option to Onboard Card, the Radeon adapter gets a signal, but the other monitor cannot be detected, and in Device Manager there is a yellow exclamation mark next to Standard VGA Adapter with a Code 10 error stating the device could not be started.
I have powered down and unplugged every cable. I also tried using the integrated VGA adapter (the Intel G31/G33/Q33/Q35 Graphics Controller), but then the computer will not even boot. I get:
"System Halted
Attention Unsupported Video Configuration Detected"
I have two monitors; both work fine standalone, but Windows will not detect either one as a secondary.
Please, someone help me. I am so used to having my helpdesk email open on one monitor and all of my other work on the other.
I recently had an older HP Pavilion Media Center m7760n desktop PC rebuilt. The old power supply fried the motherboard, so I had to get a new power supply and motherboard. Here are my current specs:
Mainboard: Asus P5QPL-VM EPU
Chipset: Intel G41
Processor: Intel Core 2 Duo E6420 @ 2133 MHz
Physical Memory: 2048 MB (2 x 1024 MB DDR2 SDRAM)
Video Card: Intel(R) G41 Express Chipset (Microsoft Corporation - WDDM 1.1)
Hard Disk: WDC (1000 GB)
As you can see from the above, the "video card" is actually integrated into the motherboard. The computer works perfectly except for one major problem.

I have two monitors: a 22" at 1680 x 1050 resolution and a 15" at 1024 x 768. At the back of my computer I have a VGA port and a DVI port; the 15" is connected to the VGA port and the 22" to the DVI port. When first starting the computer, the 15" was recognized as the primary monitor and the 22" as the secondary. No problem: I simply went to the display settings and made the 22" the primary and the 15" the secondary. Unfortunately, this setting seems to reset as soon as I reboot. The 15" is always set as the primary monitor on startup, forcing me to apply the proper settings all over again. What's worse, even after I have set things properly, they sometimes revert when using Media Center or other programs. Worse yet, sometimes both monitors go completely black, as if the monitor settings were about to switch but got locked up somehow.

I'm assuming that the onboard video treats the VGA port as primary and the DVI port as secondary, but even so, shouldn't Windows 7 be able to override this and save my settings so the monitor configuration stays the same through startup and regular usage?
I'm using a television (32") as a second monitor in extended mode so that I can watch a movie on the TV and play a game on the monitor (this was my main goal). The monitor and TV are in two different rooms, both connected to the same PC, one by a normal VGA cable and the other by HDMI. I managed to split the audio output so that VLC sends its audio to the HDMI (so only the TV plays it) while the rest of the system sounds, media players, and games go to the speakers (basically, only the VLC audio is directed to another output device). I reached my goal: I can watch a movie fullscreen on the TV and play a game fullscreen on the monitor without any interference between the two (neither audio nor video).
The thing is, because the TV is in another room, I can't actually see what's going on on it; I just "throw" the VLC window over to the TV from my main monitor. And here's the question: is there a way to see the TV's desktop on my monitor, without having to set it as the main monitor, so I don't really have to switch between desktops? The perfect thing would be to see the TV's desktop in a window, like in remote desktop applications.
I have AT&T DSL and it just drops out. I have a 2Wire router, and the DSL and Internet lights flash red when it drops. I have had this problem for almost a year, and AT&T will run a useless test and tell me everything is fine. I have searched for three days trying to find a broadband monitor that will tell me when the connection drops and for how long, and how many times it happened while I was at work or just not on the PC.
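Since a ready-made tool is hard to find, a small script can do the logging itself. Here is a minimal Python sketch; the ping target (8.8.8.8) and the 30-second poll interval are arbitrary choices of mine, not anything specific to AT&T or the 2Wire router:

```python
import os
import subprocess
import time
from datetime import datetime

def is_up(host="8.8.8.8"):
    """One ping; True if the host answered ('-n' on Windows, '-c' elsewhere)."""
    flag = "-n" if os.name == "nt" else "-c"
    return subprocess.run(["ping", flag, "1", host],
                          capture_output=True).returncode == 0

def summarize_outages(samples):
    """Collapse a list of (timestamp, up?) samples into (start, end) outages."""
    outages, down_since = [], None
    for stamp, up in samples:
        if not up and down_since is None:
            down_since = stamp                   # connection just dropped
        elif up and down_since is not None:
            outages.append((down_since, stamp))  # connection came back
            down_since = None
    if down_since is not None:                   # still down at the last sample
        outages.append((down_since, samples[-1][0]))
    return outages

def run(poll_seconds=30):
    """Poll until Ctrl+C, then print every outage with its start and end."""
    log = []
    try:
        while True:
            log.append((datetime.now(), is_up()))
            time.sleep(poll_seconds)
    except KeyboardInterrupt:
        for start, end in summarize_outages(log):
            print(f"down from {start:%H:%M:%S} to {end:%H:%M:%S}")
```

Calling `run()` before leaving for work and stopping it with Ctrl+C afterwards prints one line per outage with its start and end time; writing the samples to a file instead would also survive a reboot, which this sketch does not handle.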
Is there any way I can thoroughly monitor a user account in Windows 7? What the user has changed, what program he opened, how long it ran, when he closed it, what other program he opened, etc., and present the data in a logged format.
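Windows 7 can capture part of this on its own: enabling "Audit process tracking" in the Local Security Policy makes the Security event log record an event when a process starts (ID 4688) and when it exits (ID 4689). What's left is turning those raw events into readable run times. Here is a hedged Python sketch of that summarization step; the simple (timestamp, event ID, program) tuple input is my own assumed format for events already exported from the log, and the sketch ignores overlapping instances of the same program:

```python
from datetime import datetime

PROCESS_START = 4688   # Security log: "a new process has been created"
PROCESS_EXIT = 4689    # Security log: "a process has exited"

def session_report(events):
    """events: chronological list of (timestamp, event_id, program) tuples.
    Returns {program: total seconds it was running}."""
    open_since = {}    # program -> time it was last started
    totals = {}
    for stamp, event_id, program in events:
        if event_id == PROCESS_START:
            open_since[program] = stamp
        elif event_id == PROCESS_EXIT and program in open_since:
            ran = (stamp - open_since.pop(program)).total_seconds()
            totals[program] = totals.get(program, 0) + ran
    return totals
```

For example, a notepad.exe start at 9:00 and exit at 9:05 would contribute 300 seconds to its total. What files the user changed is a separate audit category ("Audit object access") and is not covered by this sketch.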
I've installed Windows 7 Ultimate on my MSI PR200 (MS-1221), which has the Intel(R) 965 Express Chipset Family and the current latest BIOS, version 1.5. It works perfectly. However, I want to set my screen to shut down after 2 minutes (whether plugged into AC or in battery mode), but it does not work. I've installed the drivers available online at the MSI website, yet the problem is not gone.
I'm just about to have an Acer Revo delivered; it's for my bedroom TV. It's got Linux on it, but it'll get Win 7 installed on it. Now, what's the best way to install Win 7? Would it be better to connect it to a monitor, install 7 from there, and then connect it to the TV? Or connect it to the TV and install 7 that way? Would it recognize the TV straight away as the monitor?
I just built my computer and am having some trouble with my monitor and its connection to the computer. The computer takes a DVI cable, whereas my monitor takes VGA. Both can use HDMI, but the problem is that I recently read that HDMI doesn't display an image until Windows has booted, so I'm currently trying to install Windows from the disc without any picture on my monitor. How can I do this? I know I could use a TV or monitor with VGA, or get a DVI-to-VGA cord, but I kinda want to get it done as is.
When I had Windows XP on one of my PCs, I was able to use my 36" Trinitron CRT as a monitor for some old-school gaming and for watching movies and TV episodes, using a 7-pin to component connection. It was the greatest thing for SD-quality material, since HD stretches, lags, and gives different color quality. Now, with Windows 7, my NVIDIA driver won't let me do it the way I want to. Before, I could do this nicely at 640x480 and 800x600 with a 60 Hz refresh rate.

I am using a GeForce 9400 GT with driver version 181.20 (I'm sure the XP driver was a lower version number). When I connected the cable to the TV and opened the NVIDIA Control Panel, it immediately changed the settings to Dualview without asking me. I switched the display to the TV, and the only resolution that works is 720x480. 640x480 was not available in the control panel, but it was in the Windows settings. At 640x480 it only gives me up to 30 Hz interlaced, and when I change it to 800x600 it also only offers up to 30 Hz interlaced. The higher rates, 59 or 60 Hz, are non-interlaced, which won't support 480i.
Is there a way to go from monitor 1 (the laptop) and monitor 2 (a flatscreen) to the TV that's hooked up with an S-Video cable, without having to shut down, unplug monitor 2, and restart? When using Vista I could do this using the AVI program.
Suddenly my Win 7 Ultimate won't recognize my ViewSonic VG2230wm LCD monitor. Installing the driver doesn't help. It always shows a non-PnP monitor and a maximum resolution of 1024x768, when it should be 1680x1050. The hardware ID shows just a default monitor. However, Win 7 recognizes my other monitor correctly. I'm out of options.
OK, so I am at work, and whenever I want to snap a window to fill the left side of the monitor, for some reason it maximizes and centers across both monitors. Does anyone know how to fix this? I know that probably didn't make much sense, but I attached a picture to show what I am talking about: I was trying to maximize the window by clicking its title bar and dragging it all the way to the top, but it ends up centered in the middle of the two screens.
My new PC with Windows 7 won't detect my old Acer V243W monitor. I was hoping to use dual monitors, with this monitor connected via VGA to the motherboard and my other monitor connected by DVI to the graphics card. The graphics card I am using is a Radeon HD 6870; it has DVI ports and an HDMI port, but no VGA port.
Even when the Acer is plugged into the motherboard by itself using VGA, it does not get detected.
Is it simply a case of getting an adapter that will convert the VGA lead from the Acer monitor to a DVI plug I can then connect to the graphics card?
Or perhaps there is a way of enabling VGA support, maybe via the graphics card or something? It seems odd to have a VGA socket on the motherboard if it is not going to work.
I have an ATI HD 6950 with two DVI monitors and one HDMI TV. Whenever I press Win+P and choose "Computer Only," the picture goes to the HDMI output, which is not what I want. I've tried messing with Catalyst, trying to change the identities of the monitors, but the Computer Only option always goes to HDMI because it's identified as number 1. When I choose "Projector Only," the picture goes back to my two DVI monitors in extended mode. How do I change this?
I want "Computer Only" to put the picture on just one of my DVI monitors, and "Projector Only" to put it on the TV.
My problem is that if I connect the computer via DVI to a display, it functions normally, but when I hook it up via HDMI to my flat screen, it doesn't show the splash screen that gives you the option to enter the BIOS. The screen just stays black until the Windows welcome screen (Win 7 64-bit) comes up; from that point on it acts totally normal. When I press the Del key during boot, I believe it is entering the BIOS, but the screen remains black, so I can't see anything. I am guessing that the motherboard is not sending a signal to the flat screen until Windows loads, but I can't find anything that confirms this or says what to do to resolve it.