I am running Windows 7 Ultimate on a MacBook Pro, and I would like to use the laptop screen as well as my ASUS 23" LCD monitor. When I open the Display > Screen Resolution window, it shows that I have only one display and gives me no options for multiple displays. The same image is displayed on both screens. My 2nd monitor is hooked up through DVI and worked just fine with Windows XP. Does anyone know what my problem might be?
Suddenly my Windows 7 Ultimate won't recognize my ViewSonic VG2230WM LCD monitor. Installing the driver doesn't help. It always shows a non-PnP monitor and a maximum resolution of 1024x768, which should be 1680x1050. The hardware ID shows just a default monitor. However, Windows 7 recognizes my other monitor correctly. I'm out of options.
I am running a GeForce GT 240 in an HP Pentium 4 quad core with a 1 TB hard drive and a 19" Emerson LCD HD TV/PC monitor, and I can't get the resolutions to match without having to resize the desktop to a smaller resolution. Also, not all of my resolutions are available.
So I went on a three-day trip. When I got home, I turned on my computer. The cooling unit made an unusual noise, and all of a sudden the monitor turned off, as well as the computer. Now, whenever I turn on the computer, it works perfectly fine (the cooling-unit noise is gone too), but the monitor just shows the ASUS boot logo and tells me "VGA no signal". Even that only happens sometimes; mostly it just remains in sleep mode. Things I have done:
- Tried with a different monitor
- Tried with a DVI cable
I wish I could tell you my graphics card but I just don't remember.
I have had a dual-monitor setup with my laptop (Inspiron 1545 with a GM45 Intel card) extending to a regular LCD screen. It worked for two months but recently stopped working. I have uninstalled and reinstalled the newest drivers, twice over, changed the cables (power supply and VGA), done a system restore back a month to when it was working, and changed the power outlet. This originally worked flawlessly, and now the laptop is not recognizing any device I attach via the VGA output. I have plugged the laptop into another monitor, to no avail. I find it hard to believe that this would stop working one day with no notice or any problems. I have also tried attaching a projector that had worked with the laptop for over a year, and have gotten nothing.
I receive a black screen, as if something is being output, but nothing is displayed. Neither the Windows Screen Resolution dialog nor the Intel GUI recognizes another device.
So I just got a new Gateway DX4860-UB32P tower. The problem I am having is that when I try to use a second monitor for an extended desktop, the HDMI output is not recognized. When I start the computer, both monitors show the boot screen and the Starting Windows screen. Past that, only the monitor plugged in by VGA cable is recognized. When I go to graphics options, there is only one screen option shown. I am using Windows 7; the second monitor is a flat-screen TV.
I have a three-monitor setup. Whenever I open a browser, either Internet Explorer or Google Chrome, on monitor 2 or 3, after a few seconds it automatically drags my browser onto my main monitor (monitor 1, where my Start button and taskbar are located).
I have a setup with 6 monitors. Today my main monitor broke, and I couldn't find a way to shut the PC down, since I couldn't access the main desktop to get to the Start menu. Is there a way to change which monitor is the main monitor without having access to the current main monitor?
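For background on what "changing the main monitor" actually does: Windows defines the primary monitor as the one whose top-left corner sits at (0,0) of the virtual screen, and tools that switch the primary (ultimately via the Win32 `ChangeDisplaySettingsEx` call with the `CDS_SET_PRIMARY` flag) shift every monitor's position so the chosen one lands at the origin. This is a minimal sketch of just that coordinate arithmetic, with made-up monitor positions:

```python
def repoint_primary(positions, new_primary):
    """Shift all monitor origins so positions[new_primary] becomes (0, 0).

    In Windows' virtual-screen coordinate space, whichever monitor sits
    at (0, 0) is the primary, so making a monitor primary means shifting
    every monitor by the negation of its current top-left corner.

    positions: list of (x, y) top-left corners, one per monitor.
    Returns a new list of shifted (x, y) corners.
    """
    px, py = positions[new_primary]
    return [(x - px, y - py) for x, y in positions]

# Hypothetical example: three 1920x1080 monitors side by side,
# with monitor 0 currently primary. Make monitor 1 primary:
positions = [(0, 0), (1920, 0), (3840, 0)]
print(repoint_primary(positions, 1))
# → [(-1920, 0), (0, 0), (1920, 0)]
```

In practice you would read the real positions with `EnumDisplaySettings` and write the shifted values back with `ChangeDisplaySettingsEx`; the sketch only shows the arithmetic those calls perform.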
I am running a dual monitor setup with my TV hooked up via HDMI to my laptop.
Everything was working fine before: I always had the laptop as my main display, yet I could play a video or a game fullscreen on the TV.
For the past couple of days, as soon as I put something in fullscreen, it immediately goes to my laptop, regardless of whether it's open on my TV or my laptop. I don't remember changing or installing anything that could've caused that...
I checked a bit in the AMD Vision software and the Windows Control Panel, but I can't seem to solve my problem without switching my TV to be the main display. I also made a quick search on Google as well as here, but the problems were mainly with Flash, which isn't involved here.
Here are my system specs:
Toshiba Satellite L75D
Windows 7 Home Premium 64-bit
AMD Radeon 6520G
I have just done a clean install of Windows 7 Ultimate from XP, and I am unable to get my dual monitors to work like they did in XP.
I have a DVI Radeon graphics card plugged into the AGP slot and an Nvidia GeForce VGA adapter plugged into a PCI slot of my Dell Optiplex 755. When I am in the screen resolution settings, the second monitor cannot be detected.
In the BIOS I have 2 options under Primary Video Adapter: Auto and Onboard Card. When set to Auto, the Nvidia card gets a signal but the Radeon does not. There are no conflicts in Device Manager, and all of the drivers are up to date.
When I change the BIOS option to Onboard Card, the Radeon adapter gets a signal but the other monitor cannot be detected, and in Device Manager there is a yellow exclamation mark next to Standard VGA Adapter with a code 10 error stating the device could not be started.
I have powered down and unplugged every cable. I also tried using the integrated VGA adapter (Intel G31/G33/Q33/Q35 Graphics Controller), but the computer will not even boot. I get:
"Attention: Unsupported Video Configuration Detected"
I have two monitors; both work fine standalone, but Windows will not detect either as a secondary.
Please help me, someone. I am so used to having my helpdesk email open on one monitor and all of my other work on the other.
This computer is running an ATI Radeon HD 2400 Pro. It does support dual monitors (I've had dual monitors active on this card before, but never on Windows 7). But since installing Windows 7, I can't even get it to detect the second monitor. I want to run the setup with the CRT as the primary monitor and the HD as the secondary.
I recently had an older HP Pavilion Media Center m7760n desktop PC rebuilt. The old power supply fried the motherboard, so I needed to get a new power supply and motherboard. Here are my current specs:
Mainboard: Asus P5QPL-VM EPU
Chipset: Intel G41
Processor: Intel Core 2 Duo E6420 @ 2133 MHz
Physical Memory: 2048 MB (2 x 1024 MB DDR2-SDRAM)
Video Card: Intel(R) G41 Express Chipset (Microsoft Corporation - WDDM 1.1)
Hard Disk: WDC (1000 GB)
As you can see from above, the "video card" is actually integrated into the motherboard. The computer works perfectly except for one major problem. I have 2 monitors: one is 22" with 1680 x 1050 resolution and the other is 15" with 1024 x 768 resolution. At the back of my computer I have a VGA port and a DVI port. The 15" is connected to the VGA port and the 22" is connected to the DVI port.

When first starting the computer, the 15" monitor was recognized as the primary monitor and the 22" as the secondary. No problem: I simply went to the display settings and set the 22" to be the primary and the 15" to be the secondary. Unfortunately, this setting seems to reset as soon as I reboot the computer. The 15" is always set as the primary monitor on startup, forcing me to apply the proper settings all over again. What's worse, even after I have set the proper settings, they sometimes revert when using Media Center or other programs. Worse yet, sometimes both monitors go completely black, as if the monitor settings were about to switch but got locked up somehow.

I'm assuming that perhaps the onboard video has a primary port (VGA) and a secondary port (DVI), but even so, shouldn't Windows 7 be able to override this and save these settings so that the monitor configuration stays the same during startup and regular usage?
I'm using a 32" television as a second monitor in extended mode so that I can watch a movie on the TV and play a game on the monitor (this was my main goal). The monitor and TV sit in two different rooms, both connected to the same PC: one by a normal VGA cable and the other by HDMI. I managed to separate the audio output so that VLC sends its audio to the HDMI output (so only the TV plays it) while the rest of the system sounds, media players, and games go to the speakers (basically, only the VLC audio is directed to another output device). I reached my goal: I can watch a movie fullscreen on the TV and play a game fullscreen on the monitor without any interference between the two (neither audio nor video).
The thing is, because the TV is in another room, I can't actually see what's going on on it; I just "throw" the VLC window over to the TV from my main monitor. And here's the question: is there a way to see the TV's desktop on my monitor, without having to set it as the main monitor (so as not to actually switch desktops)? The perfect thing would be if I could see the TV's desktop in a window, like in remote desktop applications.
I have AT&T DSL and it just drops out. I have a 2Wire router, and the DSL and Internet lights flash red when it drops. I have had this problem for almost a year, and AT&T will run a useless test and tell me everything is fine. I have searched for 3 days trying to find a broadband monitor to tell me when the connection drops and for how long, and also how many times it happens while I am at work or just not on the PC.
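In the absence of a dedicated tool, a small script that pings a well-known host on a timer and logs outage intervals can serve as a rough connection monitor. This is a minimal sketch, not a polished product; the host address, probe interval, and log-file name are placeholder choices, and the ping count flag shown is the Windows one.

```python
import subprocess
import time
from datetime import datetime

HOST = "8.8.8.8"         # placeholder: any reliably reachable address
INTERVAL = 30            # seconds between probes
LOGFILE = "outages.log"  # hypothetical log-file name

def is_up(host=HOST):
    """Return True if a single ping to `host` succeeds."""
    # '-n 1' sends one echo request on Windows; use '-c 1' on Linux/macOS.
    result = subprocess.run(["ping", "-n", "1", host], capture_output=True)
    return result.returncode == 0

def summarize(samples):
    """Turn a list of (timestamp, up) samples into (start, duration) outages.

    An outage begins at the first 'down' sample and ends at the next
    'up' sample; an outage still in progress at the end is not reported.
    """
    outages = []
    down_since = None
    for ts, up in samples:
        if not up and down_since is None:
            down_since = ts
        elif up and down_since is not None:
            outages.append((down_since, ts - down_since))
            down_since = None
    return outages

def main():
    """Probe forever, appending one line per outage to LOGFILE."""
    down_since = None
    with open(LOGFILE, "a") as log:
        while True:
            up, now = is_up(), time.time()
            if not up and down_since is None:
                down_since = now
            elif up and down_since is not None:
                log.write("%s down for %.0f s\n"
                          % (datetime.fromtimestamp(down_since), now - down_since))
                log.flush()
                down_since = None
            time.sleep(INTERVAL)

if __name__ == "__main__":
    main()
```

Left running while you are at work, the log answers both questions in the post: when the connection dropped and for how long, one line per drop.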
We are running a small network with a NAS and networked PCs, all running Windows 7 Ultimate. One PC will open files that are already open by others and gets no notification that the file is in use. That user enters information into the file, saves, and closes. When the original user (who accessed the file first and sometimes adds information) saves and closes the file, the changes from the second user are lost.
I have a pretty old Compaq Presario that was running 2 x 512 MB sticks, so I added 2 x 1 GB sticks today, but it still shows only 1 GB being utilized. I put the new sticks in the original slots and moved the old ones over, so it is not an installation issue. I ran CPU-Z and all four sticks show up in the SPD tab, but the 2 new ones only have values in the timings table under JEDEC #1; the 2 old ones have JEDEC #1, #2, and #3.
I want to ask: my Windows cannot recognize the files on my CD-R. I put in my CD-R and Windows recognizes it as a blank CD-R, even though there are files (an autorun file and a software installer) on it. I have a PC and a laptop. When I put the CD-R in my PC (Windows XP SP2), Windows recognizes the files, but when I put it in my laptop (Windows 7 64-bit), it is recognized as a blank CD-R.
I was given some external hard drives with enclosures and was told they all work. I can't get a single one to be recognized in Computer. Any tips or workarounds to get Windows to recognize these drives? I am using an Acer 64-bit laptop.
I want to install an HP LaserJet 1020 printer on a PC with Windows 7 32-bit. When I plug it in, I receive a message like "USB device not recognized". I tested the USB port with a USB memory stick, and the port is functioning. This printer had worked a few times with this PC; now it never connects. I uninstalled the printer driver and re-installed it: same problem. I tried the printer on another system and it works. Is it possible that the USB port is not sourcing enough current for the printer?
Does it report the total amount of RAM in general, or does it get specific? Sorry if it's a confusing question, but I have 1 GB of RAM on this laptop and some of it is used for video RAM, so the total is a little under 1 GB. How will Windows 7 report this: as "your PC has 1 GB of RAM" or "your PC has 9** MB"?
I have been poring over countless threads on issues related to this, yet none of the solutions seem to be working for me, and I am fairly certain that I am implementing them correctly. The Xbox and PC do recognize each other, and I can access the libraries on my computer via the Xbox. I have set up Windows Media Center on my Xbox. WMC on my computer shows the Xbox is set up as an extender; it also states that it is ready for use. Every time I try to run WMC on my Xbox, it enters the black screen that says "Windows Media Center" with the loading "bubbles" underneath and the word "Contacting..."; after a period, the screen times out and gives me the message: "Connection Failure. The Xbox could not connect to the Windows Media Player PC. Turn the console off and back on again, and try again."
I have just found, to my irritation, that iTunes _will_ check CD info such as song names etc. ... however ...
Every now and then it fails to find info. It almost seems as if it does not even try. Usually when I insert a CD, it tells me it's checking with Gracenote, and in short order the details pop up. Then suddenly it will just not get details. I have read the instructions and reasons why.
What does not make sense is that in two instances, disc 2 of a double-disc set will register but disc 1 will not. This has happened with Physical Graffiti by Led Zeppelin and a best-of by Faith No More. Of the half-dozen or so CDs that have failed so far, the others were single-disc. The failures persisted in spite of repeated attempts, restarts, and even a reboot. All the failures were pretty popular albums, while what I thought was far more obscure stuff worked fine.
EDIT: I now have three double-CD sets where the first disc fails while the second gets info. So that case is over-represented, but not exclusive; that is to say, I have had double-CD sets that worked, and single CDs that have failed to get info from the web.
I can't connect to the internet on my desktop PC after I reinstalled Windows. I have a Linksys WRT320N router, and I can connect to it only by Wi-Fi. When I try to install the router's setup wizard or Network Magic, it says that my computer is not connected to the router, though it is connected the same way it was before. Under Local Area Connection it also says that the network cable is unplugged. I also tried connecting the internet cable directly, without the router; the PC didn't react to that either. Maybe the problem is that my network adapter (Nvidia nForce) is missing the appropriate drivers? I can't find drivers for the Nvidia nForce networking controller for Windows 7.