The problem is that for some reason I cannot get a pixel-perfect picture on my Sony Bravia Full HD LCD TV with the Radeon 4890. There are always black borders around the screen.
I have the Catalyst 9.11 suite for Windows 7 (64-bit) installed.
With an Nvidia 8800 GT there is no need for any kind of settings; the picture is pixel-perfect 1080p straight away.
Any idea how to get the resolution / settings correct?
I'm trying to connect my IBM SL510 laptop to my Sony Bravia TV. I've seen online that an S-Video cable is required, but I thought I could use an HDMI cable. Also, I don't see an S-Video port on my laptop.
Since this is my first gaming PC, I know basically nothing about how long a graphics card will last before needing an upgrade to keep playing newer games. I'm getting a bit worried with the 6000 series already out, and I'm still using a 4000 series. I have the money, I just do not want to buy a new card until it will actually do me some good.
I have Windows 7 build 7600 x64 and am using the newest official Catalyst drivers from ATI for the Radeon 4890 1GB video card I own.
However, I have noticed some small issues in games. Two games I'm currently playing, Trine and Mass Effect, show what I'd call a tear (it's not anything missing), but when you're running around you see a visible line move its way up the screen, like you're watching a fast refresh on PC-Anywhere or something.
It doesn't affect the game at all or slow it down, and it's not there 100% of the time; it happens when you're running or moving side to side, etc. Always while in motion.
I thought maybe it was a vsync issue, so I checked that in the graphics settings for Mass Effect, but it made no difference. Does anyone know what I'm referring to, whether there's a proper name for it, and how I can go about reducing or eliminating it in games?
Trine does it as well when I'm running and jumping around (it's a side-scroller).
I booted Windows 7 a few minutes ago and saw something that made me think my graphics card might have a problem. Right before the log-on screen came up, I saw a lot of flickering/static, which I thought was really odd because the PC is connected to the monitor with a digital signal (DVI). The computer froze at this point and I had to reboot. I'm posting from the same PC, so needless to say the reboot worked.
I'm attempting to set up a home media system for my multi-handicapped son, whose passion in life is movies. This approach should greatly simplify the end-to-end process for him. OK, now for the problem and where I can use your help.

Here's my attempted setup:
- I'm ripping his DVD movies to MP4 using WinX-DVD Premium Ripper
- Storing the movies on an external 2TB hard drive connected to my Dell desktop running Windows 7 Ultimate, 64-bit
- Using Windows Media Center / Media Browser to stream and display the movies
- Running the movies through a new Xbox 360 (using it as a Media Center extender); the Xbox is connected to a Sony Bravia XBR-55 via HDMI (HDMI 2)

Here's the problem I'm encountering: I'm ripping all the DVDs using the exact same format, yet some of the movies display full screen while some of them only fill about 2/3 of the screen, right in the middle. These are full-screen vs. widescreen movies. The movies I'm experiencing this problem with work fine if:
- I run them from the DVD player (i.e. Google TV; these are NOT Blu-ray discs)
- I stream them from the computer via WMP straight to the Sony TV (this approach is supported, but with an antiquated interface)
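For what it's worth, the "2/3 of the screen" symptom is roughly what you get when a player preserves a narrower aspect ratio on a 16:9 panel instead of stretching it. A quick sketch of the arithmetic (illustrative only; the function name and numbers are mine, not from any player's documentation):

```python
from fractions import Fraction

def pillarbox_width(content_ar: Fraction, screen_ar: Fraction, screen_w: int) -> int:
    """Width in pixels that an image occupies when its aspect ratio is
    preserved (pillarboxed) on a wider screen."""
    return round(screen_w * content_ar / screen_ar)

# 4:3 ("full screen") DVD content on a 1920x1080 (16:9) panel:
w = pillarbox_width(Fraction(4, 3), Fraction(16, 9), 1920)
print(w)  # 1440, i.e. 3/4 of the panel width, centered with black bars
```

So if the extender honors the 4:3 flag in some rips but not others, identical rip settings can still produce different on-screen widths.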
My system has a 64-bit i3 processor, but I installed Windows 7 Ultimate 32-bit. My problem is that the graphics card is not working; what can I do? I have already installed the graphics card driver, yet it still does not work. The graphics card is an ATI Radeon Premium Graphics.
Just wondering if I'm the only one with this issue: the ATI Overdrive tab in my CCC is missing. Is there anyone else here with an ATI 4890 and Windows 7 x64 who can confirm this, so I know whether it's a local problem or, hopefully, a driver problem?
I bought my VGX-XL301 in August 2007; it came with Vista Home edition installed, and I upgraded to Windows 7 Professional in April 2010. I have the computer linked to my Samsung HDTV via an HDMI connection.
Everything worked fine up until recently.
I work on the computer and then switch HDMI inputs to watch TV. After a time the computer goes into sleep mode. The next time I want to use the computer, I press a key to bring it out of sleep mode and switch from the TV input to the PC input, but no picture comes up.
I have to force a shutdown by holding in the power button and then turn it back on. The PC then goes through boot-up and the picture appears; however, sometimes it takes 2 or 3 forced shutdowns to make it work.
I have upgraded the driver on the graphics card as I thought that might be the problem.
I have a Diamond ATI HD 5670 graphics card installed, and whenever I try to connect my Toshiba 40E220U LCD HDTV to the computer it comes back saying "No Video Signal". I tried what the HDTV manual said, but it still won't recognize my HDTV. What should I do?
I just finished a new machine with an EVGA NVIDIA GeForce 9800 GTX+ card. The card has two DVI outlets and one S-Video outlet. I'm using a Sony Bravia 32" XBR as a monitor.
Using a DVI-to-HDMI adapter gives me a not-so-good picture.
Using a DVI-to-VGA adapter and connecting to the TV's "PC IN" port gives me a better picture, but still nothing all that great.
Using my Dell laptop, with NVIDIA GeForce 9400M G video and an HDMI outlet gives me a great picture when I go HDMI to HDMI.
I know I am not happy with what I have, but I don't know what to do next.
My 1st thought, based on the laptop experience, is to get a video card with an HDMI outlet.
My 2nd thought is to try the S-Video connection, which I would do if I had a cable.
My 3rd thought was to come here and ask advice from people who probably can come up with more than 3 thoughts at a time.
I've been trying to connect my laptop to an HDTV. I have the required (I think) cable; it connects well to both the laptop and the TV, and I can see the image on the TV screen, but the image on the TV is small and centered, probably half the size of the screen itself. I want to stretch it, but I don't know how. I'm really not sure whether the problem is the laptop or the TV itself, though. I don't know what specifics to include; I can add anything that might be required. The TV is 1080p, and the maximum resolution on the laptop is 720p. I tried connecting the cable to all 4 HD inputs on the TV, and some give different results: some are more stretched than others, but none are full screen. I also tried changing the TV options.
Does anyone use their 1080p HDTV as a monitor? If so, do you find that the fonts, icons, and the rest of the display appear too small, even though you have more room on screen?
Hey, I'm running Win7 64-bit. My laptop has the Nvidia 330M. I have a standard HDMI cable hooked up to my HDTV. I get a perfect picture, but no audio, and I don't see any option to enable it.
I recently fixed and reinstalled my OS. Once I did, my sound simply stopped working. I have updated all my drivers, set the default playback device, checked the sound manager, and even used headphones. Nothing. I have both a PC monitor cable and an HDMI cable; neither one works. Even though I hear no sound, Realtek HDA still shows audio playing.
I want to know: can I set a 1080p HDTV resolution on my monitor? There's an option to add the 1080p24 format as a display mode, but when I select it I get the message "Forcing a display mode that exceeds the reported capabilities of its EDID may result in a blank screen or permanently damage the display." Currently I am using the 720p HDTV resolution.
I am using my Sony Bravia LCD's PC jack for my PC's display (for the moment, until I get a longer HDMI cable). The screen is severely shifted, and I cannot adjust it just by using the horizontal control; it is still out of range. Is there another way to adjust it via Windows drivers to minimize the shift (like in Linux)? Any help would be great.
Oh, BTW, I'm also using the latest ATI driver, 9.10. I had the same problem with the Windows driver.
I have recently bought a Philips 32" HDTV, and it is advertised as 1080p.
Now, when I tried connecting it to my computer, it says that 1366x768 is the native resolution, not 1920x1080. I have even tried creating a custom 1080p resolution to see if it works, but no luck.
So, how do I get this TV to work at 1080p?
Or better, how do I check whether it really is 1080p?
i need to know the best settings and best player for quality. toshiba 40g300u 40inch tv hooked up to my acer computer with hdmi port. the acer has a geforce 7100 with latest graphics drivers. its just i dont understand if i need to run 16-255 or 0-255 and how to adjust the settings in the nvidia control panel.
I would like to stream HDTV content from my PC to the Media Center on my Xbox 360. Technically it's no big deal; however, I cannot reach the streaming performance necessary for this type of content.

My setup:
- Router: D-Link 524 (100 Mbit), tried wireless and cable
- PC: Windows 7 64-bit, connected to the router via an 85 Mbit powerline adapter
- Xbox 360: connected to the router either wirelessly or through a cable

With this setup, Windows Media Center tells me that I do not have enough bandwidth for HDTV content; I barely reach the SD TV standard. Is it possible that the router settings are throttling the performance?
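A quick back-of-the-envelope check can rule the link capacity in or out before blaming the router settings. The numbers below are illustrative assumptions, not measurements: powerline adapters rated at 85 Mbit/s commonly deliver only a fraction of that in practice, while HD streams can need on the order of 20 Mbit/s.

```python
def can_stream(link_mbps: float, efficiency: float, stream_mbps: float) -> bool:
    """Whether a link's usable throughput covers a stream's bitrate.

    efficiency: fraction of the rated link speed actually achieved
    (assumed values below; measure your real throughput instead).
    """
    return link_mbps * efficiency >= stream_mbps

print(can_stream(85, 0.30, 20))  # 25.5 Mbit/s usable: marginal but enough
print(can_stream(85, 0.15, 20))  # 12.75 Mbit/s usable: only SD territory
```

If a real file-copy test over the powerline link lands near the second case, the bottleneck is the link itself, not the router configuration.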
I just built a new gaming rig with 8 GB of RAM. I made the mistake of installing WinXP Pro x32 on it, then found out that 32-bit OSes can't access 8 GB of RAM, so I went and bought an OEM Win7 64-bit disc. Come to find out now, Win7 x64 doesn't play well with many HDTVs; something about how they changed Win7 to not allow overriding something called the EDID signal from the HDTV. So now I'm stuck again: I've got 8 GB of RAM but no video. Great. What now?
Do I need to go and buy a third OS that will work this time, WinXP Pro x64? And no, it's not the video card, the video card drivers, or the TV; it all worked just fine with WinXP Pro x32 installed, and all the cables are plugged in tightly. Win7 x64 will output a video signal to a generic VGA monitor. If I set the resolution to something like 1024x768 and then swap the VGA cable from the VGA monitor to the HDTV, I get a video signal. As soon as I reboot, no video signal. I get video for the POST screen and the Win7 startup screen.
As soon as it tries to boot into the desktop, no video signal; my AKAI TV only says "Not Support". Win7 will not get past the HDTV EDID handshake. Win7 sucks. Now I find myself looking at the Windows website's HDTV "compatibility list" for Win7. It feels like trying to work my life around Bill Gates and his crappy software, serving Windows instead of it serving me. When will those people ever get it right again after WinXP? Is there any workaround that will make Win7 override the HDTV EDID handshake?
After connecting a VGA-to-VGA cable from my laptop running Windows 7 to the HDTV and scrolling to the PC input on the HDTV, a message comes up: "no device connected/no signal".
Can you tell me what setting in the Windows control panel I should use, or what other issues must be checked?
I just bought a Lenovo V570 laptop with VGA and HDMI output, and I cannot get the HDMI to output to my TV. The TV is a Vizio 47" 3D LCD, and the laptop is running Windows 7 Home 64-bit. I have tried Windows graphics drivers, Lenovo drivers, and Intel drivers; to me there is not much difference between them, but who knows. The laptop will output to the TV using VGA, and it will connect to an LG 42-inch LCD at my work using HDMI, just not to the one I have at home. I have tried everything I can think of as far as settings go, on my TV as well; nothing changes, whatever I do. Does anyone know of any hardware incompatibility between some Vizio TVs and Windows 7? I have seen one listed on the Windows compatibility website, but it was not my TV; mine is not listed. The model number for my TV is E3D470VX. I'm not sure what else to try. Next I guess I will call Vizio to see if they can work out the problem; Lenovo support doesn't know.