There are some basic issues to consider. Wattage must be high enough to run everything, and for a modern machine that often means 500 watts or more. However, people buy 500-650 watt power supplies and still have problems, because not all wattage is created equal. Amperage matters just as much, and arguably more, on modern machines. Your video card in particular will have an amp requirement, stated right in the hardware specs as something like "Requires a minimum of 28A on the 12V rail."
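To see why the amp rating eats into the wattage so quickly, here is a rough back-of-the-envelope calculation (watts = volts × amps), using the 28A figure from the hypothetical spec wording above:

```python
# Back-of-the-envelope: convert a 12V-rail amp requirement into watts.
# The 28A figure is the hypothetical example from the spec wording above.
RAIL_VOLTAGE = 12.0  # volts

def rail_watts(amps: float, volts: float = RAIL_VOLTAGE) -> float:
    """Power (W) = voltage (V) x current (A)."""
    return amps * volts

gpu_min_amps = 28.0
print(f"{gpu_min_amps}A on the 12V rail = {rail_watts(gpu_min_amps):.0f}W")
# -> 28.0A on the 12V rail = 336W: already two-thirds of a "500W" supply
#    before the CPU, drives, and fans draw anything.
```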
Now, here is the tricky part: many modern power supplies split their amperage across multiple 12V rails. This was originally meant to distribute the current evenly, making sure no single rail could be overloaded. However, no one checked with the video card companies, who had figured out how to dynamically vary the card's power draw based on use. That gave us all those neat features like cards that overclock themselves when you start a game and fans that change speed with the card's temperature. It also means those shared rails struggle to balance a load that spikes up and down, often causing graphical slowdowns and hard drive errors.
This means a power supply may say "30A" and then split that 30A across four load-balancing 12V rails, which will not be enough for a video card that demands a "minimum of 30A on the 12V rail." You will get tons of errors. That leaves you two choices: shoot for about 5 amps more than your card calls for, which may be pretty hard to find, or find a power supply that delivers the amperage you need on a single rail, as in the sketch below.
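Here is a minimal sketch of that selection rule. The function, the rail figures, and the 5A headroom default are just illustrations of the rule of thumb described above, not real product specs:

```python
# Sketch of the selection rule above: either one 12V rail meets the
# card's minimum on its own, or the supply's combined 12V amperage
# runs about 5A above the requirement. Rail layouts are hypothetical.

def psu_ok(rail_amps: list[float], gpu_min_amps: float,
           headroom: float = 5.0) -> bool:
    """Return True if this 12V rail layout can feed the card safely."""
    if max(rail_amps) >= gpu_min_amps:
        return True  # choice 2: a single rail covers the requirement
    # choice 1: shared rails, so demand ~5A of combined headroom
    return sum(rail_amps) >= gpu_min_amps + headroom

gpu_min = 30.0
print(psu_ok([30.0], gpu_min))                   # single 30A rail -> True
print(psu_ok([18.0, 18.0, 18.0, 18.0], gpu_min)) # 72A shared total -> True
print(psu_ok([7.5, 7.5, 7.5, 7.5], gpu_min))     # "30A" split 4 ways -> False
```

The last case is exactly the trap described above: the box says 30A, but no rail, and not even the combined total with headroom, can actually satisfy a card that wants 30A on one rail.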