[quote1272637861=OG buckshot jr]
Referring to Hinouchi's question, as well as Cock's comments: size whore, sure, for the same money, who wouldn't be. But here's why. The "bit bus" in a system (in this case, a video card) is the width of the bus that carries data between the processor and the RAM (on a video card, between the GPU and the video memory). You could look at it a bit like a front-side bus, perhaps. Think of a water-pump system: a tank of water, and a pump drawing water out of the tank to fill a pool. If the hose you're using is really thin, it won't matter how fast or how many horsepower your pump is, because the thickness (or lack thereof) of the hose will be the limiting factor in how much water moves per minute (as an example of measurement).
Now, with high memory clocks and high core (GPU) clock speeds, a relatively narrow bus such as 128-bit becomes the limiting factor. Not so much on slower video cards, but as video cards keep getting faster, a wider bus of AT LEAST 256-bit will make a world of difference.
The same movement (from a narrower to a wider bit width) can also be seen in operating systems going from 32-bit to 64-bit. The design intent (a wider path for data) is similar, though strictly speaking the OS number is mostly about address and register width rather than a memory bus.
That's why I rank it as so important. All the good cards are a minimum of 256-bit, if not 384-bit.
[/quote1272637861]
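The hose analogy above boils down to simple arithmetic: peak memory bandwidth is bus width (in bytes) times the number of transfers per second, so doubling the bus width doubles the ceiling no matter how fast the "pump" is. A quick sketch (the 1000 MHz clock is just an illustrative number, not from any specific card):

```python
def peak_bandwidth_gbps(bus_width_bits, memory_clock_mhz, transfers_per_clock):
    """Theoretical peak memory bandwidth in GB/s."""
    bytes_per_transfer = bus_width_bits / 8          # bus width in bytes
    transfers_per_sec = memory_clock_mhz * 1e6 * transfers_per_clock
    return bytes_per_transfer * transfers_per_sec / 1e9

# Same hypothetical 1000 MHz clock, double-data-rate memory (2 transfers/clock):
print(peak_bandwidth_gbps(128, 1000, 2))  # 128-bit bus -> 32.0 GB/s
print(peak_bandwidth_gbps(256, 1000, 2))  # 256-bit bus -> 64.0 GB/s
```

Same clocks, same memory type, but the 256-bit "hose" moves twice the data.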
Did some googling and found out that "it is the same, since GDDR5 transfers twice the amount of data compared to GDDR3, hence it would be equivalent to doubling the bus width."
So in the end, it's the same shit.
But GDDR5 uses less power and such, + the newer cards come with DX11.
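The "same shit" conclusion checks out on paper: GDDR3 is double data rate (2 transfers per base clock), while GDDR5 effectively moves 4 per base clock, so at the same clock a 128-bit GDDR5 bus matches a 256-bit GDDR3 bus. A back-of-the-envelope check (the 1000 MHz clock is made up for illustration):

```python
# Peak bandwidth = (bus_bits / 8) * clock_hz * transfers_per_clock, in GB/s.
# GDDR3: 2 transfers per base clock; GDDR5: effectively 4 per base clock.
gddr3_256bit = 256 / 8 * 1000e6 * 2 / 1e9  # 64.0 GB/s
gddr5_128bit = 128 / 8 * 1000e6 * 4 / 1e9  # 64.0 GB/s
assert gddr3_256bit == gddr5_128bit        # same theoretical peak
```

So a narrower GDDR5 bus can keep up with a wider GDDR3 one; the bus width alone doesn't tell the whole story.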