How to measure the performance of a graphics card?
A CPU’s performance is usually characterized by its architecture (16-bit, 32-bit, 64-bit, etc.), its number of cores, and its clock frequency.
What would be the equivalent for graphics cards?
I’ve always wondered, since most games state requirements like “XYZ or better”. But how can I tell how “good” XYZ is? Only by its price?
Note: I know about benchmarking, but that only shows the FPS in a specific scenario and also involves other factors like the CPU and RAM. It’s like measuring a CPU’s speed by saying “CPU XYZ generates X MD5 hashes per second” (which is roughly how it is measured, but that’s not the question).
Answer:
Graphics cards have a number of important statistics. Like CPUs, GPUs can be measured by clock speed, number of cores, and bus bandwidth. Additionally, the amount of dedicated graphics memory affects performance when running software that loads a lot of textures or other graphical data into memory. One statistic that can be derived from these hardware attributes is the fill rate: the number of pixels the card can render per second.
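To make the derived statistics concrete, here is a minimal sketch of how theoretical peak figures can be computed from a card’s published specs. The formulas (pixel fill rate as ROP count times clock, and memory bandwidth as bus width times effective transfer rate) are the commonly used textbook definitions; the specific numbers in the usage example are hypothetical, not taken from any real card.

```python
def pixel_fill_rate_gpix(rops: int, core_clock_mhz: float) -> float:
    """Theoretical pixel fill rate in gigapixels per second.

    Fill rate = number of raster output units (ROPs) * core clock.
    """
    return rops * core_clock_mhz * 1e6 / 1e9


def memory_bandwidth_gbs(bus_width_bits: int, mem_clock_mhz: float,
                         transfers_per_clock: int = 2) -> float:
    """Theoretical peak memory bandwidth in gigabytes per second.

    Bandwidth = bus width in bytes * memory clock * transfers per clock
    (2 for DDR-style memory, which transfers on both clock edges).
    """
    return (bus_width_bits / 8) * mem_clock_mhz * transfers_per_clock * 1e6 / 1e9


# Hypothetical example card: 32 ROPs at 1000 MHz, 256-bit DDR bus at 1750 MHz.
print(pixel_fill_rate_gpix(32, 1000))      # 32.0 Gpix/s
print(memory_bandwidth_gbs(256, 1750, 2))  # 112.0 GB/s
```

These are theoretical ceilings; real-world throughput is lower and depends on the workload, which is why the benchmarks discussed below remain the practical yardstick.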
Because so many different attributes affect graphics card performance, benchmarks really are the best way to compare cards on a general scale. If you want a card for a particular application, certain statistics may matter more than others, but overall performance is generally best measured by benchmarking.