Interview with ATI's Greg Ellis

March 31, 2005 / by Leonidas / Page 2 of 2


3DCenter:  Does ATI possess a mainboard in the test lab which has both a PCI Express and an AGP slot (with the AGP slot connected directly to the Northbridge rather than via PCI), so that you can test AGP and PCI Express boards on the same mainboard?

Greg Ellis:  I certainly can't speak for all of ATI, but I don't have such a beast in my lab.

It's an interesting idea, but it would take some time to calibrate the results, to be sure that the PCI-E and AGP slots on such a motherboard produced performance that was comparable to the more traditional PCI-E-only and AGP-only systems.


3DCenter:  Do you have equipment to measure the exact power usage of a graphics adapter, or how do you measure this? And why does ATI not publicize the exact power draw? This is becoming more and more important. By the way, X-bit Labs had an interesting article about this.

Greg Ellis:  There are a number of ways to measure power usage, ranging from simple and cheap to complex and expensive. X-bit's approach seems like a good solution for their needs. Others have chosen simpler methods. Just as there is no right or wrong way to benchmark a particular game title, I don't think it's appropriate for us to dictate the "one and only correct method for measuring power consumption".

As for publicizing the "exact power draw" I think it would be difficult to arrive at a single exact number for this. Power draw varies by circumstance.


3DCenter:  How strongly do early measurements from the test lab influence the final clock rates of a new product?

Greg Ellis:  I'm not entirely sure how to answer this. Performance is an important factor in determining final clocks, but there are lots of other issues surrounding yields and heat and cost that also play a role.


3DCenter:  How early do you get access to competitors' hardware after its introduction?

Greg Ellis:  "As soon as we can" is about as specific as I can be. It's different every time. Obviously, we are keenly interested in what all of our competitors are doing. It's a big day for us when we get a new product into our labs, whether it's one of our own or something new from a competitor.


3DCenter:  How about adding to your benchmark guides a list of games which cause problems in one way or another? This would be important, for example, for games that do not support Anti-Aliasing and for games which offer their own Anisotropic Filtering settings (so that the Control Panel setting should be left at "application preference").

Greg Ellis:  I think if you were to read through one of our guides you would find that such lists are already included. If you are aware of any omissions or errors, please send us your feedback!


3DCenter:  What philosophy do you have in regards to benchmark results? The majority uses the mean of the measured results, others the second or third measured value. Personally, I always use the highest obtained value (since faster than the highest value is not possible).

Greg Ellis:  It really depends on the benchmark, and our experience with run-to-run variance in that particular test.

Some benchmarks produce the same result every time, so it's not necessary to look at multiple runs. Others wander considerably, so one of the methods you describe is required.

I don't think there is any single correct answer to this question. Any of the methods you mentioned are probably fine. It's good form to mention your method when you quote your results, just as you would mention AA settings, or the configuration options chosen from a game menu.
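To illustrate the aggregation methods mentioned in the question, here is a minimal Python sketch (with made-up frame-rate samples) showing how the mean, the second measured value, and the highest value condense the same set of runs into a single reported number:

    # Hypothetical frame-rate samples from five runs of the same timedemo
    # (values are invented for illustration).
    from statistics import mean

    runs_fps = [61.8, 63.2, 62.5, 62.9, 61.4]

    print(f"mean:       {mean(runs_fps):.1f} fps")  # average over all runs
    print(f"second run: {runs_fps[1]:.1f} fps")     # e.g. after a warm-up run
    print(f"highest:    {max(runs_fps):.1f} fps")   # faster than this is not possible

Whichever method is chosen, the point above stands: state it alongside the results, just like the AA settings or game configuration.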


3DCenter:  What do you think about the number obsession in parts of the hardware scene with regard to benchmark results of new products? At what level do measurable differences become genuinely relevant for your evaluation of a new graphics adapter?

Greg Ellis:  Performance is just one element, among many, that can be used to differentiate products. Certainly things like driver support, compatibility, price, ease of use, etc. must be considered as well.

As to what constitutes a "relevant" performance gap - there are many factors to consider, most of them subjective.

I think we can agree that at very low framerates (e.g. 6 fps vs. 5 fps) the difference is not meaningful, because both products are obviously inappropriate to the task and settings under test.

The same is true of very high framerates (e.g. 275 fps vs. 240), where neither choice is a bad one - both products are obviously quite good at whatever it is that is being tested.

In the middle ground, subjectivity becomes important. One user might find 50 fps to be infinitely preferable to 40 fps in one particular game, while at the same time noticing no difference whatsoever between 38 fps and 32 in another title. Another user might have a different opinion.

I've noticed a number of reviewers who make a point of including their subjective impressions along with the scores they quote. I think that's a good thing.

At the same time, I would not wish to discourage the community of enthusiasts who spend time and money tuning up their machines to get the best possible number in a particular test. This can be fun and exciting too.


3DCenter:  What do you think about how benchmarks are presented on the web? Too many or too few numbers, too much or too little commentary? Or do you have ideas for new kinds of presentations?

Greg Ellis:  Every site is different, with different tests covered, different presentation, etc.

Personally, I like to see a focus on testing that is relevant to the products in question. Framerates in the 30-90 range are more interesting to me than very high or very low results, simply because I think that differences within this range have a greater impact on the users' subjective experience of the products.

As for commentary... well, I always like to see what the reviewer was thinking while he used the product. How did it look on the screen? Why did he select the particular map or timedemo that he used? Why is he using this game instead of some other? When a difference in performance is present, what does he think it means? Will it significantly impact the users' experience with the product, or not?


3DCenter:  Finally, do you sometimes play in the benchmark lab or do you only benchmark?

Greg Ellis:  Are you trying to get me into trouble with my bosses? ;)

One of the great things about testing performance in real games is that you need to play around with the games a bit, to figure out what sort of performance is typical, how much performance changes from section to section, how the various graphics options offered work, etc. So yes, we have a chance to play a bit, to get a feel for the games before we start collecting data. And there's always after hours ...


... something Greg Ellis rightly deserves. We thank him for his time and for his interesting answers on this often-overlooked subject.





