Interview with ATI's Greg Ellis

March 31, 2005 / by Leonidas / Page 1 of 2


When asking an IHV for an interview, you are usually referred to PR managers or to developers from the driver department. It is much harder to get access to somebody from the benchmark laboratory, but we had the opportunity to conduct an extensive e-mail interview with Greg Ellis, manager of ATI's benchmarking team in Toronto, Canada ...


3DCenter:  What does your usual job look like? What are your main duties?

Greg Ellis:  I manage a benchmark lab which deals with all of the ATI PC products - integrated, mobile and desktop.

There are certainly other performance labs at ATI, serving the needs of developers and designers internally. My lab is the one that is most often called upon to produce performance data for publication.

I have four staff working for me, running benchmarks and games and analyzing the results. We test a wide variety of current and older ATI product configurations as well as competitor products.

We conduct extensive testing on new Catalyst drivers across approximately 15 different ATI product configurations in order to generate the Performance Highlights for the Catalyst release notes.

We also produce the Benchmark Guides that accompany new products for review. These contain reference scores, so reviewers can check that the products are performing correctly, as well as some tips about benchmarking particular titles, and a summary of what we have been testing recently in the ATI Benchmark Lab.

Periodically, we do some modelling work to predict the performance of upcoming parts.


3DCenter:  From which stage of product development onward do you have internal test results that show the real-world performance of a product?

Greg Ellis:  Performance is a consideration in product design right from the start. We need to be aware of the demands of the market and the likely maneuvers of our competition long before we settle on a chip architecture. This means we are already thinking about performance more than two years before a product is available for sale.

At several stages during product development, performance is measured on both software and hardware simulators, to make sure we are on the right track. As soon as we have chips, we start measuring the real performance of the part. Often, at this point, we are not certain of the final clocks, so we explore performance results across a variety of clock configurations.
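To illustrate how measurements at engineering clocks might be projected onto candidate final clocks, here is a minimal sketch in Python. It is our own simplification, not ATI's methodology: the figures are made up, and the assumption of near-linear fps scaling with core clock only roughly holds for GPU-limited workloads.

```python
# Hypothetical illustration: estimating scores at candidate final clocks
# from measurements taken at a few engineering clock configurations.
# Assumes the workload is GPU-limited, so fps scales roughly linearly
# with core clock between nearby measurement points.

measured = {400: 58.2, 450: 64.9, 500: 71.1}  # core MHz -> fps (made-up data)

def estimate_fps(clock_mhz):
    """Linearly interpolate/extrapolate fps from the nearest measurements."""
    points = sorted(measured.items())
    for (c0, f0), (c1, f1) in zip(points, points[1:]):
        if c0 <= clock_mhz <= c1:
            t = (clock_mhz - c0) / (c1 - c0)
            return f0 + t * (f1 - f0)
    # Outside the measured range: extrapolate from the closest segment.
    (c0, f0), (c1, f1) = points[:2] if clock_mhz < points[0][0] else points[-2:]
    return f0 + (clock_mhz - c0) * (f1 - f0) / (c1 - c0)

for candidate in (430, 475, 520):
    print(f"{candidate} MHz -> ~{estimate_fps(candidate):.1f} fps")
```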

It is not until we are almost ready to ship the product that all of the pieces finally fall into place. Final clocks are set in a manner that accomplishes the desired chip yield and also the desired performance. Final tuning of the BIOS occurs around this time, and a final driver is selected from amongst a few candidates. We often have only a day or two to generate final reference scores between the time these decisions are made and the product is launched to the media and to the world.


3DCenter:  How diversified is your benchmark test field, especially compared to typical hardware reviews, which contain only three or four benchmarks?

Greg Ellis:  We generate data in an automated fashion for perhaps 40 different benchmark tests. Normally we look at three to six resolutions per product, and various combinations of AA and AF, depending on how powerful the product is. Older, lower-end products (RADEON 7000, for instance) receive a somewhat simpler set of tests, while newer top-end boards are tested more exhaustively.
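To put that matrix in perspective, here is a minimal sketch that enumerates the configurations a single high-end board might be run through. It is our own construction, not ATI's tooling; every title and setting is a placeholder.

```python
from itertools import product

# Hypothetical test matrix for one high-end board; all names here are
# placeholders, not ATI's actual test plan.
benchmarks  = [f"title_{i:02d}" for i in range(40)]        # ~40 automated tests
resolutions = ["1024x768", "1280x1024", "1600x1200"]       # 3-6 per product
aa_af_modes = [("noAA", "noAF"), ("4xAA", "8xAF"), ("6xAA", "16xAF")]

runs = list(product(benchmarks, resolutions, aa_af_modes))
print(f"{len(runs)} benchmark runs for a single board/driver combination")
# 40 * 3 * 3 = 360 runs -- before repeating across ~15 product configurations
```

Multiplied across the roughly 15 product configurations tested per Catalyst release, it becomes clear why the data gathering has to be automated.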

We also conduct a great deal of manual testing, using tools like FRAPS to evaluate performance in new game titles that don't have benchmarks built-in. We think this is probably the most important part of our work, since our end-users are most interested in real game performance and not benchmark scores.
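Manual FRAPS runs produce per-frame timing logs rather than a single score, so a small script is needed to turn them into comparable numbers. A minimal sketch, assuming a plain text file with one cumulative frame timestamp in milliseconds per line (the real FRAPS frametimes CSV has a header line, so adjust the parsing accordingly):

```python
def summarize_frametimes(path):
    """Compute average fps and a worst-case figure from cumulative
    per-frame timestamps (milliseconds), one value per line."""
    with open(path) as f:
        stamps = [float(line) for line in f if line.strip()]
    # Convert cumulative timestamps into per-frame durations.
    durations = [b - a for a, b in zip(stamps, stamps[1:])]
    total_s = (stamps[-1] - stamps[0]) / 1000.0
    avg_fps = len(durations) / total_s
    worst_frame = max(durations)           # longest single frame, in ms
    print(f"{avg_fps:.1f} fps average, worst frame {worst_frame:.1f} ms")

# summarize_frametimes("frametimes.txt")  # hypothetical log file
```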


3DCenter:  Which benchmarks, beyond the well-known titles, can you recommend?

Greg Ellis:  Our method of selecting titles for benchmark analysis is simple. We look for new titles (released this year) that have great-looking graphics and enjoyable gameplay. Popularity is also important - a strong-selling game means many more users out there who will benefit (or suffer) from our performance in that title.

We try to include a selection of titles from different genres. Most benchmarking we see in reviews typically focuses on first-person shooters, but there are lots of gamers out there playing RTS, racing simulator, RPG and sports titles.

You will find a list of the titles we selected for analysis in the X700 Benchmark Guide. This is just a starting point, of course. It is our hope that other analysts will follow the same basic principles of selection (new titles with good graphics, good gameplay and popularity) to come up with their own list of titles to evaluate.


3DCenter:  How often do you discover performance anomalies, meaning benchmark results that shouldn't be possible? Do you research them until you find a solution?

Greg Ellis:  Sure - this happens all the time, both inside of our lab and elsewhere. Normally it's a problem with the system configuration, or the driver installation, or an invalid assumption.

An example of the last case might be the Half-Life 2 engine - some early versions of Counter-Strike: Source would auto-configure a high-end NVIDIA board for 6xAA, a mode that is not supported on that hardware. The resulting scores would be startlingly fast, especially when compared to an ATI product that was actually rendering in 6xAA. Once we figured out what was happening, it was relatively simple to address.
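Such cases suggest a simple automated sanity check: a score should not improve when a strictly heavier AA mode is requested. A minimal sketch of such a check (our own illustration; the scores and tolerance are made up):

```python
# Flag results that "shouldn't be possible": enabling a heavier AA mode
# should never make a GPU-limited benchmark meaningfully faster.
scores = {"noAA": 92.4, "2xAA": 81.0, "4xAA": 70.3, "6xAA": 95.8}
modes  = ["noAA", "2xAA", "4xAA", "6xAA"]   # ordered lightest to heaviest

TOLERANCE = 1.02  # allow ~2% run-to-run noise

for lighter, heavier in zip(modes, modes[1:]):
    if scores[heavier] > scores[lighter] * TOLERANCE:
        print(f"Anomaly: {heavier} ({scores[heavier]}) faster than "
              f"{lighter} ({scores[lighter]}) -- is {heavier} really applied?")
```

Here the suspicious 6xAA score would be flagged immediately, pointing at exactly the kind of silently unsupported mode described above.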


3DCenter:  To what extent do you evaluate benchmarks yourselves? Or do you trust what comes from the game developers or from the web?

Greg Ellis:  We spend a fair amount of time testing benchmarks - it's one of our primary responsibilities, after all. There are lots of other groups within ATI that are very interested in benchmarks, and we all work together to a certain extent, sharing data and analysis.

We are of course open to new benchmarks and benchmarking methods, and if we find something interesting on the review sites or coming from the game developers, we will evaluate it in our lab.


3DCenter:  Do you talk to game developers about integrating additional benchmark functionality into their games? Are these talks successful?

Greg Ellis:  I am occasionally in contact with developers, but not very much. There are lots of people at ATI whose full-time job is to talk to developers about whatever they need help with, and certainly we bring up the subject of benchmarking from time to time.


3DCenter:  Why is there no (better) co-operation with experienced benchmarkers at hardware websites? Especially a company that states its hardware runs well with the majority of games (and not only with a few benchmarks) should be interested in spreading benchmarks (or benchmark guidelines).

Greg Ellis:  The Reviewer's Guide that we distribute with the review samples does actually include that kind of information. This document lists new games, suggestions on how to benchmark them, general benchmarking advice using different tools, how to benchmark video acceleration, and more. But of course we can't dictate which benchmarks reviewers use; we can only give advice and food for thought.





