Category: Collaborative benchmark development

An example of the community in action

Last week, I hosted a webinar on HDXPRT; we'll make a recording of it available on the site fairly soon. Multiple members attended. As I was going through the slides and discussing various aspects of the benchmark, a member asked about installing the benchmark from a USB key or a server. My response was the simple truth: we hadn't considered that approach. As I then elaborated, we clearly should have, because those capabilities would be useful in just about every production lab out there, including ours here at PT. I concluded by saying that we'd look into it.

I'm not naming the member only because, with big companies, I'm never sure whether doing so will reflect well on someone or get them in trouble, and I don't want to create a hassle for anyone. He should, though, feel free to step forward and claim the well-deserved credit for the suggestion.

Less than a week after the webinar, I'm happy to report that the team has done more than look into these capabilities; it has implemented them! So the next beta release, Beta 2, which we'll be releasing any time now (maybe even before we post this blog entry), lets you install the benchmark from a network share or a USB key.

I know this is a relatively small thing, but I think it bears reporting because it is exactly the way the community should work. A member brought the benefits of his experience to bear in a great bit of feedback, and now the benchmark is better for it—and so are all of us who use it.

Keep the good ideas coming!

Mark Van Name


Our community’s goal

Computer system performance evaluation has a long and complex history. Many of the earliest tests were simple, short code snippets, such as Whetstone, that did little more than indicate how fast a particular computer subsystem could operate. Unfortunately, such simple benchmarks quickly lost their value, in part because they were very crude measures, and in part because the software tools on the systems they measured could easily optimize for them. In some cases, a compiler could even recognize a test and "optimize" the code by simply producing the final result!
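To make that concrete, here is a minimal sketch (our own illustration, not code from Whetstone or any shipping benchmark) of the kind of naive timing loop an optimizing compiler can defeat. Compilers such as GCC and Clang at -O2 can recognize that the loop computes a closed-form sum and may replace the whole thing with a single expression, so the "work" being timed never actually runs:

```c
/* A naive "benchmark" loop that an optimizer can collapse.
 * Because the trip count is a compile-time constant and the result
 * is used only once, scalar-evolution analysis can reduce the loop
 * to the closed form n * (n - 1) / 2, computed at compile time. */
#include <stdio.h>

int main(void)
{
    long long sum = 0;

    for (long long i = 0; i < 100000000; i++)
        sum += i;

    /* Any time measured around the loop above may reflect an empty
     * loop, or no loop at all, rather than 100 million additions. */
    printf("sum = %lld\n", sum);
    return 0;
}
```

Working around this sort of thing, by feeding benchmarks runtime inputs and consuming their results in ways a compiler can't predict, is part of why performance tests had to grow more sophisticated.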

Over time, though, benchmarks have become more complex and more relevant. Whole organizations exist, and have existed, to build benchmarks. Notable ones include the Ziff-Davis Benchmark Operation (ZDBOp), which the Ziff-Davis computer magazines funded in the 1990s and which Mark and I ran; the Standard Performance Evaluation Corporation (SPEC), which its member companies fund and of which PT is a member; and the Business Applications Performance Corporation (BAPCo), which its member companies also fund. Each of these organizations has developed widely used products, such as Winstone (ZDBOp), SPEC CPU (SPEC), and SYSmark (BAPCo). Each has also faced challenges. Ziff-Davis, for example, could no longer support the costs of developing its benchmarks, so it discontinued the group. SPEC continues to develop good benchmarks, but its process sometimes means years pass between versions.

The goal with HDXPRT and the HDXPRT Development Community (HDC) is to explore a new way to develop benchmarks. By drawing on the expertise and experience of a community of interested people, we hope to be able to develop benchmarks in an open and collaborative environment while keeping them timely.

HDXPRT 2011 is the first test of this approach. We believe that it, its subsequent versions, and other benchmarks to follow will give the industry a new model for creating world-class performance-measurement tools.

If you’re not a member of the HDC, please consider joining us and helping define the future of performance evaluation.

Bill

