
Tag Archives: test results

Improving WebXPRT-related tools and resources

As we move forward with the WebXPRT 4 development process, we’re also working on ways to enhance the value of WebXPRT beyond simply updating the benchmark. Our primary goal is to expand and improve the WebXPRT-related tools and resources we offer at WebXPRT.com, starting with a new results viewer.

Currently, users can view WebXPRT results on our site in two primary ways, each of which has advantages and limitations.

The first way is the WebXPRT results viewer, which includes hundreds of PT-curated performance scores from a wide range of trusted sources and devices. Users can sort entries by device type, device name, device model, overall score, date of publication, and source. The viewer also includes a free-form filter for quick, targeted searches. While the results viewer contains a wealth of information, it does not let users view or compare multiple results at once with graphs or charts. It also offers no easy way to access the additional test-device details and subtest scores that we have for many entries.

The second way to view WebXPRT results on our site is the WebXPRT Processor Comparison Chart. The chart uses horizontal bar graphs to compare test scores from the hundreds of published results in our database, grouped by processor type. Users can click the average score for a processor to view all the WebXPRT results we currently have for that processor. The visual aspect of the chart and its automated “group by processor type” feature are very useful, but it lacks the sorting and filtering capabilities of the viewer, and navigating to the details of individual tests takes multiple clicks.

In the coming months, we’ll be working to combine the best features of the results viewer and the comparison chart into a single powerful WebXPRT results database tool. We’ll also be investigating ways to add new visual aids, navigation controls, and data-handling capabilities to that tool. We want to provide a tool that helps testers and analysts access the wealth of WebXPRT test information in our database in an efficient, productive, and enjoyable way. If you have ideas or comments about what you’d like to see in a new WebXPRT results viewing tool, please let us know!

Justin

We welcome your CloudXPRT results!

We recently published a set of CloudXPRT Data Analytics and Web Microservices workload test results submitted by Quanta Computer, Inc. The Quanta submission is the first set of CloudXPRT results that we’ve published using the formal results submission and approval process. We’re grateful to the Quanta team for carefully following the submission guidelines, enabling us to complete the review process without a hitch.

If you are unfamiliar with the process, you can find general information about how we review submissions in a previous blog post. Detailed, step-by-step instructions are available on the results submission page. As a reminder for testers who are considering submitting results for July, the submission deadline is tomorrow, Friday July 16, and the publication date is Friday July 30. We list the submission and publication dates for the rest of 2021 below. Please note that we do not plan to review submissions in December, so if we receive results submissions after November 30, we may not publish them until the end of January 2022.

August

Submission deadline: Tuesday 8/17/21

Publication date: Tuesday 8/31/21

September

Submission deadline: Thursday 9/16/21

Publication date: Thursday 9/30/21

October

Submission deadline: Friday 10/15/21

Publication date: Friday 10/29/21

November

Submission deadline: Tuesday 11/16/21

Publication date: Tuesday 11/30/21

December

Submission deadline: N/A

Publication date: N/A

If you have any questions about the CloudXPRT results submission, review, or publication process, please let us know!

Justin

How to submit WebXPRT results for publication

It’s been a while since we last discussed the process for submitting WebXPRT results to be considered for publication in the WebXPRT results browser and the WebXPRT Processor Comparison Chart, so we thought we’d offer a refresher.

Unlike sites that publish all results they receive, we hand-select results from internal lab testing, user submissions, and reliable tech media sources. In each case, we evaluate whether the score is consistent with general expectations. For sources outside of our lab, that evaluation includes confirming that there is enough detailed system information to help us determine whether the score makes sense. We do this for every score on the WebXPRT results page and the general XPRT results page. All WebXPRT results we publish automatically appear in the processor comparison chart as well.

Submitting your score is quick and easy. At the end of the WebXPRT test run, click the Submit your results button below the overall score, complete the short submission form, and click Submit again. The screenshot below shows how the form would look if I submitted a score at the end of a WebXPRT 3 run on my personal system.

After you submit your score, we’ll contact you to confirm how we should display the source. You can choose one of the following:

  • Your first and last name
  • “Independent tester” (for those who wish to remain anonymous)
  • Your company’s name, provided that you have permission to submit the result in its name. To use a company name, we ask that you provide a valid company email address.


We will not publish any additional information about you or your company without your permission.

We look forward to seeing your score submissions, and if you have suggestions for the processor chart or any other aspect of the XPRTs, let us know!

Justin

The Introduction to CloudXPRT white paper is now available!

Today, we published the Introduction to CloudXPRT white paper. The paper provides an overview of our latest benchmark and consolidates CloudXPRT-related information that we’ve published in the XPRT blog over the past several months. It describes the CloudXPRT workloads, explains how to choose and download installation packages and how to submit CloudXPRT results for publication, and discusses possibilities for additional development in the coming months.

CloudXPRT is one of the most complex tools in the XPRT family, and there are more CloudXPRT-related topics to discuss than we could fit in this first paper. In future white papers, we will discuss in greater detail each of the benchmark workloads, the range of test configuration options, results reporting, and methods for analysis.

We hope that Introduction to CloudXPRT will give testers who are interested in CloudXPRT a solid foundation on which to build. Moving forward, we will provide links to the paper in the Helpful Info box on CloudXPRT.com and the CloudXPRT section of our XPRT white papers page.

If you have any questions about CloudXPRT, please let us know!

Justin

The CloudXPRT results viewer is live

We’re happy to announce that the CloudXPRT results viewer is now live with results from the first few rounds of CloudXPRT Preview testing we conducted in our lab. Here are some tips to help you navigate the viewer more efficiently:

  • Click the tabs at the top of the table to switch from Data analytics workload results to Web microservices workload results.
  • Click the header of any column to sort the data on that variable: single-click to sort A to Z, and double-click to sort Z to A.
  • Click the link in the Source/details column to visit a detailed page for that result, where you’ll find additional test configuration and system hardware information and the option to download results files.
  • By default, the viewer displays eight results per page, which you can change to 16, 48, or Show all.
  • The free-form search field above the table lets you filter for variables such as cloud service or processor.

We’ll be adding more features, including expanded filtering and sorting mechanisms, to the results viewer in the near future. We’re also investigating ways to present multiple data points in a graph format, which will allow visitors to examine performance behavior curves in conjunction with factors such as concurrency and resource utilization.

We welcome your CloudXPRT results submissions! To learn about the new submission and review process we’ll be using, take a look at last week’s blog.

If you have any questions or suggestions for ways that we can improve the results viewer, please let us know!

Justin

The CloudXPRT Preview results submission schedule

A few weeks ago, we shared the general framework of the periodic results publication process we will use for CloudXPRT. Now that the CloudXPRT Preview is live, we’re ready to share more details about the results review group; the submission, review, and publication cycles; and the schedule for the first three months.

The results review group
The CloudXPRT results review group will serve as a sanity check and a forum for comments on each month’s submissions. All registered BenchmarkXPRT Development Community members who wish to participate in the review process can join the group by contacting us via email. We’ll confirm receipt of your request and add you to the review group mailing list. If you are not yet a community member and would like to join the review group, contact us and we’ll help you become a member.

The submission, review, and publication cycle
We will update the CloudXPRT results database once a month on a published schedule. Testers can submit results through the CloudXPRT results submission page at any time, but we will close submissions for each review cycle two weeks prior to that cycle’s publication date. One week prior to each publication date, we will email details of that month’s submissions to the results review group, along with the deadline for sending post-publication feedback.

Schedule for the first three publication cycles
We will publish results to the database on the last business day of each month and will close the submission window at 11:59 PM on the business day that falls two weeks earlier (with occasional adjustments for holidays). The schedule will be available at least six months in advance on CloudXPRT.com.
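For readers planning ahead, the deadline arithmetic above (publication on the last business day of the month, with the submission window closing two weeks earlier) can be sketched in a few lines of Python. The function names here are just for illustration, and this sketch ignores the occasional holiday adjustments mentioned above:

```python
import calendar
from datetime import date, timedelta

def last_business_day(year: int, month: int) -> date:
    """Return the last weekday (Mon-Fri) of the given month."""
    d = date(year, month, calendar.monthrange(year, month)[1])
    while d.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
        d -= timedelta(days=1)
    return d

def cycle_dates(year: int, month: int):
    """Return (submission deadline, publication date) for one cycle."""
    pub = last_business_day(year, month)
    return pub - timedelta(weeks=2), pub

deadline, pub = cycle_dates(2020, 7)
print(deadline, pub)  # 2020-07-17 2020-07-31
```

The July 2020 output matches the first cycle in the schedule below, and the same calculation reproduces the other published dates.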

The schedule for the first three cycles is as follows:

July
Submission deadline: Friday 7/17/20
Publication date: Friday 7/31/20
August
Submission deadline: Monday 8/17/20
Publication date: Monday 8/31/20
September
Submission deadline: Wednesday 9/16/20
Publication date: Wednesday 9/30/20

As a reminder, members of the tech press, vendors, and other testers are free to publish CloudXPRT results at any time. We may choose to add such results to our database on the monthly publication date, after first vetting them.

We look forward to reviewing the first batch of results! If you have any questions about CloudXPRT or the results submission or review process, let us know!

Justin
