Tag Archives: benchmark

Improving WebXPRT-related tools and resources

As we move forward with the WebXPRT 4 development process, we’re also working on ways to enhance the value of WebXPRT beyond simply updating the benchmark. Our primary goal is to expand and improve the WebXPRT-related tools and resources we offer at WebXPRT.com, starting with a new results viewer.

Currently, users can view WebXPRT results on our site in two primary ways, each of which has advantages and limitations.

The first way is the WebXPRT results viewer, which includes hundreds of PT-curated performance scores from a wide range of trusted sources and devices. Users can sort entries by device type, device name, device model, overall score, date of publication, and source. The viewer also includes a free-form filter for quick, targeted searches. While the results viewer contains a wealth of information, it does not let users view and compare multiple results at once with graphs or charts. It also offers no easy way to access the additional test-device details and subtest scores we have for many entries.
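To make the discussion concrete, here’s a minimal TypeScript sketch of how a free-form filter and score-based sort over a set of result entries could work. The record shape and field names are our illustrative assumptions, not the viewer’s actual data model.

```typescript
// Hypothetical shape of a results-viewer entry; these field names are
// illustrative assumptions, not the viewer's actual data model.
interface ResultEntry {
  deviceType: string;
  deviceName: string;
  deviceModel: string;
  overallScore: number;
  published: string; // publication date, e.g., "2021-07-16"
  source: string;
}

// Free-form filter: keep entries whose text fields contain the query.
function filterResults(entries: ResultEntry[], query: string): ResultEntry[] {
  const q = query.toLowerCase();
  return entries.filter((e) =>
    [e.deviceType, e.deviceName, e.deviceModel, e.source].some((field) =>
      field.toLowerCase().includes(q)
    )
  );
}

// Sort comparator: overall score, highest first.
const byScore = (a: ResultEntry, b: ResultEntry) => b.overallScore - a.overallScore;
```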

The second way to view WebXPRT results on our site is the WebXPRT Processor Comparison Chart. The chart uses horizontal bar graphs to compare test scores from the hundreds of published results in our database, grouped by processor type. Users can click the average score for a processor to view all the WebXPRT results we currently have for that processor. The chart’s visual presentation and its automated “group by processor type” feature are very useful, but the chart lacks the viewer’s sorting and filtering capabilities, and navigating to the details of an individual test takes multiple clicks.
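Under the hood, the chart’s grouping boils down to a group-by-and-average operation over the results data. Here’s a minimal sketch of that idea; the record shape and the processor field are assumptions for illustration.

```typescript
// Group results by processor and average the overall scores, mirroring
// the chart's "group by processor type" behavior. The record shape is
// assumed for illustration.
function averageByProcessor(
  entries: { processor: string; overallScore: number }[]
): Map<string, number> {
  const sums = new Map<string, { total: number; count: number }>();
  for (const e of entries) {
    const s = sums.get(e.processor) ?? { total: 0, count: 0 };
    s.total += e.overallScore;
    s.count += 1;
    sums.set(e.processor, s);
  }
  const averages = new Map<string, number>();
  for (const [proc, { total, count }] of sums) {
    averages.set(proc, total / count);
  }
  return averages;
}
```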

In the coming months, we’ll be working to combine the best features of the results viewer and the comparison chart into a single powerful WebXPRT results database tool. We’ll also be investigating ways to add new visual aids, navigation controls, and data-handling capabilities to that tool. We want to provide a tool that helps testers and analysts access the wealth of WebXPRT test information in our database in an efficient, productive, and enjoyable way. If you have ideas or comments about what you’d like to see in a new WebXPRT results viewing tool, please let us know!

Justin

We welcome your CloudXPRT results!

We recently published a set of CloudXPRT Data Analytics and Web Microservices workload test results submitted by Quanta Computer, Inc. The Quanta submission is the first set of CloudXPRT results that we’ve published using the formal results submission and approval process. We’re grateful to the Quanta team for carefully following the submission guidelines, enabling us to complete the review process without a hitch.

If you are unfamiliar with the process, you can find general information about how we review submissions in a previous blog post. Detailed, step-by-step instructions are available on the results submission page. As a reminder for testers who are considering submitting results for July, the submission deadline is tomorrow, Friday, July 16, and the publication date is Friday, July 30. We list the submission and publication dates for the rest of 2021 below. Please note that we do not plan to review submissions in December, so if we receive results submissions after November 30, we may not publish them until the end of January 2022.

August

Submission deadline: Tuesday 8/17/21

Publication date: Tuesday 8/31/21

September

Submission deadline: Thursday 9/16/21

Publication date: Thursday 9/30/21

October

Submission deadline: Friday 10/15/21

Publication date: Friday 10/29/21

November

Submission deadline: Tuesday 11/16/21

Publication date: Tuesday 11/30/21

December

Submission deadline: N/A

Publication date: N/A

If you have any questions about the CloudXPRT results submission, review, or publication process, please let us know!

Justin

A huge milestone for XPRT runs and downloads!

We’re excited to have recently passed an important milestone: one million XPRT runs and downloads! Most importantly, that huge number does not just reflect past successes. As the chart below illustrates, XPRT use has grown steadily over the years. In 2021, we have recorded, on average, more XPRT runs per month (23,395) than we recorded in our entire first year of tracking these stats (17,051).

We reached one million runs and downloads in about seven and a half years. At the current rate, we’ll reach two million in roughly three and a half more years. With WebXPRT 4 on the way, there’s a good chance we can reach that mark even sooner!
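For the curious, the back-of-the-envelope math behind that projection looks like this, assuming the 2021 monthly average holds steady:

```typescript
// Projection check using the figures from this post; assumes the 2021
// monthly average stays constant going forward.
const runsPerMonth = 23_395;             // 2021 monthly average
const remainingToTwoMillion = 1_000_000; // runs/downloads still needed
const years = remainingToTwoMillion / (runsPerMonth * 12);
console.log(years.toFixed(1)); // ≈ 3.6 years
```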

As always, we’re grateful to all the testers who have helped us reach this milestone. If you have any questions or comments about using any of the XPRTs to test your gear, let us know!

Justin

Publishing CloudXPRT results from testing on pre-production gear

We recently received questions about whether we accept CloudXPRT results submissions from testing on pre-production gear, and how we would handle any differences between results from pre-production and production-level tests.  

To answer the first question, we are not opposed to pre-production results submissions. We realize that vendors often want to include benchmark results in launch-oriented marketing materials they release before their hardware or software is publicly available. To help them do so, we’re happy to consider pre-production submissions on a case-by-case basis. All such submissions must follow the normal CloudXPRT results submission process and undergo vetting by the CloudXPRT Results Review Group according to the standard review and publication schedule. If we decide to publish pre-production results on our site, we will clearly note their pre-production status.

In response to the second question, the CloudXPRT Results Review Group will handle any challenges to published results or perceived discrepancies between pre-production and production-level results on a case-by-case basis. We do not currently have a formal process for challenges; anyone who would like to initiate a challenge or express comments or concerns about a result should address the review group via benchmarkxprtsupport@principledtechnologies.com. Our primary concern is always to ensure that published results accurately reflect the performance characteristics of production-level hardware and software. If it becomes necessary to develop more policies in the future, we’ll do so, but we want to keep things as simple as possible.

If you have any questions about the CloudXPRT results submission process, please let us know!

Justin

CrXPRT 2 battery life error on Chrome 89 and 90

In recent lab tests, we’ve encountered an error during the CrXPRT 2 battery life test that prevents the test from completing and producing a battery life estimate. As the screenshot below shows, when the error occurs, CrXPRT stops running its normal workload cycle and produces a “Test Error” page. We have seen this behavior on systems running Chrome OS v89.x and v90.x across multiple vendor platforms. In our testing, Chromebooks running Chrome OS v88.x and earlier versions continue to complete the battery life test without any issues.

The error occurs consistently on every Chromebook running v89.x or v90.x that we’ve tested so far. However, the timing of the error varies from run to run on the same system. Sometimes, CrXPRT stops running after only a few workload iterations, while at other times, the battery life test runs almost to completion before producing the error.
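Testers who want to avoid losing a multi-hour run can check the browser’s major version before starting a test. Below is a minimal sketch of that check (not part of CrXPRT itself); it simply parses the Chrome major version from the user agent string.

```typescript
// Minimal sketch (not part of CrXPRT): warn before starting a long
// battery-life run if the browser is on an affected major version.
function chromeMajorVersion(userAgent: string): number | null {
  const match = userAgent.match(/Chrome\/(\d+)/);
  return match ? parseInt(match[1], 10) : null;
}

const affectedMajors = [89, 90];
const major = chromeMajorVersion(navigator.userAgent);
if (major !== null && affectedMajors.includes(major)) {
  console.warn(`Chrome OS v${major}.x may hit the CrXPRT 2 battery life error.`);
}
```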

We’re actively investigating this problem, but have not yet identified the root cause. We apologize for the inconvenience that this error may be causing CrXPRT 2 testers. As soon as we identify the cause and have ideas about possible solutions, we will share that information here in the blog. If you have any insight into recent Chrome OS changes or flag settings that could be causing this problem, please let us know!

Justin

The WebXPRT 4 tech press feedback survey

Device reviews in publications such as AnandTech, Notebookcheck, and PCMag, among many others, often feature WebXPRT test results, and we appreciate the many members of the tech press who use WebXPRT. As we move forward with the WebXPRT 4 development process, we’re especially interested in learning what longtime users would like to see in a new version of the benchmark.

In previous posts, we’ve asked people to weigh in on the potential addition of a WebAssembly workload or a battery life test. We’d also like to ask experienced testers some other test-related questions. To that end, this week we’ll be sending a WebXPRT 4 survey directly to members of the tech press who frequently publish WebXPRT test results.

Regardless of whether you are a member of the tech press, we invite you to participate by sending your answers to any or all of the questions below to benchmarkxprtsupport@principledtechnologies.com. Please send your responses by the end of May.

  • Do you think WebXPRT 3’s selection of workload scenarios is representative of modern web tasks?
  • How do you think WebXPRT compares to other common browser-based benchmarks, such as JetStream, Speedometer, and Octane?
  • Are there web technologies that you’d like us to include in additional workloads?
  • Are you happy with the WebXPRT 3 user interface? If not, what UI changes would you like to see?
  • Are there any aspects of WebXPRT 2015 that we changed in WebXPRT 3 that you’d like to see us change back?
  • Have you ever experienced significant connection issues when testing with WebXPRT?
  • Given the array of workloads, do you think the WebXPRT runtime is reasonable? Would you mind if the average runtime were a bit longer?
  • Are there any other aspects of WebXPRT 3 that you’d like to see us change?

If you’d like to discuss any topics that we did not cover in the questions above, please feel free to include additional comments in your response. We look forward to hearing your thoughts!

Justin
