We’re excited to have recently passed an important milestone: one million XPRT runs and downloads! Most importantly, that huge number does not just reflect past successes. As the chart below illustrates, XPRT use has grown steadily over the years. In 2021, we have recorded, on average, more XPRT runs in a single month (23,395) than we recorded during the entire first year we tracked these stats (17,051).
We reached one million
runs and downloads in about seven and a half years. At the current rate, we’ll
reach two million in roughly three and a half more years. With WebXPRT 4 on the way, there’s a good chance we can reach that mark even sooner!
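For readers who want to check the arithmetic, here’s a minimal back-of-the-envelope sketch in Python. It’s purely illustrative: it assumes the 2021 monthly average of 23,395 runs holds steady and ignores downloads, which would only shorten the timeline.

# Rough estimate of time to the next million runs,
# assuming the 2021 average monthly run rate continues.
monthly_runs = 23395           # average runs per month in 2021
remaining = 1_000_000          # runs needed to go from one million to two million
months = remaining / monthly_runs
print(f"{months:.1f} months (~{months / 12:.1f} years)")   # ~42.7 months, ~3.6 years

That rough figure lines up with the three-and-a-half-year estimate above.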
As always, we’re grateful to all the testers who have helped us reach this milestone. If you have any questions or comments about using any of the XPRTs to test your gear, let us know!
It’s
been a while since we last discussed the process for submitting WebXPRT results
to be considered for publication in the WebXPRT results browser
and the WebXPRT Processor Comparison Chart, so we
thought we’d offer a refresher.
Unlike
sites that publish all results they receive, we hand-select results from
internal lab testing, user submissions, and reliable tech media sources. In
each case, we evaluate whether the score is consistent with general expectations.
For sources outside of our lab, that evaluation includes confirming that there
is enough detailed system information to help us determine whether the score
makes sense. We do this for every score on the WebXPRT results page and the
general XPRT results page.
All WebXPRT results we publish automatically appear in the processor comparison
chart as well.
Submitting your score is quick and easy. At the end of the WebXPRT test run, click the Submit your results button below the overall score, complete the short submission form, and click Submit again. The screenshot below shows how the form would look if I submitted a score at the end of a WebXPRT 3 run on my personal system.
After you submit your score, we’ll contact you to confirm how we should display
the source. You can choose one of the following:
Your first and last name
“Independent tester” (for those
who wish to remain anonymous)
Your company’s name, provided
that you have permission to submit the result on the company’s behalf. To use a
company name, we ask that you provide a valid company email address.
We will
not publish any additional information about you or your company without your
permission.
We look forward to seeing your score submissions, and if you have suggestions for the processor chart or any other aspect of the XPRTs, let us know!
A few weeks ago, we discussed an error that we’d recently started encountering during the CrXPRT 2 battery life test on systems running Chrome OS v89.x and later.
The error prevents the test from completing and producing a battery life
estimate. CrXPRT stops running its normal workload cycle and produces a “Test
Error” page. The timing of the error can vary from run to run. Sometimes,
CrXPRT stops running after only a few workload iterations, while other times,
the battery life test almost reaches completion before producing the error.
We have seen the error across multiple brands of Chromebooks running
Chrome OS v89.x and later. To our knowledge, Chromebooks running Chrome OS v88.x
and earlier versions complete the battery life test without issues. We are unaware
of any problems with the CrXPRT 2 performance test.
We’re continuing to investigate this problem. Unfortunately, we have not yet identified the root cause. Until we find a solution, we recommend that testers not use the CrXPRT 2 battery life test for now. We will post this recommendation on CrXPRT.com.
We apologize for the inconvenience that this error is causing CrXPRT 2 testers. As soon as we identify a possible solution, we will share that information here in the blog. If you have any insight into recent Chrome OS changes or flag settings that could be causing this problem, please let us know!
In early May, we sent
a survey to members of the tech press who regularly use WebXPRT in articles and
reviews. We asked for their thoughts on several aspects of WebXPRT, as well as what
they’d like to see in the upcoming fourth version of the benchmark. We also
published the survey questions here in the blog, and invited
experienced WebXPRT testers to send their feedback as well. We received some
good responses to the survey, and for the benefit of our readers, we’ve
summarized some of the key comments and suggestions below.
One respondent stated that WebXPRT is demanding enough to test
performance, but if we want to simulate modern web usage, we should find the
most up-to-date studies on common browser tasks and web technologies. This
suggestion lines up with our intention to study the feasibility of adding a WebAssembly workload.
One respondent liked the fact that unlike many other browser
benchmarks, WebXPRT tests more than just JavaScript calculation speed.
One respondent suggested that we include a link to a WebXPRT
white paper within the UI, or at least a guide describing what happens during
each workload.
One respondent stated that they would like for WebXPRT to
automatically produce a good result file on the local test system.
One respondent said that WebXPRT has a relatively long runtime
for a browser benchmark, and they would prefer that the runtime not increase in
WebXPRT 4.
We had no direct calls for a battery life test, because many
testers already have scripts and/or methodologies in place for battery testing,
but one tester suggested adding the ability to loop the test so users can measure
performance over varying lengths of time.
There were no requests to bring back any aspects of WebXPRT 2015
that we removed in WebXPRT 3.
There were no reports of significant connection issues when
testing with WebXPRT.
We greatly appreciate the members of the tech press who responded to the survey. We’re still in the planning stages of WebXPRT 4, so there’s still time for anyone to send comments or ideas to benchmarkxprtsupport@principledtechnologies.com. We look forward to hearing from you!
We recently received questions about whether we accept CloudXPRT
results submissions from testing on pre-production gear, and how we would handle
any differences between results from pre-production and production-level tests.
To answer the first question, we are not opposed to pre-production
results submissions. We realize that vendors often want to include benchmark
results in launch-oriented marketing materials they release before their
hardware or software is publicly available. To help them do so, we’re happy to
consider pre-production submissions on a case-by-case basis. All such submissions
must follow the normal CloudXPRT results
submission process, and undergo
vetting by the CloudXPRT Results Review Group according to the standard review
and publication schedule. If we decide to publish pre-production results on our site, we
will clearly note their pre-production status.
In response to the second question, the CloudXPRT Results Review Group will handle any challenges to published results or perceived discrepancies between pre-production and production-level results on a case-by-case basis. We do not currently have a formal process for challenges; anyone who would like to initiate a challenge or express comments or concerns about a result should address the review group via benchmarkxprtsupport@principledtechnologies.com. Our primary concern is always to ensure that published results accurately reflect the performance characteristics of production-level hardware and software. If it becomes necessary to develop more policies in the future, we’ll do so, but we want to keep things as simple as possible.
If you have any questions about the CloudXPRT results submission process, please let us know!
In recent lab tests, we’ve encountered an error during the CrXPRT 2 battery life test that prevents the test from completing and producing a battery life estimate. As the screenshot below shows, when the error occurs, CrXPRT stops running its normal workload cycle and produces a “Test Error” page. We have seen this behavior on systems running Chrome OS v89.x and v90.x, across multiple vendor platforms. In our testing, Chromebooks running Chrome OS v88.x and earlier versions continue to complete the battery life test without any issues.
The error occurs consistently on every Chromebook running v89.x or
v90.x that we’ve tested so far. However, the timing of the error varies from
run to run on the same system. Sometimes, CrXPRT stops running after only a few
workload iterations, while at other times, the battery life test runs almost to
completion before producing the error.
We’re actively investigating this problem, but have not yet
identified the root cause. We apologize for the inconvenience that this error
may be causing CrXPRT 2 testers. As soon as we identify the root cause of the
problem and have ideas about possible solutions, we will share that information
here in the blog. If you have any insight into recent Chrome OS changes or flag
settings that could be causing this problem, please let us know!