Last month, we discussed a potential fix for the error that was preventing CrXPRT 2 testers from successfully completing battery life tests on systems running Chrome v89.x and later. Since then, we’ve been testing an updated, unpublished version of the app package across several Chromebook models to ensure that the new build is stable and produces consistent results. We’re happy to report that our testing was successful. We’ve published the new CrXPRT 2 build (v1.2.0.0) in the Chrome Web Store, and it is live as of 12:45 PM EDT today.
Note that it might take some time for the update to appear on your Chromebook and, once it does, you might have to manually approve the update notice.
Neither the tests nor the method of calculating the overall score and battery-life score has changed in this new build, so results are comparable with previous CrXPRT 2 results.
We appreciate everyone’s patience while we found a solution to the error. If you have any questions or comments about the CrXPRT 2 battery life test, please feel free to contact us!
Last week, we shared some new details about the changes we’re likely to make in WebXPRT 4, and a rough target date for publishing a preview build. This week, we’re excited to share an early preview of the new results viewer tool that we plan to release in conjunction with WebXPRT 4. We hope the tool will help testers and analysts access the wealth of WebXPRT test results in our database in an efficient, productive, and enjoyable way. We’re still ironing out many of the details, so some aspects of what we’re showing today might change, but we’d like to give you an idea of what to expect.
The screenshot below shows the tool’s default display. In this example, the viewer displays over 650 sample results—from a wide range of device types—that we’re currently using as placeholder data. The viewer will include several sorting and filtering options, such as device type, browser type, hardware specs such as processor vendor, and the source of the result.
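To give a concrete sense of what that kind of filtering and sorting involves, here’s a minimal TypeScript sketch. The record shape and field names (deviceType, browser, processorVendor, source, overallScore) are illustrative assumptions on our part, not the viewer’s actual data model.

```typescript
// Hypothetical result record; the field names are illustrative only.
interface ResultEntry {
  deviceType: string;        // e.g., "laptop", "phone", "tablet"
  browser: string;           // e.g., "Chrome", "Safari"
  processorVendor: string;   // e.g., "Intel", "AMD", "Qualcomm"
  source: string;            // e.g., "PT lab", "user submission"
  overallScore: number;
}

// Apply any filters the user has selected, then sort ascending by
// overall score (the lowest-to-highest order the graph uses).
function filterAndSort(
  results: ResultEntry[],
  filters: Partial<ResultEntry>
): ResultEntry[] {
  return results
    .filter((r) =>
      (Object.keys(filters) as (keyof ResultEntry)[]).every(
        (key) => r[key] === filters[key]
      )
    )
    .sort((a, b) => a.overallScore - b.overallScore);
}

// Example: show only laptop results tested in Chrome.
// const view = filterAndSort(allResults, { deviceType: "laptop", browser: "Chrome" });
```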
Each vertical bar in the graph represents the overall score of a single test result, and the graph presents the scores in order from lowest to highest. To view an individual result in detail, the user simply hovers over and selects the bar representing the result. The bar turns dark blue, and the dark blue banner at the bottom of the viewer displays details about that result.
In the example above, the banner shows the overall score (250) and the score’s percentile rank (85th) among the scores in the current display. In the final version of the viewer, the banner will also display the device name of the test system, along with basic hardware disclosure information. Selecting the Run details button will let users see more about the run’s individual workload scores.
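For readers who want a feel for how a percentile rank like that could be derived from the scores in the current display, here’s a short TypeScript sketch. It uses a simple “percentage of displayed scores at or below the selected score” definition, which is our own assumption for illustration rather than the viewer’s actual calculation.

```typescript
// Percentile rank of a selected score among the scores currently displayed,
// computed as the percentage of scores less than or equal to it.
// This simple definition is an assumption for illustration only.
function percentileRank(selectedScore: number, displayedScores: number[]): number {
  if (displayedScores.length === 0) return 0;
  const atOrBelow = displayedScores.filter((s) => s <= selectedScore).length;
  return Math.round((atOrBelow / displayedScores.length) * 100);
}

// Example: a score of 250 that sits at or above 85 percent of the
// displayed scores would report a percentile rank of 85.
```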
We’re still working on a way for users to pin or save specific runs. This would let users easily find the results that interest them, or possibly select multiple runs for a side-by-side comparison.
We’re excited about this new tool, and we look forward to sharing more details here in the blog as we get closer to taking it live. If you have any questions or comments about the results viewer, please feel free to contact us!
As we move forward with the WebXPRT 4 development process, we’re also working on ways to enhance the value of WebXPRT beyond simply updating the benchmark. Our primary goal is to expand and improve the WebXPRT-related tools and resources we offer at WebXPRT.com, starting with a new results viewer.
Currently, users can view WebXPRT results on our site in two primary ways, each of which has advantages and limitations.
The first way is the WebXPRT results viewer, which includes hundreds of PT-curated performance scores from a wide range of trusted sources and devices. Users can sort entries by device type, device name, device model, overall score, date of publication, and source. The viewer also includes a free-form filter for quick, targeted searches. While the results viewer contains a wealth of information, it does not offer graphs or charts for viewing and comparing multiple results at once. Another limitation of the current results viewer is that it offers no easy way to access the additional test-device data and subtest scores that we have for many entries.
The second way to view WebXPRT results on our site is the WebXPRT Processor Comparison Chart. The chart uses horizontal bar graphs to compare test scores from the hundreds of published results in our database, grouped by processor type. Users can click the average score for a processor to view all the WebXPRT results we currently have for that processor. The visual aspect of the chart and its automated “group by processor type” feature are very useful, but the chart lacks the sorting and filtering capabilities of the viewer, and navigating to the details of individual tests takes multiple clicks.
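As a rough illustration of what that automated grouping amounts to in data terms, here’s a minimal TypeScript sketch that groups results by processor and computes each group’s average overall score. The record shape and field names are hypothetical; this is not the chart’s actual implementation.

```typescript
// Hypothetical published-result record; field names are illustrative only.
interface PublishedResult {
  processor: string;      // e.g., "Intel Core i5-1135G7"
  overallScore: number;
}

// Group results by processor and compute the average overall score per
// group, mirroring the "group by processor type" idea conceptually.
function averageScoreByProcessor(results: PublishedResult[]): Map<string, number> {
  const sums = new Map<string, { total: number; count: number }>();
  for (const r of results) {
    const entry = sums.get(r.processor) ?? { total: 0, count: 0 };
    entry.total += r.overallScore;
    entry.count += 1;
    sums.set(r.processor, entry);
  }
  const averages = new Map<string, number>();
  for (const [processor, { total, count }] of sums) {
    averages.set(processor, total / count);
  }
  return averages;
}
```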
In the coming months, we’ll be working to combine the best features of the results viewer and the comparison chart into a single powerful WebXPRT results database tool. We’ll also be investigating ways to add new visual aids, navigation controls, and data-handling capabilities to that tool. We want to provide a tool that helps testers and analysts access the wealth of WebXPRT test information in our database in an efficient, productive, and enjoyable way. If you have ideas or comments about what you’d like to see in a new WebXPRT results viewing tool, please let us know!
We recently published a set of CloudXPRT Data Analytics and Web Microservices workload test results submitted by Quanta Computer, Inc. The Quanta submission is the first set of CloudXPRT results that we’ve published using the formal results submission and approval process. We’re grateful to the Quanta team for carefully following the submission guidelines, enabling us to complete the review process without a hitch.
If you are unfamiliar with the process, you can find general information about how we review submissions in a previous blog post. Detailed, step-by-step instructions are available on the results submission page.
As a reminder for testers who are considering submitting results for July, the submission deadline is tomorrow, Friday July 16, and the publication date is Friday July 30. We list the submission and publication dates for the rest of 2021 below. Please note that we do not plan to review submissions in December, so if we receive results submissions after November 30, we may not publish them until the end of January 2022.
August
Submission deadline: Tuesday 8/17/21
Publication date: Tuesday 8/31/21
September
Submission deadline: Thursday 9/16/21
Publication date: Thursday 9/30/21
October
Submission deadline: Friday 10/15/21
Publication date: Friday 10/29/21
November
Submission deadline: Tuesday 11/16/21
Publication date: Tuesday 11/30/21
December
Submission deadline: N/A
Publication date: N/A
If you have any questions about the CloudXPRT results submission, review, or publication process, please let us know!
It’s been a while since we last discussed the process for submitting WebXPRT results to be considered for publication in the WebXPRT results browser and the WebXPRT Processor Comparison Chart, so we thought we’d offer a refresher.
Unlike sites that publish all results they receive, we hand-select results from internal lab testing, user submissions, and reliable tech media sources. In each case, we evaluate whether the score is consistent with general expectations. For sources outside of our lab, that evaluation includes confirming that there is enough detailed system information to help us determine whether the score makes sense. We do this for every score on the WebXPRT results page and the general XPRT results page. All WebXPRT results we publish automatically appear in the processor comparison chart as well.
Submitting your score is quick and easy. At the end of the WebXPRT test run, click the Submit your results button below the overall score, complete the short submission form, and click Submit again. The screenshot below shows how the form would look if I submitted a score at the end of a WebXPRT 3 run on my personal system.
After you submit your score, we’ll contact you to confirm how we should display the source. You can choose one of the following:
Your first and last name
“Independent tester” (for those who wish to remain anonymous)
Your company’s name, provided that you have permission to submit the result in its name. To use a company name, we ask that you provide a valid company email address.
We will not publish any additional information about you or your company without your permission.
We look forward to seeing your score submissions, and if you have suggestions for the processor chart or any other aspect of the XPRTs, let us know!
We recently received questions about whether we accept CloudXPRT results submissions from testing on pre-production gear, and how we would handle any differences between results from pre-production and production-level tests.
To answer the first question, we are not opposed to pre-production results submissions. We realize that vendors often want to include benchmark results in launch-oriented marketing materials they release before their hardware or software is publicly available. To help them do so, we’re happy to consider pre-production submissions on a case-by-case basis. All such submissions must follow the normal CloudXPRT results submission process and undergo vetting by the CloudXPRT Results Review Group according to the standard review and publication schedule. If we decide to publish pre-production results on our site, we will clearly note their pre-production status.
In response to the second question, the CloudXPRT Results Review Group will handle any challenges to published results or perceived discrepancies between pre-production and production-level results on a case-by-case basis. We do not currently have a formal process for challenges; anyone who would like to initiate a challenge or express comments or concerns about a result should address the review group via benchmarkxprtsupport@principledtechnologies.com. Our primary concern is always to ensure that published results accurately reflect the performance characteristics of production-level hardware and software. If it becomes necessary to develop more policies in the future, we’ll do so, but we want to keep things as simple as possible.
If you have any questions about the CloudXPRT results submission process, please let us know!