

Testing XPRT compatibility with Windows 11

Last week, Microsoft announced that the Windows 11 GA build will officially launch on Tuesday, October 5, earlier than the initial late-2021 estimate. The rollout will begin with select new laptops and with existing Windows 10 PCs that meet specific system requirements, though only some Windows 10 PCs will be eligible for the update right away. Through a phased Windows Update process, additional Windows 10 PCs will gain access to the update throughout the first half of 2022.

Between the phased Windows 11 rollout and the pledge Microsoft has made to continue Windows 10 support through October 2025, it will likely be a while before the majority of Windows users transition to the new version. We hope the transition period will go smoothly for the XPRTs. However, because we designed three of our benchmarks to run on Windows 10 (HDXPRT 4, TouchXPRT 2016, and AIXPRT), we might encounter compatibility issues with Windows 11.

Over the coming weeks, we’ll be testing HDXPRT 4, TouchXPRT 2016, and AIXPRT on beta versions of Windows 11, and we’ll test again after the GA launch. In addition to obvious compatibility issues and test failures, we’ll note any changes we need to make to our documentation to account for differences in the Windows 11 installation or test processes.

We hope that testers will be able to successfully use all three benchmarks on both OS versions throughout the transition process. If problems arise, we will keep our blog readers informed while exploring solutions. As always, we’re also open to feedback from the community, so if you are participating in the Windows Insider Program and have encountered Windows 11 beta compatibility issues with any of the Windows-focused XPRTs, please let us know!

Justin

Investigating the possibility of WebXPRT user accounts

One of our goals during the ongoing WebXPRT 4 development process is to be as responsive as possible to user feedback, and we want to emphasize that it’s not too late to send us your ideas. Until we finalize the details for each workload and complete the code work for the preview build, we still have quite a bit of flexibility around adding new features.

Just this week, a community member raised the possibility of a WebXPRT 4 feature that would enable user-specific test ID numbers or accounts. One possible implementation of the idea would allow a user to sign up for a WebXPRT test account as an individual or on behalf of their organization. The test accounts would be both free and optional; you could continue to run the benchmark without an account, but running it with an account would let you save and view your test history. Another implementation option we are considering would let users generate a permanent user ID number for themselves or their organization. They could then use that number to tag and search for their automated test runs in our database, without having to log into an account.
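
To make the second option more concrete, here is a minimal sketch, from a tester's point of view, of how a permanent ID might tag automated runs. It is illustrative only: WebXPRT does not currently offer test IDs, and the testid query parameter and ID format are assumptions we made for the example, not part of any existing or planned WebXPRT interface.

    # A minimal sketch of the permanent-test-ID idea, from a tester's point of view.
    # Illustrative only: WebXPRT does not currently support test IDs, and the
    # "testid" query parameter below is an assumption, not an existing feature.
    import uuid
    from urllib.parse import urlencode

    WEBXPRT_URL = "https://www.principledtechnologies.com/benchmarkxprt/webxprt/"

    def make_test_id() -> str:
        """Generate a permanent ID a tester could reuse for every automated run."""
        return uuid.uuid4().hex[:12]

    def tagged_run_url(test_id: str) -> str:
        """Attach the ID to an automated-run URL so those runs could be found later."""
        return f"{WEBXPRT_URL}?{urlencode({'testid': test_id})}"

    if __name__ == "__main__":
        my_id = make_test_id()
        print(f"Save this ID for future runs: {my_id}")
        print(tagged_run_url(my_id))

A random, UUID-based ID would avoid collisions without requiring a central sign-up step, which fits the "no account required" spirit of the second option.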

Our biggest question at the moment is whether our user base would be interested in WebXPRT user accounts or test IDs. If this concept piques your interest, or you have suggestions for implementation, please let us know!

Justin

Using WebXPRT 3 to compare the performance of popular browsers (Round 3)

It has been about nine months since we published our last WebXPRT 3 browser performance comparison in November, so we decided it was time to see whether the performance rankings of popular browsers have changed.

For this round of tests, we used the same laptop as last time: a Dell XPS 13 7390 with an Intel Core i3-10110U processor and 4 GB of RAM, running Windows 10 Home updated to version 1909 (18363.1556). We tested on a clean system image with all current Windows updates installed, and once the update process completed, we turned off updates to prevent them from interfering with test runs. We ran WebXPRT 3 three times each on five browsers: Brave, Google Chrome, Microsoft Edge, Mozilla Firefox, and Opera. The score we report for each browser is the median of its three runs.
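
For readers who automate similar comparisons, the sketch below shows one way to compute the median score per browser from a simple log of runs. It is not the tooling we used; the CSV file name and its browser,score layout are assumptions for the example, and you would fill the file with your own recorded scores.

    # Sketch: compute the median WebXPRT 3 score per browser from recorded runs.
    # The file name and its "browser,score" layout are assumptions; populate the
    # CSV with your own scores, one row per run (three rows per browser).
    import csv
    from collections import defaultdict
    from statistics import median

    def median_scores(csv_path: str) -> dict[str, float]:
        runs = defaultdict(list)
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):
                runs[row["browser"]].append(float(row["score"]))
        return {browser: median(scores) for browser, scores in runs.items()}

    if __name__ == "__main__":
        for browser, score in sorted(median_scores("webxprt3_runs.csv").items()):
            print(f"{browser}: {score:.0f}")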

In our last round of tests, the four Chromium-based browsers (Brave, Chrome, Edge, and Opera) produced very close scores, though we saw about a four percent lower score from Brave. In this round of testing, performance improved for all four of the Chromium-based browsers. Chrome, Edge, and Opera still produced very close scores, but Brave’s performance still lagged, this time by about seven percent.

Firefox separated itself from the pack with a much higher score and has been the clear winner in all three rounds of testing. During our second round of testing in November, every browser except Chrome saw slightly slower performance than in the first round. In these latest tests, all of the Chromium-based browsers produced significantly higher scores than in the second round. When discussing browser performance, it's important to remember that there are many possible explanations for these changes, including changes in browser overhead or in Windows itself, and that most users may not notice the difference during everyday tasks.

Do these results mean that Mozilla Firefox will always provide you with a speedier web experience? As we noted in previous comparisons, a device with a higher WebXPRT score will probably feel faster during daily use than one with a lower score. For comparisons on the same system, however, the answer depends on several factors, such as the types of things you do on the web, how the extensions you’ve installed affect performance, how frequently the browsers issue updates and incorporate new web technologies, and how accurately each browser’s default installation settings reflect how you would set up that browser for your daily workflow.

In addition, browser speed can increase or decrease significantly after an update, only to swing back in the other direction shortly thereafter. OS-specific optimizations can also affect performance, such as with Edge on Windows 10 or Chrome on Chrome OS. All these variables are important to keep in mind when considering how browser performance comparison results translate to your everyday experience. Do you have insights you’d like to share from using WebXPRT to compare browser performance? Let us know!

Justin

Improving WebXPRT-related tools and resources

As we move forward with the WebXPRT 4 development process, we’re also working on ways to enhance the value of WebXPRT beyond simply updating the benchmark. Our primary goal is to expand and improve the WebXPRT-related tools and resources we offer at WebXPRT.com, starting with a new results viewer.

Currently, users can view WebXPRT results on our site in two primary ways, each of which has advantages and limitations.

The first way is the WebXPRT results viewer, which includes hundreds of PT-curated performance scores from a wide range of trusted sources and devices. Users can sort entries by device type, device name, device model, overall score, date of publication, and source, and a free-form filter supports quick, targeted searches. While the results viewer contains a wealth of information, it does not offer graphs or charts for viewing and comparing multiple results at once, and it provides no easy way to access the additional device details and subtest scores that we have for many entries.

The second way to view WebXPRT results on our site is the WebXPRT Processor Comparison Chart. The chart uses horizontal bar graphs to compare test scores from the hundreds of published results in our database, grouped by processor type. Users can click the average score for a processor to view all the WebXPRT results we currently have for that processor. The visual aspect of the chart and its automated “group by processor type” feature are very useful, but it lacks the sorting and filtering capabilities of the viewer, and navigating to the details of individual tests takes multiple clicks.
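
To illustrate the kinds of operations we're describing, here is a rough sketch of the viewer's sort-and-filter behavior and the chart's group-by-processor averaging, expressed as plain Python over a list of result records. The field names are assumptions we made for the example, and the list is intentionally left empty; the real data lives in the entries in our database.

    # Rough illustration of the operations described above: free-form filtering and
    # sorting (results viewer) plus grouping by processor and averaging (comparison
    # chart). Field names are assumptions; the list is intentionally left empty.
    from collections import defaultdict
    from statistics import mean

    results = []  # e.g., {"device": ..., "device_type": ..., "processor": ..., "score": ..., "source": ..., "date": ...}

    def filter_results(records, text):
        """Free-form filter: keep records whose fields contain the search text."""
        text = text.lower()
        return [r for r in records if any(text in str(v).lower() for v in r.values())]

    def sort_by_score(records, descending=True):
        """Sort entries by overall score, as the results viewer can."""
        return sorted(records, key=lambda r: r["score"], reverse=descending)

    def average_by_processor(records):
        """Group scores by processor and average them, as the comparison chart does."""
        groups = defaultdict(list)
        for r in records:
            groups[r["processor"]].append(r["score"])
        return {cpu: mean(scores) for cpu, scores in groups.items()}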

In the coming months, we’ll be working to combine the best features of the results viewer and the comparison chart into a single powerful WebXPRT results database tool. We’ll also be investigating ways to add new visual aids, navigation controls, and data-handling capabilities to that tool. We want to provide a tool that helps testers and analysts access the wealth of WebXPRT test information in our database in an efficient, productive, and enjoyable way. If you have ideas or comments about what you’d like to see in a new WebXPRT results viewing tool, please let us know!

Justin

We welcome your CloudXPRT results!

We recently published a set of CloudXPRT Data Analytics and Web Microservices workload test results submitted by Quanta Computer, Inc. The Quanta submission is the first set of CloudXPRT results that we’ve published using the formal results submission and approval process. We’re grateful to the Quanta team for carefully following the submission guidelines, enabling us to complete the review process without a hitch.

If you are unfamiliar with the process, you can find general information about how we review submissions in a previous blog post. Detailed, step-by-step instructions are available on the results submission page. As a reminder for testers who are considering submitting results for July, the submission deadline is tomorrow, Friday, July 16, and the publication date is Friday, July 30. We list the submission and publication dates for the rest of 2021 below. Please note that we do not plan to review submissions in December, so if we receive results after November 30, we may not publish them until the end of January 2022.

August
  • Submission deadline: Tuesday 8/17/21
  • Publication date: Tuesday 8/31/21

September
  • Submission deadline: Thursday 9/16/21
  • Publication date: Thursday 9/30/21

October
  • Submission deadline: Friday 10/15/21
  • Publication date: Friday 10/29/21

November
  • Submission deadline: Tuesday 11/16/21
  • Publication date: Tuesday 11/30/21

December
  • Submission deadline: N/A
  • Publication date: N/A

If you have any questions about the CloudXPRT results submission, review, or publication process, please let us know!

Justin

Feedback from the WebXPRT 4 tech press survey

In early May, we sent a survey to members of the tech press who regularly use WebXPRT in articles and reviews. We asked for their thoughts on several aspects of WebXPRT, as well as what they’d like to see in the upcoming fourth version of the benchmark. We also published the survey questions here in the blog, and invited experienced WebXPRT testers to send their feedback as well. We received some good responses to the survey, and for the benefit of our readers, we’ve summarized some of the key comments and suggestions below.

  • One respondent stated that WebXPRT is demanding enough to test performance, but that if we want to simulate modern web usage, we should find the most up-to-date studies on common browser tasks and web technologies. This suggestion lines up with our intention to study the feasibility of adding a WebAssembly workload.
  • One respondent liked the fact that, unlike many other browser benchmarks, WebXPRT tests more than just JavaScript calculation speed.
  • One respondent suggested that we include a link to a WebXPRT white paper within the UI, or at least a guide describing what happens during each workload.
  • One respondent stated that they would like WebXPRT to automatically save a results file on the local test system.
  • One respondent said that WebXPRT has a relatively long runtime for a browser benchmark, and they would prefer that the runtime not increase in WebXPRT 4.
  • We had no direct calls for a battery life test, because many testers already have scripts and/or methodologies in place for battery testing, but one tester suggested adding the ability to loop the test so users can measure performance over varying lengths of time (we include a rough sketch of that idea after this list).
  • There were no requests to bring back any aspects of WebXPRT 2015 that we removed in WebXPRT 3.
  • There were no reports of significant connection issues when testing with WebXPRT.
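
As promised above, here is a rough sketch of the looping idea. It is not a WebXPRT feature and not a complete harness: it simply reopens the benchmark page on a schedule and logs when each pass begins. The interval is an assumed upper bound for one full run that you would adjust for your device, and scores still have to be collected from the browser or from WebXPRT.com.

    # Rough sketch of the "loop the test" suggestion; not a WebXPRT feature.
    # It reopens the benchmark page on a fixed schedule and logs each pass.
    # INTERVAL_SECONDS is an assumed upper bound for one full run; adjust it
    # for your device, and collect scores from the browser as usual.
    import time
    import webbrowser

    WEBXPRT_URL = "https://www.principledtechnologies.com/benchmarkxprt/webxprt/"
    PASSES = 5
    INTERVAL_SECONDS = 30 * 60

    for i in range(PASSES):
        print(f"Pass {i + 1} started at {time.strftime('%H:%M:%S')}")
        webbrowser.open(WEBXPRT_URL)  # the tester still starts the run in the browser
        time.sleep(INTERVAL_SECONDS)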

We greatly appreciate the members of the tech press who responded to the survey. We're still in the planning stages of WebXPRT 4, so there's still time for anyone to send comments or ideas to benchmarkxprtsupport@principledtechnologies.com. We look forward to hearing from you!

Justin
