
Category: Performance of computing devices

XPRT mentions in the tech press

One of the ways we monitor the effectiveness of the XPRT family of benchmarks is to regularly track XPRT usage and reach in the global tech press. Many tech journalists invest a lot of time and effort into producing thorough device reviews, and relevant and reliable benchmarks such as the XPRTs often serve as indispensable parts of a reviewer’s toolkit. Trust is hard-earned and easily lost in the benchmarking community, so we’re happy when our benchmarks consistently achieve “go-to” status for a growing number of tech assessment professionals around the world.

Because some of our newer readers may be unaware of the wide variety of outlets that regularly use the XPRTs, we occasionally like to share an overview of recent XPRT-related tech press activity. For today’s blog, we want to give readers a sampling of the press mentions we’ve seen over the past few months.

Recent mentions include:

Each month, we send out a BenchmarkXPRT Development Community newsletter that contains the latest updates from the XPRT world and provides a summary of the previous month’s XPRT-related activity, including new mentions of the XPRTs in the tech press. If you don’t currently receive the monthly BenchmarkXPRT newsletter but would like to join the mailing list, please let us know! There is no cost to join, and we will not publish or sell any of the contact information you provide. We will send only the monthly newsletter and occasional benchmark-related announcements, such as news about patches or new releases.

Justin

WebXPRT benchmarking tips from the XPRT lab

Occasionally, we receive inquiries from XPRT users asking for help determining why two systems with the same hardware configuration are producing significantly different WebXPRT scores. This can happen for many reasons, including different software stacks, but score variability can also result from different testing behaviors and environments. While some degree of variability is normal, these types of questions provide us with an opportunity to talk about some of the basic benchmarking practices we follow in the XPRT lab to produce the most consistent and reliable scores.

Below, we list a few basic best practices you might find useful in your testing. Most of them relate to evaluating browser performance with WebXPRT, but several of these practices apply to other benchmarks as well.

  • Hardware is not the only important factor: Most people know that different browsers produce different performance scores on the same system. Testers are not, however, always aware of shifts in performance between different versions of the same browser. While most updates don’t have a large impact on performance, a few updates have increased (or even decreased) browser performance by a significant amount. For this reason, it’s always important to record and disclose the extended browser version number for each test run. The same principle applies to any other relevant software.
  • Keep a thorough record of system information: We record detailed information about a test system’s key hardware and software components, including full model and version numbers. This information is not only important for later disclosure if we choose to publish a result; it can also help pinpoint system differences that explain why two seemingly identical devices produce very different scores. We also want people to be able to reproduce our results as closely as possible, so we record and disclose more detail than you’ll find in some tech articles and product reviews. (For one way to capture this information automatically, see the sketch after this list.)
  • Test with clean images: We typically use an out-of-box (OOB) method for testing new devices in the XPRT lab. Other than running the initial OS and browser updates that users are likely to run after first turning on the device, we change as little as possible before testing. This approach assesses the performance that retail buyers are likely to experience when they first use a new device, before they install additional software. That said, the OOB method is not appropriate for certain types of testing, such as when you want to compare system images that are as close to identical as possible, or when you want to remove as much pre-loaded software as possible.
  • Turn off automatic updates: We do our best to eliminate or minimize app and system updates after initial setup. Some vendors are making it more difficult to turn off updates completely, but you should always double-check update settings before testing.
  • Get a baseline for system processes: Depending on the system and the OS, a significant amount of system-level activity can go on in the background after you turn on a device. As much as possible, we wait for system activity to settle to a stable idle baseline before kicking off a test; the sketch after this list shows one way to automate that wait. If we start testing immediately after booting a system, we often see higher variance in the first run before the scores tighten up.
  • Use more than one data point: Because of natural variance, our standard practice in the XPRT lab is to publish the median score from three to five runs, if not more. If you run a benchmark only once and the score differs significantly from other published scores, your result could be an outlier that you would not see again under stable testing conditions or over the course of multiple runs.
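To make a couple of these tips concrete, here’s a minimal Python sketch of a pre-test helper script: it records basic system details you could disclose alongside a published score, then waits for CPU activity to settle to an idle baseline before you kick off a run. The psutil dependency, the five percent idle threshold, and the sampling scheme are illustrative assumptions on our part, not anything WebXPRT requires; you would still want to record the extended browser version from the browser’s About page.

```python
# pre_test_checklist.py - illustrative helper; the thresholds and sampling
# scheme below are our assumptions, not part of WebXPRT itself.
import platform

import psutil  # third-party package: pip install psutil

IDLE_THRESHOLD_PCT = 5.0   # assumed definition of "idle"; tune for your hardware
STABLE_SAMPLES = 6         # consecutive samples that must stay under the threshold
SAMPLE_INTERVAL_SEC = 10   # spacing between CPU-utilization samples

def record_system_info() -> dict:
    """Capture basic hardware/OS details to disclose alongside a published score."""
    return {
        "os": f"{platform.system()} {platform.release()} ({platform.version()})",
        "machine": platform.machine(),
        "processor": platform.processor(),
        "logical_cpus": psutil.cpu_count(logical=True),
        "ram_gb": round(psutil.virtual_memory().total / 1024**3, 1),
    }

def wait_for_idle_baseline() -> None:
    """Block until CPU utilization stays below the threshold for several samples."""
    calm = 0
    while calm < STABLE_SAMPLES:
        # cpu_percent(interval=N) blocks for N seconds and returns average usage.
        usage = psutil.cpu_percent(interval=SAMPLE_INTERVAL_SEC)
        calm = calm + 1 if usage < IDLE_THRESHOLD_PCT else 0
        print(f"CPU {usage:.1f}% ({calm}/{STABLE_SAMPLES} stable samples)")
    print("System looks idle; OK to start the benchmark run.")

if __name__ == "__main__":
    for key, value in record_system_info().items():
        print(f"{key}: {value}")
    wait_for_idle_baseline()
```

On most systems, a script like this prints a handful of samples before declaring the system idle; if it never settles, that in itself is useful evidence that background activity may be skewing your scores.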


We hope these tips will help make your testing more accurate. If you have any questions about WebXPRT, the other XPRTs, or benchmarking in general, feel free to ask!

Justin

XPRTs in the press

Each month, we send a newsletter to members of the BenchmarkXPRT Development Community. In the newsletter, we recap the latest updates from the XPRT world and provide a summary of the previous month’s XPRT-related activity, including uses or mentions of the XPRTs in the tech press. More people read the weekly XPRT blog than receive the monthly newsletter, so we realized that some blog readers may be unaware of the wide variety of tech outlets that regularly use or mention the XPRTs.

So for today’s blog, we want to give readers a sampling of the XPRT press usage we see on a weekly basis. Recent mentions include:

  • Tom’s Guide used HDXPRT 4 to compare the performance of the Geekom Mini IT8 and Dell OptiPlex 7090 Ultra small-form-factor PCs.
  • Intel used WebXPRT 4 test data in promotional material for its line of 12th Gen Intel Core processors (Alder Lake). Hundreds of press outlets then republished the presentation.
  • AnandTech used WebXPRT 4 to evaluate the Cincoze DS-1300 Industrial PC.
  • ZDNet used CrXPRT 2 in a review titled The best Chromebooks for students: Student-proof laptops.
  • PCWorld used CrXPRT 2 to provide data for an article listing their top Chromebook recommendations.
  • TechPowerUp used WebXPRT 3 to compare the browser performance of systems based on the Intel Core i9-12900KS processor with that of other Intel- and AMD-processor-based systems.
  • Other outlets that have published articles, ads, or reviews mentioning the XPRTs in the last few months include: Android Authority, ASUS, BenchLife, Gadgets 360, Good Gear Guide, Hardware.info, Hot Hardware, ITHardware (Poland), ITMedia (Japan), Itndaily (Russia), Mobile01.com (China), Notebookcheck, PCMag, ProClockers, Sohu.com (China), Tom’s Hardware, and Tweakers.

If you don’t currently receive the monthly BenchmarkXPRT newsletter, but would like to join the mailing list, please let us know! We will not publish or sell any of the contact information you provide, and will only send the monthly newsletter and occasional benchmark-related announcements such as patch notifications or new benchmark releases.

Justin

The WebXPRT 4 Preview is here!

We’re excited to announce that the WebXPRT 4 Preview is now available! Testers can access the Preview at www.WebXPRT4.com or through a link on WebXPRT.com. The Preview is available to everyone, and testers can now publish scores from Preview build testing. We may still tweak a few things, but we will limit any changes between the Preview and the final release to UI elements and features that we do not expect to affect test scores.

Longtime WebXPRT users will notice that the WebXPRT 4 Preview has a new, but familiar, UI. The general process for kicking off both manual and automated tests is the same as with WebXPRT 3, so the transition from WebXPRT 3 to WebXPRT 4 testing should be straightforward. We encourage everyone to visit the XPRT blog for more details about what’s new in this Preview release.
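If you script your runs with a browser-automation tool, kicking off the WebXPRT 4 Preview can look something like the minimal Python and Selenium sketch below. To be clear, this is our own illustrative sketch, not an official harness, and the START_BUTTON_SELECTOR value is a hypothetical placeholder; inspect the live page (or consult the WebXPRT automation documentation) for the actual start-control element and any supported automation parameters.

```python
# run_webxprt_preview.py - a minimal automation sketch, not an official harness.
# The CSS selector below is a hypothetical placeholder; inspect the live page
# for the real start-control element before relying on this.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

START_BUTTON_SELECTOR = "#startButton"  # placeholder; verify against the page

driver = webdriver.Chrome()  # use the browser you want to benchmark
try:
    driver.get("https://www.webxprt4.com")
    # Wait for the page to load and the start control to become clickable.
    start = WebDriverWait(driver, 60).until(
        EC.element_to_be_clickable((By.CSS_SELECTOR, START_BUTTON_SELECTOR))
    )
    start.click()
    # A full WebXPRT run takes several minutes; leave the system undisturbed,
    # then record the score and the exact browser version when it completes.
    print("Benchmark started.")
finally:
    input("Press Enter after the run finishes to close the browser...")
    driver.quit()
```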

In addition, keep your eye on the blog for more details about the all-new WebXPRT 4 results viewer, which we expect to publish in the very near future. We think WebXPRT testers will enjoy using the viewer to explore our WebXPRT 4 test data!

After you try the WebXPRT 4 Preview, please send us your comments. Thanks and happy testing!

Justin

It’s time to shop for the holidays, and the XPRTs are here to help!

The holiday season is fast approaching, and with widespread product shortages and supply chain interruptions in the tech industry, it’s wise to start your holiday shopping now. If you’re considering phones, tablets, Chromebooks, or laptops as gifts, but are unsure where to get reliable information about the devices, the XPRTs can help!

One of the core functions of the XPRTs is to cut through the marketing noise by providing objective, reliable measures of a device’s performance. For example, instead of trying to guess whether a new Chromebook is fast enough to handle the demands of remote learning, you can use its CrXPRT and WebXPRT performance scores to see how it stacks up against the competition on everyday tasks.

A good place to start looking for device scores is our XPRT results browser, which lets you access our database of more than 2,800 test results from over 110 sources, including major tech review publications around the world, OEMs, and independent testers. You can find a wealth of current and historical performance data across all the XPRT benchmarks and hundreds of devices. Learn how to use the results browser here.

If you’re considering a popular device, chances are good that a recent tech review includes an XPRT score for that device. You can find these reviews by going to your favorite tech review site and searching for “XPRT,” or entering the name of the device and the appropriate XPRT (e.g., “Apple iPad” and “WebXPRT”) in a search engine. Here are a few recent tech reviews that use one or more of the XPRTs to evaluate popular devices:

The XPRTs can help consumers make better-informed and more confident tech purchases this holiday season, and we hope you’ll find the data you need on our site or in an XPRT-related tech review. If you have any questions about the XPRTs, XPRT scores, or the results database, please feel free to ask!

Justin

Using WebXPRT 3 to compare the performance of popular browsers (Round 3)

Last November, we published our second round of WebXPRT 3 browser performance comparisons. Nine months later, we decided it was time to see whether the performance rankings of popular browsers have changed.

For this round of tests, we used the same laptop as last time: a Dell XPS 13 7390 with an Intel Core i3-10110U processor and 4 GB of RAM, running Windows 10 Home updated to version 1909 (18363.1556). We installed all current Windows updates and tested on a clean system image. After the update process completed, we turned off updates to prevent them from interfering with test runs. We ran WebXPRT 3 three times each on five browsers: Brave, Google Chrome, Microsoft Edge, Mozilla Firefox, and Opera. For each browser, the score we report is the median of the three runs.
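For readers who want to replicate that last step, here’s a minimal Python sketch of one way to reduce several runs per browser to a single reported score. The run scores in the dictionary are made-up placeholders for illustration, not the results we measured.

```python
# median_scores.py - reduce multiple WebXPRT runs per browser to one
# reported score. The run scores below are made-up placeholders, NOT
# the results we measured for this article.
from statistics import median

runs_by_browser = {
    "Brave":   [231, 228, 233],
    "Chrome":  [247, 250, 246],
    "Edge":    [249, 247, 251],
    "Firefox": [262, 265, 260],
    "Opera":   [248, 245, 249],
}

for browser, scores in sorted(runs_by_browser.items()):
    print(f"{browser:8s} median of {len(scores)} runs: {median(scores)}")
```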

In our last round of tests, the four Chromium-based browsers (Brave, Chrome, Edge, and Opera) produced very close scores, though Brave’s score was about four percent lower. In this round, performance improved for all four Chromium-based browsers. Chrome, Edge, and Opera again produced very close scores, but Brave still lagged, this time by about seven percent.

Firefox separated itself from the pack with a much higher score and has been the clear winner in all three rounds of testing. During our second round of testing in November, every browser except Chrome saw slightly slower performance than in the first round. In these latest tests, all the Chromium-based browsers produced significantly higher scores than in the second round. When discussing browser performance, it’s important to remember that there are many possible reasons for these changes, including changes in browser overhead or in Windows itself, and most users may not notice the differences during everyday tasks.

Do these results mean that Mozilla Firefox will always provide you with a speedier web experience? As we noted in previous comparisons, a device with a higher WebXPRT score will probably feel faster during daily use than one with a lower score. For comparisons on the same system, however, the answer depends on several factors, such as the types of things you do on the web, how the extensions you’ve installed affect performance, how frequently the browsers issue updates and incorporate new web technologies, and how accurately each browser’s default installation settings reflect how you would set up that browser for your daily workflow.

In addition, browser speed can increase or decrease significantly after an update, only to swing back in the other direction shortly thereafter. OS-specific optimizations can also affect performance, such as with Edge on Windows 10 or Chrome on Chrome OS. All these variables are important to keep in mind when considering how browser performance comparison results translate to your everyday experience. Do you have insights you’d like to share from using WebXPRT to compare browser performance? Let us know!

Justin
