

The versatility of XPRT benchmarks

We’ve designed each of the XPRT benchmarks to assess the performance of specific types of devices in scenarios that mirror the ways consumers typically use those devices. While most XPRT benchmark users are interested in producing official overall scores, some members of the tech press have been using the XPRTs in unconventional, creative ways.

One example is the use of WebXPRT by Tweakers, a popular tech review site based in the Netherlands. (The site is in Dutch, so the Google Translate extension in Chrome was helpful for me.) When Tweakers uses WebXPRT to evaluate consumer hardware, they also measure each device's sound output during the test. Tweakers then publishes the LAeq metric (the A-weighted equivalent continuous sound level) for each device, giving readers a sense of how loud a system is, on average, while it performs common browser tasks.
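For readers unfamiliar with the metric, LAeq averages the acoustic energy of a fluctuating sound signal over the measurement period and reports the result as a single decibel value. The short Python sketch below is purely illustrative and is not part of Tweakers' methodology; it simply shows how an LAeq value can be computed from periodic A-weighted sound-level samples.

```python
import math

def laeq(levels_dba):
    """Compute LAeq from A-weighted sound-level samples (in dBA) taken at
    equal time intervals: convert each decibel reading back to relative
    acoustic energy, average the energies, and convert back to decibels."""
    if not levels_dba:
        raise ValueError("need at least one sample")
    mean_energy = sum(10 ** (level / 10) for level in levels_dba) / len(levels_dba)
    return 10 * math.log10(mean_energy)

# Example: hypothetical readings logged once per second during a benchmark run
samples = [32.1, 33.4, 35.0, 41.2, 44.8, 43.5, 38.9, 34.0]
print(f"LAeq over the run: {laeq(samples):.1f} dBA")
```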

If you’re interested in seeing Tweakers’ use of WebXPRT for sound output testing firsthand, check out their Apple MacBook Pro M2, HP Envy 34 All-in-One, and Samsung Galaxy Book 2 Pro reviews.

Other labs and tech publications have used the XPRTs in unusual ways as well, such as automating the benchmarks to run during screen burn-in tests or custom battery-life rundowns. If you’ve used any of the XPRT benchmarks in creative ways, please let us know! We are interested in learning more about your tests, and your experiences may provide helpful information that we can share with other XPRT users.

Justin

The WebXPRT 4 results calculation white paper is now available

Last week, we published the Exploring WebXPRT 4 white paper. The paper describes the design and structure of WebXPRT 4, including detailed information about the benchmark’s harness, HTML5 and WebAssembly capability checks, and the structure of the performance test workloads. This week, to help WebXPRT 4 testers understand how the benchmark calculates results, we’ve published the WebXPRT 4 results calculation and confidence interval white paper.

The white paper explains the WebXPRT 4 confidence interval, how it differs from typical benchmark variability, and the formulas the benchmark uses to calculate the individual workload scenario scores and the overall score. The paper also provides an overview of the statistical techniques WebXPRT uses to translate raw timings into scores.

To supplement the white paper’s discussion of the results calculation process, we’ve also published a results calculation spreadsheet that shows the raw data from a sample test run and reproduces the calculations WebXPRT uses to produce workload scores and the overall score.
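To give a concrete sense of the general shape of that process, here is a minimal Python sketch. It is not WebXPRT's actual code: the workload names, calibration times, scaling constant, geometric-mean combination, and normal-approximation confidence interval are all placeholder assumptions, and the authoritative formulas and constants are the ones in the white paper and the spreadsheet.

```python
import math
import statistics

# Hypothetical calibration times (in seconds) for six placeholder workloads.
CALIBRATION_TIMES = {
    "workload_a": 2.0,
    "workload_b": 3.5,
    "workload_c": 1.5,
    "workload_d": 2.5,
    "workload_e": 1.8,
    "workload_f": 4.0,
}
SCALE = 100  # placeholder scaling constant

def workload_score(name, raw_times):
    """Convert a workload's raw iteration times into a subscore; finishing
    faster than the hypothetical calibration system yields a higher score."""
    mean_time = statistics.mean(raw_times)
    return SCALE * CALIBRATION_TIMES[name] / mean_time

def overall_score(subscores):
    """Combine subscores with a geometric mean so that no single workload
    dominates the overall result."""
    return math.exp(statistics.fmean(math.log(s) for s in subscores))

def confidence_interval(per_iteration_scores, z=1.96):
    """Approximate 95% confidence interval for the overall score, assuming
    per-iteration scores are roughly normally distributed (an assumption made
    for this sketch only)."""
    mean = statistics.mean(per_iteration_scores)
    half_width = z * statistics.stdev(per_iteration_scores) / math.sqrt(len(per_iteration_scores))
    return mean - half_width, mean + half_width

# Example: hypothetical raw timings (seconds) from one test run
raw_timings = {
    "workload_a": [1.9, 2.1, 2.0],
    "workload_b": [3.2, 3.3, 3.1],
    "workload_c": [1.4, 1.5, 1.4],
    "workload_d": [2.6, 2.4, 2.5],
    "workload_e": [1.7, 1.8, 1.7],
    "workload_f": [3.8, 3.9, 4.1],
}
subscores = [workload_score(name, times) for name, times in raw_timings.items()]
print(f"Overall score: {overall_score(subscores):.0f}")
```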

The paper is available on WebXPRT.com and on our XPRT white papers page. If you have any questions about the WebXPRT results calculation process, please let us know!

Justin

The XPRTs can help with your back-to-school shopping

The new school year is upon us, and learners of all ages are looking for tech devices that have the capabilities they will need in the coming year. The tech marketplace can be confusing, and competing claims can be hard to navigate. The XPRTs are here to help! Whether you’re shopping for a new phone, tablet, Chromebook, laptop, or desktop, the XPRTs can provide reliable, industry-trusted performance scores that can cut through all the noise.

A good place to start looking for scores is the WebXPRT 4 results viewer. The viewer displays WebXPRT 4 scores from over 175 devices—including many hot new releases—and we’re adding new scores all the time. To learn more about the viewer’s capabilities and how you can use it to compare devices, check out this blog post.

Another resource we offer is the XPRT results browser. The browser is the most efficient way to access the XPRT results database, which currently holds more than 3,000 test results from over 120 sources, including major tech review publications around the world, OEMs, and independent testers. It offers a wealth of current and historical performance data across all of the XPRT benchmarks and hundreds of devices. You can read more about how to use the results browser here.

Also, if you’re considering a popular device, chances are good that a recent tech review includes an XPRT score for that device. Two quick ways to find these reviews: (1) go to your favorite tech review site and search for “XPRT” and (2) go to a search engine and enter the device name and XPRT name (e.g., “Apple MacBook Air” and “WebXPRT”). Here are a few recent tech reviews that use one of the XPRTs to evaluate a popular device:

The XPRTs can help consumers make better-informed and more confident tech purchases. As this school year begins, we hope you’ll find the data you need on our site or in an XPRT-related tech review. If you have any questions about the XPRTs, XPRT scores, or the results database, please feel free to ask!

Justin

We’ve moved WebXPRT Singapore to a new hosting environment

When we first released WebXPRT 2013, some users in mainland China reported slow download times when running the benchmark. In response, we set up a mirror host site in Singapore to facilitate WebXPRT testing in China and other East Asian countries. We continued this practice with subsequent WebXPRT versions, and currently offer Singapore-based instances of WebXPRT 4, WebXPRT 3, and WebXPRT 2015.

Until this past month, we used an Amazon Web Services (AWS) EC2-Classic environment to host the Singapore mirror site. Because Amazon retired the EC2-Classic environment, we had to migrate each of the WebXPRT Singapore instances to a new AWS Virtual Private Cloud (VPC) environment.  

We do not expect the new environment to affect WebXPRT Singapore testing or results, and have not yet observed any significant differences in WebXPRT performance scores while testing on the new site. If you have a different experience when testing on the new site or encounter interruptions when trying to access the test, please let us know!

Justin

WebXPRT passes the million-run milestone!

We’re excited to see that users have successfully completed over 1,000,000 WebXPRT runs! If you’ve run WebXPRT in any of the 924 cities and 81 countries from which we’ve received complete test data, including newcomers Bahrain, Bangladesh, Mauritius, the Philippines, and South Korea, we’re grateful for your help. We could not have reached this milestone without you!

As the chart below illustrates, WebXPRT use has grown steadily since the debut of WebXPRT 2013. On average, we now record more WebXPRT runs in one month than we recorded in the entirety of our first year. With over 104,000 runs so far in 2022, that growth is continuing.

For us, this moment represents more than a numerical milestone. Developing and maintaining a benchmark is never easy, and a cross-platform benchmark that will run on a wide variety of devices poses an additional set of challenges. For such a benchmark to succeed, developers need not only technical competence but also the trust and support of the benchmarking community. WebXPRT is now in its ninth year, and its consistent year-over-year growth tells us that the benchmark continues to hold value for manufacturers, OEM labs, the tech press, and end users like you. We see it as a sign of trust that folks repeatedly return to the benchmark for reliable performance metrics. We’re grateful for that trust, and for everyone who has contributed to the WebXPRT development process over the years.

We’ll have more to share related to this exciting milestone in the weeks to come, so stay tuned to the blog. If you have any questions or comments about WebXPRT, we’d love to hear from you!

Justin

Helpful tips for WebXPRT 4 results submission

Back in March, we discussed the WebXPRT 4 results submission process and reminded readers that everyone who runs a WebXPRT 4 test is welcome to submit scores for us to consider for publication in the WebXPRT 4 results viewer. Unlike sites that publish every result that users submit, we publish only results that meet our evaluation criteria. Among other things, scores must be consistent with general expectations and must include enough detailed system information to help us assess whether individual scores represent valid test runs. Today, we offer a couple of tips to increase the likelihood that we will publish your WebXPRT 4 test results.

Tip 1: Specify your system’s processor

While testers usually include detailed information for the device, model number, operating system, and browser version fields, we receive many submissions with little to no information about the test system’s processor.

In the picture below, you can see an example of the level of detail that we require to consider a submission. We need the full processor name, including the manufacturer and model number (e.g., Intel Core i9-9980HK, AMD Ryzen 3 1300X, or Apple M1 Max). Note that we do not require the processor speed reported by the system.

Tip 2: Include a valid email address

It is also common for submissions to omit a valid email address. While we understand the privacy concerns related to submitting a personal or corporate email address, we need a valid address that we can use as a point of contact to confirm test-related information when necessary. We don’t use those addresses for any other purpose: we never sell them, share them with third parties, or add them to a mailing list.

We hope this information explains why we might not have published your results. We look forward to receiving your future score submissions. If you have any questions about the submission process, please let us know!

Justin
