
Best practices

Recently, a tester wrote in and asked for help determining why they were seeing different WebXPRT scores on two tablets with the same hardware configuration. The scores differed by approximately 7.5 percent. Differences like this can arise for many reasons, including different software stacks, but score variability can also result from differences in testing behavior and environment. While some degree of variability is natural, the question gives us a great opportunity to talk about the basic benchmarking practices we follow in the XPRT lab, practices that help produce the most consistent and reliable scores.

Below, we list a few basic best practices you might find useful in your testing. While we frame them in terms of WebXPRT’s focus on evaluating browser performance, several of these practices apply to other benchmarks as well.

  • Test with clean images: We use an out-of-box (OOB) method for testing XPRT Spotlight devices. OOB testing means that, other than the initial OS and browser updates that users are likely to run after first turning on the device, we change as little as possible before testing. We want to assess the performance buyers are likely to see when they first purchase the device, before they install additional apps and utilities. While the OOB approach is not appropriate for every type of testing, the key is to avoid testing a device that’s bogged down with programs that unnecessarily influence results.
  • Turn off updates: We do our best to eliminate or minimize app and system updates after initial setup. Some vendors are making it more difficult to turn off updates completely, but you should always check and account for update settings.
  • Get a feel for system processes: Depending on the system and the OS, a lot of system-level activity can be going on in the background after you turn on a device. As much as possible, we like to wait for system activity to settle to a stable, idle baseline before kicking off a test. If we start testing immediately after booting the system, we often see higher variability in the first run before the scores start to tighten up.
  • Disclosure is not just about hardware: Most people know that different browsers produce different performance scores on the same system. However, testers aren’t always aware of performance shifts between different versions of the same browser. While most updates don’t have a large impact on performance, a few have increased (or even decreased) browser performance by a significant amount. For this reason, it’s always worthwhile to record and disclose the extended browser version number for each test run. The same principle applies to any other relevant software.
  • Use more than one data point: Because of natural variability, our standard practice in the XPRT lab is to publish a score that represents the median of at least three to five runs. If you run a benchmark only once and your score differs significantly from other published scores, your result could be an outlier that you would not see again under stable testing conditions. (The sketch below shows one simple way to check.)
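
To illustrate that last point, here’s a minimal sketch in TypeScript of how you might compute the median of several runs and flag any score that strays far from it. The scores and the 5 percent threshold are hypothetical, purely for illustration; they aren’t part of any official XPRT methodology.

```typescript
// Compute the median of a set of benchmark scores and flag possible outliers.
// The runs array and the 5% threshold below are hypothetical examples.
function median(scores: number[]): number {
  const sorted = [...scores].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 !== 0
    ? sorted[mid]
    : (sorted[mid - 1] + sorted[mid]) / 2;
}

const runs: number[] = [212, 218, 215, 229, 214]; // hypothetical WebXPRT scores
const med = median(runs);
console.log(`Median of ${runs.length} runs: ${med}`);

for (const score of runs) {
  // Flag any run more than 5% from the median as a candidate for retesting.
  if (Math.abs(score - med) / med > 0.05) {
    console.log(`Possible outlier: ${score}`);
  }
}
```

If a run gets flagged, it’s worth rebooting, waiting for an idle baseline, and testing again before you publish a number.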


We hope those tips will make testing a little easier for you. If you have any questions about the XPRTs, or about benchmarking in general, feel free to ask!

Justin

MobileXPRT: evaluate the performance of your Android device

We recently discussed the capabilities and benefits of TouchXPRT, CrXPRT, BatteryXPRT, and HDXPRT. This week, we’re focusing on MobileXPRT, an app that evaluates how well an Android device handles everyday tasks. Like the other XPRT family benchmarks, MobileXPRT is easy to use. It takes less than 15 minutes to complete on most devices, uses relatable workloads, and delivers reliable, objective, and easy-to-understand results.

MobileXPRT includes five performance scenarios (Apply Photo Effects, Create Photo Collages, Create Slideshow, Encrypt Personal Content, and Detect Faces to Organize Photos). By default, the benchmark runs all five tasks and reports individual workload scores and an overall performance score.

MobileXPRT 2015 is the latest version of the app, supporting both 32-bit and 64-bit hardware running Android 4.4 or higher. To test systems running older versions of Android, or to test 32-bit performance on a 64-bit system, you can use MobileXPRT 2013. The results of the two versions are comparable.

MobileXPRT is a useful tool for anyone who wants to compare the performance capabilities of Android phones or tablets. To see test results from a variety of systems, go to MobileXPRT.com and click View Results, where you’ll find scores from many different Android devices.

If you’d like to run MobileXPRT:

Simply download MobileXPRT from MobileXPRT.com or the Google Play Store. The full installer package on MobileXPRT.com, which contains both the app and the test data, is 243 MB. You can also download the 18 MB MobileXPRT app file, which downloads the test data during installation. The MobileXPRT user manual provides instructions for configuring your device and kicking off a test.

If you’d like to dig into the details:

Check out the Exploring MobileXPRT 2015 white paper. In it, we discuss the MobileXPRT development process and details of the individual performance scenarios. We also explain exactly how the benchmark calculates results.

If you’d like to dig even deeper, the MobileXPRT source code is available to members of the BenchmarkXPRT Development Community, so consider joining today. Membership is free for anyone from a company or organization with an interest in benchmarks, and there are no obligations after joining.

If you haven’t used MobileXPRT before, give it a shot and let us know what you think!

Justin

XPRTs around the world

This week, we posted an updated version of our “XPRTs around the world” infographic. From time to time, we like to give readers a broader view of the impact that the XPRTs are having around the world, and the infographic shows just how far and wide the XPRTs’ reach has grown.

Here are some numbers from the latest update:

  • The XPRTs have been mentioned more than 7,800 times on over 2,500 unique sites.
  • Those mentions include more than 6,800 articles and reviews.
  • Those mentions originated in over 400 cities located in 58 countries on six continents. If you’re a tech reviewer based in Antarctica, we’re counting on you to help us make it a clean sweep!
  • The BenchmarkXPRT Development Community now includes 203 members from 73 companies and organizations around the world.


In addition to the reach numbers, we’re excited that the XPRTs have now delivered more than 250,000 real-world results!

If you’re familiar with the run counter on WebXPRT.com, you may have noticed that the WebXPRT run tally is rising quickly. Starting with the original release of WebXPRT in early 2013, it took more than three and a half years for the combined run tally of WebXPRT 2013 and WebXPRT 2015 to reach 100,000. In the nine months since then, users have added another 60,000 runs. That’s a jump from roughly 2,400 runs per month to nearly 6,700, so the pace is picking up significantly!

We’re grateful to everyone who’s helped us get this far. Here’s to another quarter-million runs and downloads!

Justin

Another pronunciation lesson

Knowing how to pronounce the terms we read online can be a bit of a mystery. For example, it’s been 30 years since CompuServe created the GIF format, and people are still arguing about how to say it.

A couple of months ago, we talked about how to pronounce WebXPRT. In the video we pointed to, the narrator admits he’s unsure how to say “XPRT.” For the record, it’s pronounced “expert.”

Recently, we came across another video, this one referring to CrXPRT. The narrator pronounced it “Chrome expert.” The “expert” part is correct, but the “Chrome” part is not. It’s an understandable mistake, because Cr is the chemical symbol for chromium. That’s why we chose it! However, we pronounce the C and R individually, so the name is pronounced “C-R expert.”

All that being said, it was great to see CrXPRT in the classroom! When we created CrXPRT, the education market was a big consideration, as you can tell from this CrXPRT video. We love seeing the XPRTs in the real world!

Eric

Thinking ahead to WebXPRT 2017

A few months ago, Bill discussed our intention to update WebXPRT this year. Today, we want to share some initial ideas for WebXPRT 2017 and ask for your input.

Updates to the workloads provide an opportunity to increase the relevance and value of WebXPRT in the years to come. Here are a few of the ideas we’re considering:

  • For the Photo Enhancement workload, we can increase the sizes of the photos the workload processes. We can also experiment with additional types of photo enhancement, such as background/foreground subtraction, collage creation, or panoramic/360-degree image viewing.
  • For the Organize Album workload, we can explore machine learning by incorporating open-source JavaScript libraries into web-based inferencing tests (see the sketch after this list).
  • For the Local Notes workload, we’re investigating the possibility of leveraging natural-brain libraries for language processing functions.
  • For a new workload, we’re investigating the possibility of using online 3D modeling applications such as Tinkercad.
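
To make the inferencing idea more concrete, here’s a hedged sketch of what a browser-based inferencing timing loop might look like, written in TypeScript against TensorFlow.js, one example of the kind of open-source JavaScript library we mention above. The model URL, input size, and preprocessing steps are placeholder assumptions for illustration; they don’t describe an actual WebXPRT workload.

```typescript
// A sketch of timing a single in-browser inference with TensorFlow.js.
// The model URL and 224x224 input shape are hypothetical placeholders.
import * as tf from '@tensorflow/tfjs';

async function timeInference(modelUrl: string, image: HTMLImageElement): Promise<number> {
  const model = await tf.loadLayersModel(modelUrl);

  // Convert the image to a normalized batch-of-one tensor.
  const input = tf.tidy(() =>
    tf.browser.fromPixels(image)
      .resizeBilinear([224, 224])
      .expandDims(0)
      .toFloat()
      .div(255)
  );

  const start = performance.now();
  const output = model.predict(input) as tf.Tensor;
  await output.data(); // wait for the computation to finish so the timing is real
  const elapsed = performance.now() - start;

  input.dispose();
  output.dispose();
  return elapsed; // milliseconds for one inference pass
}
```

A real workload would also need warm-up passes and multiple timed iterations, for the same variability reasons we discussed in the best-practices post above.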

For the UI, we’re considering improvements to features like the in-test progress bars and individual subtest selection. We’re also planning to update the UI to make it visually distinct from older versions.

Throughout this process, we want to be careful to maintain the features that have made WebXPRT our most popular tool, with more than 141,000 runs to date. We’re committed to making sure that it runs quickly and simply in most browsers and produces results that are useful for comparing web browsing performance across a wide variety of devices.

Do you have feedback on these ideas or suggestions for browser technologies or test scenarios that we should consider for WebXPRT 2017? Are there existing features we should ditch? Are there elements of the UI that you find especially useful or would like to see improved? Please let us know. We want to hear from you and make sure that we’re crafting a performance tool that continues to meet your needs.

Justin

Running Android-oriented XPRTs on Chrome OS

Since last summer, we’ve been following Google’s progress in bringing Android apps and the Google Play store to Chromebooks, along with its plan to gradually phase out support for Chrome apps over the next few years. Because we currently offer apps that assess battery life and performance for Android devices (BatteryXPRT and MobileXPRT) and Chromebooks (CrXPRT), the way this situation unfolds could affect the makeup of the XPRT portfolio in the years to come.

For now, we’re experimenting to see how well the Android app/Chrome OS merger is working with the devices in our lab. One test case is the Samsung Chromebook Plus, which we featured in the XPRT Weekly Tech Spotlight a few weeks ago. Normally, we would publish only CrXPRT and WebXPRT results for a Chromebook, but installing and running MobileXPRT 2015 from the Google Play store was such a smooth and error-free process that we decided to publish the first MobileXPRT score for a device running Chrome OS.

We also tried running BatteryXPRT on the Chromebook Plus, but even though installation was quick and easy and the test kicked off without a hitch, we could not generate a valid result. Typically, the test would complete several iterations successfully but terminate before producing a final result. We’re investigating the problem and will keep the community up to date on what we find.

In the meantime, we continue to recommend that Chromebook testers use CrXPRT for performance and battery life assessment. While we haven’t encountered any issues running MobileXPRT 2015 on Chromebooks, CrXPRT has a proven track record.

If you have any questions about running Android-oriented XPRTs on Chrome OS, or insights that you’d like to share, please let us know.

Justin
