Author Archives: Justin Greene

The XPRT Spotlight Back-to-School Roundup

Today, we’re pleased to announce our second annual XPRT Spotlight Back-to-School Roundup, a free shopping tool that provides side-by-side comparisons of this school year’s most popular Chromebooks, laptops, tablets, and convertibles. We designed the Roundup to help buyers who are choosing devices for education, such as college students picking out a laptop or school administrators selecting devices for a grade level. The Roundup makes those decisions easier by gathering the product and performance facts these buyers need in one convenient place.

We tested the Roundup devices in our lab using the XPRT suite of benchmark tools. In addition to benchmark results, we provide photographs, device specs, and prices.

If you haven’t yet visited the XPRT Weekly Tech Spotlight page, check it out. Every week, the Spotlight highlights a new device, making it easier for consumers to shop for a new laptop, smartphone, tablet, or PC. Recent devices in the spotlight include the Samsung Chromebook Pro, Microsoft Surface Laptop, Microsoft Surface Pro, OnePlus 5, and Apple iPad Pro 10.5”.

Vendors interested in having their devices featured in the XPRT Weekly Tech Spotlight or next year’s Roundup can visit the website for more details.

We’re always working on ways to make the Spotlight an even more powerful tool for helping with buying decisions. If you have any ideas for the page or suggestions for devices you’d like to see, let us know!

Justin

Best practices

Recently, a tester wrote in and asked for help determining why they were seeing different WebXPRT scores on two tablets with the same hardware configuration. The scores differed by approximately 7.5 percent. This can happen for many reasons, including differences in the software stack, but score variability can also result from differences in testing behavior and environment. While some degree of variability is natural, the question gives us a great opportunity to talk about the basic benchmarking practices we follow in the XPRT lab, practices that help produce consistent, reliable scores.
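As a quick illustration of what a 7.5 percent gap looks like (the scores below are hypothetical, not from the tablets in question), here is how we compute the percent difference between two results:

```python
# Hypothetical WebXPRT scores from two identically configured tablets.
score_a = 200
score_b = 215

# Percent difference, relative to the lower score.
percent_diff = (score_b - score_a) / score_a * 100
print(f"{percent_diff:.1f}%")  # prints "7.5%"
```

A gap of that size between identical configurations is usually worth investigating, which is where the practices below come in.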

Below, we list a few basic best practices you might find useful in your testing. While we present them in the context of WebXPRT’s focus on evaluating browser performance, several of these practices apply to other benchmarks as well.

  • Test with clean images: We use an out-of-box (OOB) method for testing XPRT Spotlight devices. OOB testing means that, other than the initial OS and browser updates that users are likely to run after first turning on the device, we change as little as possible before testing. We want to assess the performance buyers are likely to see when they first purchase the device, before they install additional apps and utilities. While OOB testing is not appropriate for every scenario, the key is to avoid testing a device that’s bogged down with programs that unnecessarily influence results.
  • Turn off updates: We do our best to eliminate or minimize app and system updates after initial setup. Some vendors are making it more difficult to turn off updates completely, but you should always account for update settings.
  • Get a feel for system processes: Depending on the system and the OS, quite a lot of system-level activity can go on in the background after you turn on a device. As much as possible, we like to wait for system activity to reach a stable, idle baseline before kicking off a test. If we start testing immediately after booting the system, we often see higher variability in the first run before the scores start to tighten up.
  • Disclosure is not just about hardware: Most people know that different browsers produce different performance scores on the same system. However, testers aren’t always aware of shifts in performance between versions of the same browser. While most updates don’t have a large impact on performance, a few have changed browser performance significantly, for better or worse. For this reason, it’s always worthwhile to record and disclose the extended browser version number for each test run. The same principle applies to any other relevant software.
  • Use more than one data point: Because of natural variability, our standard practice in the XPRT lab is to publish a score that represents the median from at least three to five runs. If you run a benchmark only once, and the score differs significantly from other published scores, your result could be an outlier that you would not see again under stable testing conditions.
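The last practice is easy to automate. Here’s a minimal sketch, using hypothetical run scores and the `median` function from the Python standard library:

```python
from statistics import median

# Hypothetical scores from five runs of the same benchmark on one device.
runs = [198, 203, 200, 207, 201]

# Report the median rather than any single run; unlike the mean,
# the median is robust to a single outlier run.
print(median(runs))  # prints 201
```

Reporting the median of several runs, rather than a single score, is what keeps one unlucky run from misrepresenting a device.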


We hope those tips will make testing a little easier for you. If you have any questions about the XPRTs, or about benchmarking in general, feel free to ask!

Justin

MobileXPRT: evaluate the performance of your Android device

We recently discussed the capabilities and benefits of TouchXPRT, CrXPRT, BatteryXPRT, and HDXPRT. This week, we’re focusing on MobileXPRT, an app that evaluates how well an Android device handles everyday tasks. Like the other XPRT family benchmarks, MobileXPRT is easy to use. It takes less than 15 minutes to run on most devices, runs relatable workloads, and delivers reliable, objective, and easy-to-understand results.

MobileXPRT includes five performance scenarios (Apply Photo Effects, Create Photo Collages, Create Slideshow, Encrypt Personal Content, and Detect Faces to Organize Photos). By default, the benchmark runs all five tasks and reports individual workload scores and an overall performance score.

MobileXPRT 2015 is the latest version of the app, supporting both 32-bit and 64-bit hardware running Android 4.4 or higher. To test systems running older versions of Android, or to test 32-bit performance on a 64-bit system, you can use MobileXPRT 2013. The results of the two versions are comparable.

MobileXPRT is a useful tool for anyone who wants to compare the performance capabilities of Android phones or tablets. To see test results from a variety of systems, go to MobileXPRT.com and click View Results, where you’ll find scores from many different Android devices.

If you’d like to run MobileXPRT:

Simply download MobileXPRT from MobileXPRT.com or the Google Play Store. The full installer package on MobileXPRT.com, which contains both the app and the test data, is 243 MB. An 18 MB app-only download is also available; it downloads the test data during installation. The MobileXPRT user manual provides instructions for configuring your device and kicking off a test.

If you’d like to dig into the details:

Check out the Exploring MobileXPRT 2015 white paper. In it, we discuss the MobileXPRT development process and details of the individual performance scenarios. We also explain exactly how the benchmark calculates results.

If you’d like to dig even deeper, the MobileXPRT source code is available to members of the BenchmarkXPRT Development Community, so consider joining today. Membership is free for members of any company or organization with an interest in benchmarks, and there are no obligations after joining.

If you haven’t used MobileXPRT before, give it a shot and let us know what you think!

Justin

Planning the next version of HDXPRT

A few weeks ago, we wrote about the capabilities and benefits of HDXPRT. This week, we want to share some initial ideas for the next version of HDXPRT, and invite you to send us any comments or suggestions you may have.

The first step toward a new HDXPRT will be updating the benchmark’s workloads to increase their value in the years to come. Primarily, this will involve updating application content, such as photos and videos, to more contemporary resolutions and file sizes. We think 4K-related workloads will increase the benchmark’s relevance, but we’re not sure whether 4K playback tests are necessary. What do you think?

The next step will be updating the versions of the real-world trial applications included in the benchmark, including Adobe Photoshop Elements, Apple iTunes, Audacity, CyberLink MediaEspresso, and HandBrake. Are there any other applications you feel would be a good addition to HDXPRT’s photo-editing, music-editing, or video-conversion test scenarios?

We’re also planning to update the UI to improve the look and feel of the benchmark and simplify navigation and functionality.

Last but not least, we’ll work to fix known problems, such as the hardware acceleration settings issue in MediaEspresso, and eliminate the need for workarounds when running HDXPRT on the Windows 10 Creators Update.

Do you have feedback on these ideas or suggestions for applications or test scenarios that we should consider for HDXPRT? Are there existing features we should remove? Are there elements of the UI that you find especially useful or would like to see improved? Please let us know. We want to hear from you and make sure that HDXPRT continues to meet your needs.

Justin

XPRTs around the world

This week, we posted an updated version of our “XPRTs around the world” infographic. From time to time, we like to give readers a broader view of the impact that the XPRTs are having around the world, and the infographic shows just how far and wide the XPRTs’ reach has grown.

Here are some numbers from the latest update:

  • The XPRTs have been mentioned more than 7,800 times on over 2,500 unique sites.
  • Those mentions include more than 6,800 articles and reviews.
  • Those mentions originated in over 400 cities located in 58 countries on six continents. If you’re a tech reviewer based in Antarctica, we’re counting on you to help us make it a clean sweep!
  • The BenchmarkXPRT Development Community now includes 203 members from 73 companies and organizations around the world.


In addition to the reach numbers, we’re excited that the XPRTs have now delivered more than 250,000 real-world results!

If you’re familiar with the run counter on WebXPRT.com, you may have noticed that the WebXPRT run tally is rising quickly. Starting with the original release of WebXPRT in early 2013, it took more than three and a half years for the combined run tally of WebXPRT 2013 and WebXPRT 2015 to reach 100,000. In the nine months since that happened, users have added another 60,000 runs. The pace is picking up significantly!
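The back-of-the-envelope math behind that claim, using the approximate figures above, looks like this:

```python
# Roughly 100,000 runs over the first 3.5 years (42 months)...
early_rate = 100_000 / 42    # about 2,381 runs per month

# ...versus roughly 60,000 runs over the following nine months.
recent_rate = 60_000 / 9     # about 6,667 runs per month

# The monthly run rate has nearly tripled.
print(f"{recent_rate / early_rate:.1f}x faster")  # prints "2.8x faster"
```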

We’re grateful for everyone who’s helped us get this far. Here’s to another quarter-million runs and downloads!

Justin

HDXPRT: see how your Windows PC handles media tasks

Over the last several weeks, we reminded readers of the capabilities and benefits of TouchXPRT, CrXPRT, and BatteryXPRT. This week, we’d like to highlight HDXPRT. HDXPRT, which stands for High Definition Experience & Performance Ratings Test, was the first benchmark published by the HDXPRT Development Community, which later became the BenchmarkXPRT Development Community. HDXPRT evaluates the performance of Windows devices while handling real-world media tasks such as photo editing, video conversion, and music editing, all while using real commercial applications, including Photoshop and iTunes. HDXPRT presents results that are relevant and easy to understand.

We originally distributed HDXPRT on installation DVDs, but HDXPRT 2014, the latest version, is available for download from HDXPRT.com. HDXPRT 2014 is for systems running Windows 8.1 and later. The benchmark takes about 10 minutes to install, and a run takes less than two hours.

HDXPRT is a useful tool for anyone who wants to evaluate the real-world, content-creation capabilities of a Windows PC. To see test results from a variety of systems, go to HDXPRT.com and click View Results, where you’ll find scores from many different Windows devices.

If you’d like to run HDXPRT:

Simply download HDXPRT from HDXPRT.com. The HDXPRT user manual provides information on minimum system requirements, as well as step-by-step instructions for how to configure your system and kick off a test. Testers running HDXPRT on Windows 10 Creators Update builds should consult the tech support note posted on HDXPRT.com.

If you’d like to dig into the details:

Check out the Exploring HDXPRT 2014 white paper. In it, we discuss the benchmark’s three test scenarios in detail and show how we calculate the results.

If you’d like to dig even deeper, the HDXPRT source code is available to members of the BenchmarkXPRT Development Community, so consider joining today. Membership is free for members of any company or organization with an interest in benchmarks, and there are no obligations after joining.

If you haven’t used HDXPRT before, give it a shot and let us know what you think!

On another note, Bill will be attending Mobile World Congress in Shanghai next week. Let us know if you’d like to meet up and discuss the XPRTs or how to get your device in the XPRT Spotlight.

Justin
