

XPRTs around the world

This week, we posted an updated version of our “XPRTs around the world” infographic. From time to time, we like to give readers a broader view of the impact that the XPRTs are having around the world, and the infographic shows just how far and wide the XPRTs’ reach has grown.

Here are some numbers from the latest update:

  • The XPRTs have been mentioned more than 7,800 times on over 2,500 unique sites.
  • Those mentions include more than 6,800 articles and reviews.
  • Those mentions originated in over 400 cities located in 58 countries on six continents. If you’re a tech reviewer based in Antarctica, we’re counting on you to help us make it a clean sweep!
  • The BenchmarkXPRT Development Community now includes 203 members from 73 companies and organizations around the world.


In addition to the reach numbers, we’re excited that the XPRTs have now delivered more than 250,000 real-world results!

If you’re familiar with the run counter on WebXPRT.com, you may have noticed that the WebXPRT run tally is rising quickly. Starting with the original release of WebXPRT in early 2013, it took more than three and a half years for the combined run tally of WebXPRT 2013 and WebXPRT 2015 to reach 100,000. In the nine months since then, users have added 60,000 runs. That works out to roughly 6,700 runs per month, nearly triple the earlier average. The pace is picking up significantly!

We’re grateful to everyone who’s helped us get this far. Here’s to another quarter-million runs and downloads!

Justin

Another pronunciation lesson

Knowing how to say the terms we read online can be a bit of a mystery. For example, it’s been 30 years since CompuServe created the GIF format, and people are still arguing about how to say it.

A couple of months ago, we talked about how to pronounce WebXPRT. In the video we pointed to, the narrator admits he’s unsure how to say “XPRT.” For the record, it’s pronounced “expert.”

Recently, we came across another video, which referred to CrXPRT. The narrator pronounced it “Chrome expert.” The “expert” part is correct, but the “Chrome” part is not. It’s an understandable mistake, because Cr is the chemical symbol for chromium. That’s why we chose it! However, we pronounce the C and R individually. So, the name is said “C R expert.”

All that being said, it was great to see CrXPRT in the classroom! When we created CrXPRT, the education market was a big consideration, as you can tell from this CrXPRT video. We love seeing the XPRTs in the real world!

Eric

Getting to know TouchXPRT

Many of our community members first encountered the XPRTs when reading about WebXPRT or MobileXPRT in a device review, using TouchXPRT or HDXPRT in an OEM lab, or using BatteryXPRT or CrXPRT to evaluate devices for bulk purchasing on behalf of a corporation or government agency. They know the XPRT they used provided great value in that context, but they may not know about the other members of the XPRT family.

To help keep folks up to date on the full extent of XPRT capabilities, we like to occasionally “reintroduce” each of the XPRTs. This week, we invite you to get to know TouchXPRT.

We developed TouchXPRT 2016 as a Universal Windows Platform app for Windows 10. We wanted to offer a free tool that would provide consumers with objective information about how well a Windows 10 or Windows 10 Mobile laptop, tablet, or phone handles common media tasks. To do this, TouchXPRT runs five tests that simulate the kinds of photo, video, and music editing tasks people do every day. It measures how quickly the device completes each of those tasks and provides an overall score. To compare device scores, go to TouchXPRT.com and click View Results, where you’ll find scores from many different Windows 10 and Windows 10 Mobile devices.

TouchXPRT is easy to install and run, and is a great resource for anyone who wants to evaluate the performance of a Windows 10 device.

If you’d like to run TouchXPRT:

Simply download TouchXPRT from the Microsoft Store. (If that doesn’t work for you, you can also download it directly from TouchXPRT.com.) Installing it should take about 15 minutes, and the TouchXPRT 2016 release notes provide step-by-step instructions.

If you’d like to dig into the details:

Check out the Exploring TouchXPRT 2016 white paper. In it, we discuss the TouchXPRT development process, its component tests and workloads, and how it calculates individual workload and overall scores. We also provide instructions for automated testing.
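
For a feel for how a benchmark like this can turn raw workload times into a single number, here’s a minimal TypeScript sketch. It assumes a common benchmarking pattern: normalize each workload’s completion time against a reference device, then take a geometric mean so no single workload dominates. The workload names and numbers below are made up, and TouchXPRT’s actual formulas are the ones documented in the white paper.

```ts
// Illustrative only: TouchXPRT's real score calculations are described in
// the "Exploring TouchXPRT 2016" white paper. This sketch assumes a
// geometric mean of speed ratios against a hypothetical reference device.

interface WorkloadResult {
  name: string;         // hypothetical workload name
  timeSeconds: number;  // measured completion time on the test device
  refSeconds: number;   // completion time on an assumed reference device
}

function overallScore(results: WorkloadResult[], scale = 100): number {
  // A ratio > 1 means the test device finished faster than the reference.
  const ratios = results.map((r) => r.refSeconds / r.timeSeconds);
  // Geometric mean keeps one outlier workload from dominating the score.
  const geoMean = Math.exp(
    ratios.reduce((sum, x) => sum + Math.log(x), 0) / ratios.length
  );
  return Math.round(geoMean * scale);
}

console.log(
  overallScore([
    { name: "photo task", timeSeconds: 8.2, refSeconds: 10.0 },
    { name: "video task", timeSeconds: 31.5, refSeconds: 30.0 },
    { name: "music task", timeSeconds: 12.1, refSeconds: 15.0 },
  ])
); // → 113 on this made-up data
```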

BenchmarkXPRT Development Community members also have access to the TouchXPRT source code, so consider joining today. There’s no obligation, and membership is free for anyone from a company or organization with an interest in benchmarks.

If you haven’t tried running TouchXPRT before, give it a shot and let us know what you think!

Justin

Evolve or die

Last week, Google announced that it would retire its Octane benchmark. The announcement explains that Google designed Octane to spur improvement in JavaScript performance, and while it did just that when it was first released, those improvements have plateaued in recent years. It also notes that some optimizations boost Octane scores without reflecting real-world scenarios. That’s unfortunate, because Google, like most of us, wants improvements in benchmark scores to mean improvements in end-user experience.

WebXPRT comes at the web performance issue differently. While Octane’s goal was to improve JavaScript performance, the purpose of WebXPRT is to measure performance from the end user’s perspective. By doing the types of work real people do, WebXPRT doesn’t measure only improvements in JavaScript performance; it also measures the quality of the real-world user experience. WebXPRT’s results also reflect the performance of the entire device and software stack, not just the performance of the JavaScript interpreter.
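
To make that concrete, here’s a minimal sketch of the timing pattern a browser-based benchmark can use: run a simulated user task and measure it with the browser’s high-resolution timer. The grayscale task below is an illustrative stand-in, not an actual WebXPRT workload, and WebXPRT’s real scoring is more involved.

```ts
// A minimal sketch of browser-side workload timing, not WebXPRT itself.

async function timeWorkload(
  label: string,
  task: () => void | Promise<void>
): Promise<number> {
  const start = performance.now(); // high-resolution timer in modern browsers
  await task();                    // run the simulated user task
  const elapsed = performance.now() - start;
  console.log(`${label}: ${elapsed.toFixed(1)} ms`);
  return elapsed;
}

// Example: a CPU-bound stand-in for a photo-effects style task.
void timeWorkload("grayscale 1080p frame", () => {
  const pixels = new Uint8ClampedArray(1920 * 1080 * 4).fill(128);
  for (let i = 0; i < pixels.length; i += 4) {
    const g = (pixels[i] + pixels[i + 1] + pixels[i + 2]) / 3;
    pixels[i] = pixels[i + 1] = pixels[i + 2] = g;
  }
});
```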

Google’s announcement reminds us that benchmarks have finite life spans: they must constantly evolve to keep pace with changes in technology, or they will become useless. To make sure the XPRT benchmarks do just that, we are always looking at how people use their devices and developing workloads that reflect those actions. This is a core element of the XPRT philosophy.

As we mentioned last week, we’ve been working on the next version of WebXPRT. If you have any thoughts about how it should evolve, let us know!

Eric

Running Android-oriented XPRTs on Chrome OS

Since last summer, we’ve been following Google’s progress in bringing Android apps and the Google Play store to Chromebooks, along with their plan to gradually phase out support for Chrome apps over the next few years. Because we currently offer apps that assess battery life and performance for Android devices (BatteryXPRT and MobileXPRT) and Chromebooks (CrXPRT), the way this situation unfolds could affect the makeup of the XPRT portfolio in the years to come.

For now, we’re experimenting to see how well the Android app/Chrome OS merger is working with the devices in our lab. One test case is the Samsung Chromebook Plus, which we featured in the XPRT Weekly Tech Spotlight a few weeks ago. Normally, we would publish only CrXPRT and WebXPRT results for a Chromebook, but installing and running MobileXPRT 2015 from the Google Play store was such a smooth and error-free process that we decided to publish the first MobileXPRT score for a device running Chrome OS.

We also tried running BatteryXPRT on the Chromebook Plus, but even though the installation was quick and easy and the test kicked off without a hitch, we could not generate a valid result. Typically, the test would complete several iterations successfully, but terminate before producing a result. We’re investigating the problem, and will keep the community up to date on what we find.

In the meantime, we continue to recommend that Chromebook testers use CrXPRT for performance and battery life assessment. While we haven’t encountered any issues running MobileXPRT 2015 on Chromebooks, CrXPRT has a proven track record.

If you have any questions about running Android-oriented XPRTs on Chrome OS, or insights that you’d like to share, please let us know.

Justin

Digging deeper

From time to time, we like to revisit the fundamentals of the XPRT approach to benchmark development. Today, we’re discussing the need for testers and benchmark developers to consider the multiple factors that influence benchmark results. For every device we test, all of its hardware and software components have the potential to affect performance, and changing the configuration of those components can significantly change results.

For example, we frequently see significant performance differences between different browsers on the same system. In our recent recap of the XPRT Weekly Tech Spotlight’s first year, we highlighted an example of how testing the same device with the same benchmark can produce different results, depending on the software stack under test. In that instance, the Alienware Steam Machine entry included a WebXPRT 2015 score for each of the two browsers that consumers were likely to use. The first score (356) represented the SteamOS browser app in the SteamOS environment, and the second (441) represented the Iceweasel browser (a Firefox variant) in the Linux-based desktop environment, a difference of roughly 24 percent. Including only the first score would have given readers an incomplete picture of the Steam Machine’s web-browsing capabilities, so we thought it was important to include both.

We also see performance differences between different versions of the same browser, a fact that’s especially relevant to those who use frequently updated browsers such as Chrome. And even benchmarks that measure the same general area of performance (web browsing, for example) usually test very different things.
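
One practical habit this suggests is recording the software stack alongside every score, so results from different browsers or versions aren’t compared blindly. Here’s a minimal sketch, assuming the browser’s user-agent string is enough to identify the stack; a real harness would capture more detail (OS build, device model, and so on).

```ts
// A hedged sketch: attach browser and version info to each recorded score.

interface RecordedResult {
  score: number;
  userAgent: string; // identifies the browser and its version
  timestamp: string; // when the run finished
}

function recordResult(score: number): RecordedResult {
  return {
    score,
    userAgent: navigator.userAgent, // e.g. contains "Chrome/58.0.3029.110"
    timestamp: new Date().toISOString(),
  };
}
```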

OS updates can also have an impact on performance. Consumers might base a purchase on performance or battery life scores and end up with a device that behaves much differently when updated to a new version of Android or iOS, for example.

Other important factors in the software stack include pre-installed software, commonly referred to as bloatware, and the proliferation of apps that sap performance and battery life.

This is a much larger topic than we can cover in the blog, but we hope the examples we’ve mentioned remind you to think critically about, and dig deeper into, benchmark results. When we see published XPRT scores that differ significantly from our own results, our first question is always “What’s different between the two devices?” Most of the time, the answer becomes clear as we compare hardware and software from top to bottom.

Justin

