Category: Benchmarking

Celebrating one year of the XPRT Weekly Tech Spotlight

It’s been just over a year since we launched the XPRT Weekly Tech Spotlight by featuring our first device, the Google Pixel C. Spotlight has since become one of the most popular items at BenchmarkXPRT.com, and we thought now would be a good time to recap the past year, offer more insight into the choices we make behind the scenes, and look at what’s ahead for Spotlight.

The goal of Spotlight is to provide PT-verified specs and test results that can help consumers make smart buying decisions. We try to include a wide variety of device types, vendors, software platforms, and price points in our inventory. The devices also tend to fall into one of two main groups: popular new devices generating a lot of interest and devices that have unique form factors or unusual features.

To date, we’ve featured 56 devices: 16 phones, 11 laptops, 10 two-in-ones, 9 tablets, 4 consoles, 3 all-in-ones, and 3 small-form-factor PCs. The operating systems these devices run include Android, ChromeOS, iOS, macOS, OS X, Windows, and an array of vendor-specific OS variants and skins.

As much as possible, we test using out-of-the-box (OOB) configurations. We want to present test results that reflect what everyday users will experience on day one. Depending on the vendor, the OOB approach can mean that some devices arrive bogged down with bloatware while others are relatively clean. We don’t attempt to “fix” anything in those situations; we simply test each device “as is” when it arrives.

If devices arrive with outdated OS versions (as is often the case with Chromebooks), we update to current versions before testing, because that's the best reflection of what everyday users will experience. In the past, that approach would've been more complicated with Windows systems, but Microsoft's shift to "Windows as a service" ensures that most users receive significant OS updates automatically by default.

The OOB approach also means that the WebXPRT scores we publish reflect the performance of each device’s default browser, even if it’s possible to install a faster browser. Our goal isn’t to perform a browser shootout on each device, but to give an accurate snapshot of OOB performance. For instance, last week’s Alienware Steam Machine entry included two WebXPRT scores, a 356 on the SteamOS browser app and a 441 on Iceweasel 38.8.0 (a Firefox variant used in the device’s Linux-based desktop mode). That’s a significant difference, but the main question for us was which browser was more likely to be used in an OOB scenario. With the Steam Machine, the answer was truly “either one.” Many users will use the browser app in the SteamOS environment and many will take the few steps needed to access the desktop environment. In that case, even though one browser was significantly faster than the other, choosing to omit one score in favor of the other would have excluded results from an equally likely OOB environment.
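To put that score gap in concrete terms, the arithmetic works out to roughly a 24 percent advantage for the desktop-mode browser. A quick sketch, using the two scores from the entry above (higher WebXPRT scores are better):

```python
def percent_faster(slower: float, faster: float) -> float:
    """Relative advantage of the higher WebXPRT score, in percent."""
    return (faster - slower) / slower * 100.0

steamos_score = 356     # SteamOS browser app
iceweasel_score = 441   # Iceweasel 38.8.0, Linux desktop mode

print(f"{percent_faster(steamos_score, iceweasel_score):.1f}% faster")  # 23.9% faster
```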

We’re always looking for ways to improve Spotlight. We recently began including more photos for each device, including ones that highlight important form-factor elements and unusual features. Moving forward, we plan to expand Spotlight’s offerings to include automatic score comparisons, additional system information, and improved graphical elements. Most importantly, we’d like to hear your thoughts about Spotlight. What devices and device types would you like to see? Are there specs that would be helpful to you? What can we do to improve Spotlight? Let us know!

Justin

Mobile World Congress 2017 and the territories ahead

As I walked the halls of this year’s Mobile World Congress (MWC), once again passing every booth in every one of them, it was clear that mobile technology is expanding faster than ever into more new tech territories than ever before.

On the device front, cameras and camera quality have become a pitched battleground, with mobile phone makers teaming with camera manufacturers to give us better and better images and video. This fight is far from over, too, because vendors are exploring many different ways to improve mobile phone camera quality. Quick charging is a hot new trend we can expect to hear more about in the days to come. Of course, apps and their performance continue to matter greatly, because if you can do something from any computer, you’d better be able to do at least some of it from your phone.

The Internet of Things (IoT) grabbed many headlines, with vendors still selling more dreams than reality, but some industries living this future now. The proliferation of IoT devices will result, of course, in massive increases in the amount of data flowing through the world’s networks, which in turn will require more and more computing power to analyze and use. That power will need to be everywhere, from massive datacenters to the device in your hand, because the more data you have, the more you’ll want to customize it to your particular needs.

Similarly, AI was a major theme of the show, and it’s also likely to suck up computing cycles everywhere. The vast majority of the work will, of course, end up in datacenters, but some processing is likely to be local, particularly in situations such as real-time translation, where we can’t afford significant communication delays.

5G, the next big step in mobile data speeds, was everywhere, with most companies seeming to agree the new standard was still years away, but also excited about what it will make possible. When you can stream 4K movies to your phone wirelessly while simultaneously receiving and customizing analyses of your company’s IoT network, you’re going to need a powerful, sophisticated device running equally powerful and sophisticated apps.

Everywhere I looked, the future was bright—and complicated, and likely to place increasing demands on all of our devices. We’ll need guides as we find our paths through these new territories and as we determine the right device tools for our jobs, so the need for the XPRTs will only increase. I look forward to seeing where we, the BenchmarkXPRT Development Community, take them next.

Mark

A new reality

A while back, I wrote about a VR demo built by students from North Carolina State University. We’ve been checking it out over the last couple of months and are very impressed. This workload will definitely heat up your device! While the initial results look promising, this is still an experimental workload and it’s too early to use results in formal reviews or product comparisons.

We’ve created a page that provides full details about the VR demo. As an experimental workload, the demo is available only to community members. As always, members can download the source as well as the APK.

We asked the students to try to build the workload for iOS as a stretch goal. They successfully built an iOS version, but this was at the end of the semester and there was little time for testing. If you want to experiment with iOS yourself, look at the build instructions for Android and iOS that we include with the source. Note that you will need Xcode to build and deploy the demo on iOS.

After you’ve checked out the workload, let us know what you think!

Finally, we have a new video featuring the VR demo. Enjoy!

Eric

Experience is the best teacher

One of the core principles that guides the design of the XPRT tools is that they should reflect the way real-world users use their devices. The XPRTs use applications and workloads that mirror what users actually do and how real applications behave. How did we learn how important this is? The hard way: by making mistakes! Here’s one example.

In the 1990s, I was Director of Testing for the Ziff-Davis Benchmark Operation (ZDBOp). The benchmarks ZDBOp created for its technical magazines became the industry standards, because of both their quality and Ziff-Davis’ leadership in the technical trade press.

WebBench, one of the benchmarks ZDBOp developed, measured the performance of early web servers. We worked hard to create a tool that used physical clients and tested web server performance over an actual network. However, we didn’t pay enough attention to how clients actually interacted with the servers. In the first version of WebBench, the clients opened connections to the server, did a small amount of work, closed the connections, and then opened new ones.

When we met with vendors after the release of WebBench, they begged us to change the model. At that time, browsers opened relatively long-lived connections and did lots of work before closing them. Our model was almost the opposite of that. It put vendors in the position of having to choose between coding to give their users good performance and coding to get good WebBench results.
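The two connection models are easy to see side by side. This is an illustrative sketch, not WebBench code (the host, port, and paths are hypothetical): the first function opens and tears down a fresh TCP connection for every small request, as WebBench v1 did, while the second issues many requests over one long-lived keep-alive connection, as browsers of the era did.

```python
import http.client

def short_lived_requests(host, port, paths):
    """WebBench v1 style: a new TCP connection per small request."""
    bodies = []
    for path in paths:
        conn = http.client.HTTPConnection(host, port)  # fresh connection each time
        conn.request("GET", path)
        bodies.append(conn.getresponse().read())
        conn.close()
    return bodies

def persistent_requests(host, port, paths):
    """Browser style of the era: one keep-alive connection, many requests."""
    conn = http.client.HTTPConnection(host, port)
    bodies = []
    for path in paths:
        conn.request("GET", path)
        bodies.append(conn.getresponse().read())  # drain body before reusing the connection
    conn.close()
    return bodies
```

A server tuned for the second pattern pays connection-setup costs once per session; a benchmark exercising only the first pattern rewards exactly the opposite optimization, which is the mismatch the vendors were objecting to.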

Of course, we were horrified by this, and worked hard to make the next version of the benchmark more closely reflect the way real browsers interacted with web servers. Subsequent versions of WebBench were much better received.

This is one of the roots from which the XPRT philosophy grew. We have tried to learn and grow from the mistakes we’ve made. We’d love to hear about any of your experiences with performance tools so we can all learn together.

Eric

A new HDXPRT 2014 build is available

Last fall, we identified a way to run HDXPRT 2014, originally developed for Windows 8, on Windows 10. The method involved overwriting the HDXPRT CPU-Z files with newer versions and performing a few additional pre-test configuration steps. You can read more details about those steps here.

Today, we’re releasing a new build of HDXPRT 2014 (v1.2) that eliminates the need to overwrite the CPU-Z files. The new build is available for download at HDXPRT.com. Please note that the app package is 5.08 GB, so allow time and space for the download process.

We also updated the HDXPRT 2014 User Manual to reflect changes in pre-test system configuration and to include the settings we recommend for newer builds of Windows 10.

The changes in the new build do not affect results, so v1.2 scores are comparable to v1.1 scores on the same system.

The new build ran well during testing in our labs, but issues could emerge as Microsoft releases new Windows updates. If you have any questions about HDXPRT or encounter any issues during testing, we encourage you to let us know.

We look forward to seeing your test results!

Justin

TouchXPRT’s future

If you’ve been following the blog, you know that we’ve been reviewing each part of the XPRT portfolio. If you missed our discussions of HDXPRT, BatteryXPRT, WebXPRT, and CrXPRT, we encourage you to check them out and send us any thoughts you may have. This week, we continue that series by discussing the state of TouchXPRT and what we see down the road for it in 2017.

We released TouchXPRT 2016, an app for evaluating the performance of Windows 10 and Windows 10 Mobile devices, last February. We built the app by porting the TouchXPRT 2014 performance workloads to the Universal Windows Platform (UWP) app format, which allows a single app package to run on PCs, phones, tablets, and even consoles.

TouchXPRT 2016 installation is quick and easy, and the test completes in under 15 minutes on most devices. The app runs tests based on five everyday tasks (Beautify Photos, Blend Photos, Convert Videos for Sharing, Create Music Podcast, and Create Slideshow from Photos). It measures how long your device takes to complete each task, produces results for each scenario, and gives you an overall score.
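TouchXPRT's exact scoring formula isn't spelled out here, but a common way benchmarks roll per-scenario completion times into a single higher-is-better number is a geometric mean of time ratios against a calibration baseline. A minimal sketch of that general approach, with hypothetical baseline and measured values for the five scenarios:

```python
import math

def overall_score(times_s, baseline_s, scale=100.0):
    """Geometric mean of baseline/measured time ratios, scaled.

    times_s    -- measured completion time per scenario, in seconds
    baseline_s -- calibration-system times for the same scenarios
    Higher scores mean faster completion; halving every time doubles the score.
    """
    ratios = [b / t for b, t in zip(baseline_s, times_s)]
    return scale * math.prod(ratios) ** (1.0 / len(ratios))

# Hypothetical per-scenario times (seconds):
baseline = [30.0, 45.0, 120.0, 60.0, 90.0]
measured = [15.0, 22.5, 60.0, 30.0, 45.0]   # exactly twice as fast
print(overall_score(measured, baseline))     # 200.0
```

The geometric mean keeps any single long-running scenario from dominating the overall number, which is why many composite benchmarks prefer it to a simple sum of times.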

As we think about the path forward for TouchXPRT, we’re aware that many expect 2017 to be a year of significant change in the Windows world, with two updates scheduled for release. Microsoft is slated to release the Windows 10 Creators Update (version 1703) in April, and a subsequent version of Windows codenamed Redstone 3 may arrive this fall. Many tech observers believe that the Creators Update will introduce new creativity and gaming features, along with a UI upgrade codenamed Project NEON. Major foundational shifts in the OS’s structure are more likely to appear with Redstone 3. At this point, quite a lot is still up in the air, but we’ll be following developments closely.

As we learn more about upcoming changes, we’ll have the opportunity to reevaluate TouchXPRT workloads and determine the best way to incorporate new technologies. Virtual reality, 3D, and 4K are especially exciting, but it’s too soon to know how we might incorporate them in a future version of TouchXPRT.

Because TouchXPRT 2016 continues to run well on a wide range of Windows 10 devices, we think it’s best to keep supporting the current version until we get a better idea of what’s in store for Windows.

If you have any thoughts on the future of Windows performance testing, please let us know!

Bill
