
HDXPRT: see how your Windows PC handles media tasks

Over the last several weeks, we reminded readers of the capabilities and benefits of TouchXPRT, CrXPRT, and BatteryXPRT. This week, we’d like to highlight HDXPRT. HDXPRT, which stands for High Definition Experience & Performance Ratings Test, was the first benchmark published by the HDXPRT Development Community, which later became the BenchmarkXPRT Development Community. HDXPRT evaluates the performance of Windows devices handling real-world media tasks such as photo editing, video conversion, and music editing, using real commercial applications such as Photoshop and iTunes. HDXPRT presents results that are relevant and easy to understand.

We originally distributed HDXPRT on installation DVDs, but HDXPRT 2014, the latest version, is available for download from HDXPRT.com. HDXPRT 2014 is for systems running Windows 8.1 and later. The benchmark takes about 10 minutes to install, and a run takes less than two hours.

HDXPRT is a useful tool for anyone who wants to evaluate the real-world, content-creation capabilities of a Windows PC. To see test results from a variety of systems, go to HDXPRT.com and click View Results, where you’ll find scores from many different Windows devices.

If you’d like to run HDXPRT:

Simply download HDXPRT from HDXPRT.com. The HDXPRT user manual provides information on minimum system requirements, as well as step-by-step instructions for how to configure your system and kick off a test. Testers running HDXPRT on Windows 10 Creators Update builds should consult the tech support note posted on HDXPRT.com.

If you’d like to dig into the details:

Check out the Exploring HDXPRT 2014 white paper. In it, we discuss the benchmark’s three test scenarios in detail and show how we calculate the results.

If you’d like to dig even deeper, the HDXPRT source code is available to members of the BenchmarkXPRT Development Community, so consider joining today. Membership is free for members of any company or organization with an interest in benchmarks, and there are no obligations after joining.

If you haven’t used HDXPRT before, give it a shot and let us know what you think!

On another note, Bill will be attending Mobile World Congress in Shanghai next week. Let us know if you’d like to meet up and discuss the XPRTs or how to get your device in the XPRT Spotlight.

Justin

Notes from the lab

This week’s XPRT Weekly Tech Spotlight featured the Alcatel A30 Android phone. We chose the A30, an Amazon exclusive, because it’s a budget phone that runs Android 7.0 (Nougat) right out of the box. That combination may appeal to consumers, but running a newer OS on inexpensive hardware like the A30’s can cause issues for app developers, and the XPRTs are no exception.

Spotlight fans may have noticed that we didn’t post a MobileXPRT 2015 or BatteryXPRT 2014 score for the A30. In both cases, the benchmark did not produce an overall score because of a problem that occurs during the Create Slideshow workload. The issue involves text relocation and significant changes in the Android development environment.

As of Android 5.0, on 64-bit devices, the OS doesn’t allow native code executables to perform text relocation. Instead, it is necessary to compile the executables using position-independent code (PIC) flags. This is how we compiled the current version of MobileXPRT, and it’s why we updated BatteryXPRT earlier this year to maintain compatibility with the most recent versions of Android.

However, the same approach doesn’t work for SoCs built with older 32-bit ARMv7-A architectures, such as the A30’s Qualcomm Snapdragon 210, so testers may encounter this issue on other devices with low-end hardware.
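In build-system terms, the position-independent-code requirement described above amounts to a couple of compiler and linker flags. As a rough sketch only (the module and file names below are hypothetical, not taken from the XPRT sources), an ndk-build module might request position-independent code like this:

```make
# Hypothetical Android.mk fragment; module and file names are illustrative.
include $(CLEAR_VARS)
LOCAL_MODULE    := workload
LOCAL_SRC_FILES := workload.c
# Emit position-independent code and link a position-independent
# executable, so the loader never needs text relocations.
LOCAL_CFLAGS    += -fPIE
LOCAL_LDFLAGS   += -fPIE -pie
include $(BUILD_EXECUTABLE)
```

For shared libraries, the corresponding compile flag is -fPIC.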

Testers who run into this problem can still use MobileXPRT 2015 to generate individual workload scores for the Apply Photo Effects, Create Photo Collages, Encrypt Personal Content, and Detect Faces workloads. Also, BatteryXPRT will produce an estimated battery life for the device, but since it won’t produce a performance score, we ask that testers use those numbers for informational purposes and not publication.

If you have any questions or have encountered additional issues, please let us know!

Justin

Evaluating machine learning performance

A few weeks ago, I discussed the rising importance of machine learning and our efforts to develop a tool to help in evaluating its performance. Here is an update on our thinking.

One thing we are sure of is that we can’t cover everything in machine learning. The field is evolving rapidly, so we think the best approach is to pick a good place to start and then build from there.

One of the key areas we need to home in on is the algorithms we will employ in MLXPRT. (We haven’t formally decided on a name, but are currently using MLXPRT internally when we talk about what we’ve been doing.)

Computer vision seems to be a good place to start. We see three specific sets of algorithms we could cover, though it’s worth noting that the lines between these sets blur considerably.

The first set of computer vision algorithms performs image classification. These algorithms identify things like a cat or a dog in an image. Some of the most popular algorithms are AlexNet and GoogLeNet, as well as those from VGG. These were initially trained and used on the ImageNet database, which contains over 10 million images.

The next set of computer vision algorithms performs object detection and localization. These algorithms identify the objects in an image and their spatial locations, typically drawing bounding boxes around them. Two of the most popular algorithms are Faster R-CNN and Single Shot MultiBox Detector (SSD).

The final set of computer vision algorithms performs image segmentation. Rather than just drawing a box around an object, image segmentation attempts to classify each pixel in an image by the object it belongs to. The result looks like a contour/color map showing the different objects in the image. These techniques can be especially useful in autonomous vehicles and medical diagnostic imaging. Currently, the leading algorithms in image segmentation are fully convolutional networks (FCNs), but the area is developing rapidly.
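One way to see how these three task families differ is to compare what each one produces for a single image. The sketch below uses dummy Python data structures; every name, label, and value in it is hypothetical and purely illustrative.

```python
# Illustrative only: dummy outputs contrasting what the three
# computer-vision task families produce for one image.

NUM_CLASSES = 1000
HEIGHT, WIDTH = 224, 224

# 1. Image classification: one score per class for the whole image
#    (e.g., "this picture contains a dog").
class_scores = [0.0] * NUM_CLASSES
class_scores[42] = 0.95  # index 42 standing in for "dog"

# 2. Object detection and localization: a list of bounding boxes,
#    each with a label and a confidence score.
#    Boxes are (x_min, y_min, x_max, y_max) in pixel coordinates.
detections = [
    {"box": (30, 40, 120, 200), "label": "dog", "score": 0.92},
    {"box": (150, 60, 210, 180), "label": "cat", "score": 0.87},
]

# 3. Image segmentation: a class label for every single pixel.
segmentation_map = [[0] * WIDTH for _ in range(HEIGHT)]
for y in range(40, 200):          # mark the "dog" pixels
    for x in range(30, 120):
        segmentation_map[y][x] = 1

dog_pixels = sum(row.count(1) for row in segmentation_map)
print(len(class_scores), len(detections), dog_pixels)  # 1000 2 14400
```

The progression from a single label, to a few boxes, to a full per-pixel map is also a rough proxy for how much computation each task family demands, which matters for a performance benchmark.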

Even limiting the initial version of MLXPRT to computer vision may be too broad. For example, we may end up only doing image classification and object detection.

As always, we crave input from folks like you who are working in these areas. What would you most like to see in a machine learning performance tool?

Bill

Learning something new every day

We’re constantly learning and thinking about how the XPRTs can help people evaluate the tech that will soon be a part of daily life. It’s why we started work on a tool to evaluate machine learning capabilities, and it’s why we developed CrXPRT in response to Chromebooks’ growing share of the education sector.

The learning process often involves a lot of tinkering in the lab, and we recently began experimenting with Neverware’s CloudReady OS. CloudReady is an operating system based on the open-source Chromium OS. Unlike Chrome OS, which runs only on Chromebooks, CloudReady can run on many types of systems, including older Windows and OS X machines. The idea is that individuals and organizations can breathe new life into aging hardware by incorporating it into a larger pool of devices managed through a Google Admin Console.

We were curious to see if it worked as advertised, and if it would run CrXPRT 2015. Installing CloudReady on an old Dell Latitude E6430 was easy enough, and we then installed CrXPRT from the Chrome Web Store. Performance tests ran without a hitch. Battery life tests would kick off but not complete, which was not a big surprise because the battery life calls involved were developed specifically for Chrome OS.

So, what role can CrXPRT play with CloudReady, and what are the limitations? CloudReady has a lot in common with Chrome OS, but there are some key differences. One way we see the CrXPRT performance test being useful is for comparing CloudReady devices. Say that an organization was considering adopting CloudReady on certain legacy systems but not on others; CrXPRT performance scores would provide insight into which devices performed better with CloudReady. While you could use CrXPRT to compare those devices to Chromebooks, the differences between the operating systems are significant enough that we cannot guarantee the comparison would be a fair one.

Have you spent any time working with CloudReady, or are there other interesting new technologies you’d like us to investigate? Let us know!

Justin

BatteryXPRT: A quick and reliable way to estimate Android battery life

In the last few weeks, we reintroduced readers to the capabilities and benefits of TouchXPRT and CrXPRT. This week, we’d like to reintroduce BatteryXPRT 2014 for Android, an app that evaluates the battery life and performance of Android devices.

When purchasing a phone or tablet, it’s good to know how long the battery will last on a typical day and how often you’ll need to charge it. Before BatteryXPRT, you had to rely on a manufacturer’s estimate or on full rundown tests whose tasks don’t resemble the things we actually do with our phones and tablets every day.

We developed BatteryXPRT to estimate battery life reliably in just over five hours, so testers can complete a full evaluation in one work day or while sleeping. You can configure it to run while the device is connected to a network or in Airplane mode. The test also produces a performance score by running workloads that represent common everyday tasks.
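To illustrate the basic idea behind estimating battery life from a test that doesn’t fully drain the battery, here is a naive linear-extrapolation sketch. This is not BatteryXPRT’s actual method, which uses more sophisticated statistical processes; the function name and figures below are hypothetical.

```python
def estimate_battery_life(drain_percent: float, test_hours: float) -> float:
    """Naively extrapolate total battery life from a partial rundown.

    drain_percent: battery percentage consumed during the test (0-100]
    test_hours:    how long the test ran, in hours
    """
    if not 0 < drain_percent <= 100:
        raise ValueError("drain_percent must be in (0, 100]")
    return test_hours * 100.0 / drain_percent

# A five-hour test that drains the battery from 100% to 60% (40 points)
# extrapolates to 12.5 hours of total battery life.
print(estimate_battery_life(40.0, 5.0))  # 12.5
```

A simple ratio like this ignores variance between workload iterations, which is one reason a real battery benchmark needs a more careful statistical treatment.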

BatteryXPRT is easy to install and run, and is a great resource for anyone who wants to evaluate how well an Android device will meet their needs. If you’d like to see test results from a variety of Android devices, go to BatteryXPRT.com and click View Results, where you’ll find scores from many different models.

If you’d like to run BatteryXPRT:

Simply download BatteryXPRT from the Google Play store or BatteryXPRT.com. The BatteryXPRT installation instructions and user manual provide step-by-step instructions for how to configure your device and kick off a test. We designed BatteryXPRT 2014 for Android to be compatible with a wide variety of Android devices, but because there are so many devices on the market, it is inevitable that users occasionally run into problems. In the Tips, tricks, and known issues document, we provide troubleshooting suggestions for issues we encountered during development testing.

If you’d like to learn more:

We offer a full online BatteryXPRT training course that covers almost every aspect of the benchmark. You can view the sections in order or jump to the parts that interest you. We guarantee that you’ll learn something new!

BatteryXPRT 2014 for Android Training Course

If you’d like to dig into the details:

Check out the Exploring BatteryXPRT 2014 for Android white paper. In it, we discuss the app’s development and structure. We also describe the component tests; explain the differences between the test’s Airplane, Wi-Fi, and Cellular modes; and detail the statistical processes we use to calculate expected battery life.

If you’d like to dig even deeper, the BatteryXPRT source code is available to members of the BenchmarkXPRT Development Community, so consider joining today. Membership is free for members of any company or organization with an interest in benchmarks, and there are no obligations after joining.

If you haven’t used BatteryXPRT before, try it out and let us know what you think!

Justin

Another pronunciation lesson

Knowing how to say the terms we read online can be a bit of a mystery. For example, it’s been 30 years since CompuServe created the GIF format, and people are still arguing about how to say it.

A couple of months ago, we talked about how to pronounce WebXPRT. In the video we pointed to, the narrator openly says he’s confused about how to say “XPRT.” For the record, it’s pronounced “expert.”

Recently, we came across another video, which referred to CrXPRT. The narrator pronounced it “Chrome expert.” The “expert” part is correct, but the “Chrome” part is not. It’s an understandable mistake, because Cr is the chemical symbol for chromium. That’s why we chose it! However, we pronounce the C and R individually. So, the name is said “C R expert.”

All that being said, it was great to see CrXPRT in the classroom! When we created CrXPRT, the education market was a big consideration, as you can tell from this CrXPRT video. We love seeing the XPRTs in the real world!

Eric
