
BatteryXPRT provides the objective battery life data that shoppers need

Over the last few weeks, we’ve discussed the capabilities and benefits of TouchXPRT and CrXPRT. This week, we’d like to reintroduce readers to BatteryXPRT, our app that evaluates the battery life and performance of Android devices.

Battery life for phones and tablets has improved dramatically over the last several years, to the point where many devices can support continuous use for well over a full work day on a single charge. This improvement is the result of advances in battery hardware technology, increased processor efficiency, and smarter utilization of software services by the operating system. Battery life has improved to some extent across most device categories and price points. However, enough variation remains among devices at each level that objective battery life data is still valuable for shoppers.

Without BatteryXPRT, shoppers must rely on manufacturer estimates or full rundown tests that don’t resemble the types of things we do with our phones and tablets every day. A rundown test that surfs the web continuously for over 15 hours reveals which devices last the longest performing that specific task. It doesn’t tell you which devices last the longest over a full day performing a variety of common activities such as web browsing, watching videos, browsing and editing photos, playing music, and periodically sleeping. During BatteryXPRT’s battery life test, the app executes those same types of tasks and produces a performance score based on the speed with which a device completes each task.
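To make that flow concrete, here’s a minimal sketch of how a mixed-workload battery benchmark of this general kind might time each task and track battery drain. The workload and battery-reading functions are invented placeholders, and this is not BatteryXPRT’s actual code.

```kotlin
import kotlin.system.measureTimeMillis

// Placeholder workload and battery reading, invented for illustration only.
fun runWorkload(name: String) { Thread.sleep(200) } // stand-in for real work
fun readBatteryPercent(): Double = 100.0            // stand-in for a real sensor read

fun main() {
    val tasks = listOf("web browsing", "video playback", "photo editing", "music playback")
    val startPercent = readBatteryPercent()
    val taskTimes = mutableMapOf<String, Long>()

    for (task in tasks) {
        taskTimes[task] = measureTimeMillis { runWorkload(task) }
    }

    // Faster task completion yields a higher performance score.
    val performanceScore = taskTimes.values.sumOf { 1_000.0 / it }
    val drainedPercent = startPercent - readBatteryPercent()

    taskTimes.forEach { (task, ms) -> println("$task: $ms ms") }
    println("Performance score: %.2f; battery drained: %.1f%%".format(performanceScore, drainedPercent))
}
```

The key idea is that a single run yields two results: the elapsed time per task feeds a performance score, while the drop in charge feeds the battery life estimate.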

BatteryXPRT provides an intuitive user interface in English and Simplified Chinese, and easy-to-understand results for both battery life and performance. Because your data connection can have a significant effect on battery life, BatteryXPRT can run in airplane mode, over a Wi-Fi connection to the Internet, or over a cellular data connection.
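For readers curious how an Android app can tell which of those three states is active, here’s a brief sketch using standard Android APIs. It illustrates the general approach only; it is not BatteryXPRT’s detection logic.

```kotlin
import android.content.Context
import android.net.ConnectivityManager
import android.net.NetworkCapabilities
import android.provider.Settings

// Illustrative only; not BatteryXPRT's actual detection code.
fun connectionMode(context: Context): String {
    // Airplane mode is a global setting rather than a network property.
    val airplaneOn = Settings.Global.getInt(
        context.contentResolver, Settings.Global.AIRPLANE_MODE_ON, 0) != 0
    if (airplaneOn) return "Airplane mode"

    val cm = context.getSystemService(Context.CONNECTIVITY_SERVICE) as ConnectivityManager
    val caps = cm.getNetworkCapabilities(cm.activeNetwork) ?: return "No active connection"
    return when {
        caps.hasTransport(NetworkCapabilities.TRANSPORT_WIFI) -> "Wi-Fi"
        caps.hasTransport(NetworkCapabilities.TRANSPORT_CELLULAR) -> "Cellular data"
        else -> "Other"
    }
}
```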

BatteryXPRT is easy to install and run, and it’s a great resource for anyone who wants to evaluate how well an Android device will meet their needs. If you’d like to see test results, go to BatteryXPRT.com and click View Results, where you’ll find scores from many different Android devices.

If you’d like to run BatteryXPRT

Simply download BatteryXPRT from the Google Play Store or BatteryXPRT.com. The BatteryXPRT installation instructions and user manual provide step-by-step instructions for configuring your device and kicking off a test. We designed BatteryXPRT to be compatible with a wide variety of Android devices, but because there are so many devices on the market, it is inevitable that users occasionally run into problems. In the Tips, tricks, and known issues document, we provide troubleshooting suggestions for issues we encountered during development testing.

If you’d like to learn more

The Exploring BatteryXPRT 2014 for Android white paper covers almost every aspect of the benchmark. In it, we explain the guiding concepts behind BatteryXPRT’s development, as well as the benchmark’s structure. We describe the component tests, the differences between the app’s Airplane and Network/Wi-Fi modes, and the statistical processes used to calculate expected battery life.
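As a rough illustration of the core idea, expected battery life can be linearly extrapolated from the drain observed over a test period. This simplified sketch is our own; the benchmark’s actual calculation, described in the white paper, is more statistically involved.

```kotlin
// A deliberately simplified linear extrapolation. The benchmark's real
// calculation, described in the white paper, is more statistically involved.
fun estimateBatteryLifeHours(testHours: Double, startPercent: Double, endPercent: Double): Double {
    val drained = startPercent - endPercent
    require(drained > 0) { "No measurable drain over the test period" }
    return testHours * (100.0 / drained)
}

fun main() {
    // Hypothetical numbers: a 5.25-hour test that drains the battery from 100% to 65%.
    println("Estimated battery life: %.1f hours".format(estimateBatteryLifeHours(5.25, 100.0, 65.0)))
    // Prints: Estimated battery life: 15.0 hours
}
```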

Justin

A new BatteryXPRT 2014 for Android build is available

In last week’s blog, we discussed why we now consider full BatteryXPRT rundown tests to be the most accurate and why we’re releasing a new build (v110) that increases the default BatteryXPRT test from 5.25 hours (seven iterations) to 45 hours (60 iterations). Because each iteration runs for about 45 minutes, 60 iterations take roughly 45 hours. We also built v110 using Android SDK 27 to bring BatteryXPRT up to date with current Android standards. Today, we’ve posted the new build on BatteryXPRT.com and in the Google Play Store, and we’ve also published an updated user manual. Please contact us if you have any questions about BatteryXPRT testing.

Justin

Planning the next version of MobileXPRT

We’re in the early planning stages for the next version of MobileXPRT, and invite you to send us any suggestions you may have. What do you like or not like about MobileXPRT? What features would you like to see in a new version?

When we begin work on a new version of any XPRT, one of the first steps we take is to assess the benchmark’s workloads to determine whether they will provide value during the years ahead. This step almost always involves updating test content such as photos and videos to more contemporary file resolutions and sizes, and it can also involve removing workloads or adding completely new scenarios. MobileXPRT currently includes five performance scenarios (Apply Photo Effects, Create Photo Collages, Create Slideshow, Encrypt Personal Content, and Detect Faces to Organize Photos). Should we stick with these five or investigate other use cases? What do you think?

As we did with WebXPRT 3 and the upcoming HDXPRT 4, we’re also planning to update the MobileXPRT UI to improve the look of the benchmark and make it easier to use.

Crucially, we’ll also build the app using the most current Android SDK. Android development has changed significantly since we released MobileXPRT 2015, and apps must now conform to stricter standards that require explicit user permission for many tasks. Navigating these changes shouldn’t be too difficult, but it’s always possible that we’ll encounter unforeseen challenges along the way.
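For example, writing test output to shared storage now requires a runtime permission check on Android 6.0 and later. The snippet below illustrates that general pattern; it is not code from MobileXPRT.

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

class ExampleActivity : AppCompatActivity() {
    private val storageRequestCode = 42 // arbitrary identifier for the result callback

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Since Android 6.0 (API 23), "dangerous" permissions must be granted
        // by the user at runtime, not just declared in the manifest.
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE)
            != PackageManager.PERMISSION_GRANTED
        ) {
            ActivityCompat.requestPermissions(
                this,
                arrayOf(Manifest.permission.WRITE_EXTERNAL_STORAGE),
                storageRequestCode
            )
        }
    }
}
```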

Do you have suggestions for test scenarios that we should consider for MobileXPRT? Are there existing features we should remove? Are there elements of the UI that you find especially useful, or do you have ideas for improving it? Please let us know. We want to hear from you and make sure that MobileXPRT continues to meet your needs.

Justin

More on the way for the XPRT Weekly Tech Spotlight

In the coming months, we’ll continue to add more devices and helpful features to the XPRT Weekly Tech Spotlight. We’re especially interested in adding data points and visual aids that make it easier to quickly understand the context of each device’s test scores. For instance, those of us who are familiar with WebXPRT 3 scores know that an overall score of 250 is pretty high, but site visitors who are unfamiliar with WebXPRT probably won’t know how that score compares to scores for other devices.

We designed Spotlight to be a source of objective data, in contrast to sites that provide subjective ratings for devices. As we pursue our goal of helping users make sense of scores, we want to maintain this objectivity and avoid presenting information in ways that could be misleading.

Introducing comparison aids to the site is forcing us to make some tricky decisions. Because we value input from XPRT community members, we’d love to hear your thoughts on one of the questions we’re facing: How should our default view present a device’s score?

We see three options:

1) Present the device’s score in relation to the overall high and low scores for that benchmark across all devices.
2) Present the device’s score in relation to the overall high and low scores for that benchmark across the broad category of devices to which that device belongs (e.g., phones).
3) Present the device’s score in relation to the overall high and low scores for that benchmark across a narrower sub-category of devices to which that device belongs (e.g., high-end flagship phones).

To think this through, consider WebXPRT, which runs on desktops, laptops, phones, tablets, and other devices. Typically, the WebXPRT scores for phones and tablets are lower than scores for desktop and laptop systems. The first approach helps to show just how fast high-end desktops and laptops handle the WebXPRT workloads, but it could make a phone or tablet look slow, even if its score was good for its category. The second approach would prevent unfair default comparisons between different device types but would still present comparisons between devices that are not true competitors (e.g., flagship phones vs. budget phones). The third approach is the most careful, but would introduce an element of subjectivity because determining the sub-category in which a device belongs is not always clear-cut.
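One way to compare the options is to place a device’s score on a 0-to-1 scale between the low and high scores of whichever comparison group we choose. This hypothetical sketch, with invented device names and scores, shows how the choice of group changes the result.

```kotlin
// Hypothetical device records; names, categories, and scores are invented.
data class DeviceScore(val name: String, val category: String, val score: Double)

// Place a device's score between its comparison group's low and high scores,
// from 0.0 (lowest in group) to 1.0 (highest in group).
fun relativePosition(device: DeviceScore, group: List<DeviceScore>): Double {
    val scores = group.map { it.score }
    val low = scores.minOrNull() ?: return 0.0
    val high = scores.maxOrNull() ?: return 0.0
    return if (high == low) 1.0 else (device.score - low) / (high - low)
}

fun main() {
    val all = listOf(
        DeviceScore("Phone A", "phone", 90.0),
        DeviceScore("Phone B", "phone", 150.0),
        DeviceScore("Laptop C", "laptop", 250.0)
    )
    val phoneB = all[1]
    // Option 1: compare against every device; the laptop's high score drags Phone B down.
    println("vs. all devices: %.2f".format(relativePosition(phoneB, all)))    // 0.38
    // Option 2: compare only within the phone category.
    val phones = all.filter { it.category == phoneB.category }
    println("vs. phones only: %.2f".format(relativePosition(phoneB, phones))) // 1.00
}
```

Under the first approach, Phone B’s good score looks middling because the laptop sets the high mark; under the second, it tops its category.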

Do you have thoughts on this subject, or recommendations for Spotlight in general? If so, let us know.

Justin
