
Category: Chrome OS

Learning something new every day

We’re constantly learning and thinking about how the XPRTs can help people evaluate the tech that will soon be a part of daily life. It’s why we started work on a tool to evaluate machine learning capabilities, and it’s why we developed CrXPRT in response to Chromebooks’ growing share of the education sector.

The learning process often involves a lot of tinkering in the lab, and we recently began experimenting with Neverware’s CloudReady OS. CloudReady is an operating system based on the open-source Chromium OS. Unlike Chrome OS, which runs only on Chromebooks, CloudReady can run on many types of systems, including older Windows and OS X machines. The idea is that individuals and organizations can breathe new life into aging hardware by incorporating it into a larger pool of devices managed through a Google Admin Console.

We were curious to see whether it worked as advertised, and whether it would run CrXPRT 2015. Installing CloudReady on an old Dell Latitude E6430 was easy enough, and we then installed CrXPRT from the Chrome Web Store. Performance tests ran without a hitch. Battery life tests would kick off but not complete, which was not a big surprise, because the battery life calls CrXPRT uses were developed specifically for Chrome OS.

So, what role can CrXPRT play with CloudReady, and what are the limitations? CloudReady has a lot in common with Chrome OS, but there are some key differences. One way we see the CrXPRT performance test being useful is for comparing CloudReady devices. Suppose an organization is considering adopting CloudReady on certain legacy systems but not on others; CrXPRT performance scores would provide insight into which devices perform better with CloudReady. While you could use CrXPRT to compare those devices to Chromebooks, the differences between the operating systems are significant enough that we cannot guarantee the comparison would be a fair one.

Have you spent any time working with CloudReady, or are there other interesting new technologies you’d like us to investigate? Let us know!

Justin

Another pronunciation lesson

Knowing how to say the terms we read online can be a bit of a mystery. For example, it’s been 30 years since CompuServe created the GIF format, and people are still arguing about how to say it.

A couple of months ago, we talked about how to pronounce WebXPRT. In the video we pointed to, the narrator admits he’s unsure how to say “XPRT.” For the record, it’s pronounced “expert.”

Recently, we came across another video, which referred to CrXPRT. The narrator pronounced it “Chrome expert.” The “expert” part is correct, but the “Chrome” part is not. It’s an understandable mistake, because Cr is the chemical symbol for chromium. That’s why we chose it! However, we pronounce the C and R individually. So, the name is said “C R expert.”

All that being said, it was great to see CrXPRT in the classroom! When we created CrXPRT, the education market was a big consideration, as you can tell from this CrXPRT video. We love seeing the XPRTs in the real world!

Eric

CrXPRT: a valuable tool for evaluating Chromebooks

Last week, we reintroduced readers to TouchXPRT 2016. This week, we invite you to get to know CrXPRT, an app for Chrome OS devices.

When you buy a Chromebook, it’s important to know how long the battery will last on a typical day and how well it can handle everyday tasks. We developed CrXPRT 2015 to help answer those questions. CrXPRT measures how fast a Chromebook handles everyday tasks such as playing video games, watching movies, editing pictures, and doing homework, and it also measures battery life. The performance test, which measures the speed of your Chromebook, gives you individual workload scores and an overall score. The battery life test produces an estimated battery life time, a separate performance score, and a frames-per-second (FPS) rate for a built-in HTML5 gaming component.
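
For those curious about how individual workload results might roll up into an overall score, here’s a minimal TypeScript sketch of one common approach: normalize each workload’s completion time against a calibration baseline and combine the ratios with a geometric mean. The workload names, times, and scoring model below are our illustrative assumptions, not CrXPRT’s actual calibration data or formulas; the Exploring CrXPRT 2015 white paper describes the real calculations.

```typescript
// Illustrative only: workload names, times, and the scoring model here are
// assumptions for the sake of example, not CrXPRT's actual data or formulas.

interface WorkloadResult {
  name: string;
  seconds: number;          // measured completion time on the test device
  baselineSeconds: number;  // completion time on a calibration device
}

// Normalize each workload (baseline / measured) so faster devices score
// higher, then combine the ratios with a geometric mean so that no single
// workload dominates the overall score.
function overallScore(results: WorkloadResult[], scale = 100): number {
  const logSum = results.reduce(
    (sum, r) => sum + Math.log(r.baselineSeconds / r.seconds),
    0,
  );
  return scale * Math.exp(logSum / results.length);
}

const results: WorkloadResult[] = [
  { name: 'Photo Effects', seconds: 4.2, baselineSeconds: 5.0 },
  { name: 'Offline Notes', seconds: 1.8, baselineSeconds: 2.0 },
  { name: 'DNA Sequence Analysis', seconds: 9.5, baselineSeconds: 10.0 },
];

console.log(overallScore(results).toFixed(0)); // higher is faster
```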

CrXPRT completes the battery life evaluation in half a workday and delivers results you can understand. Before CrXPRT, you had to rely on the manufacturer’s performance claims and estimated battery life. Now, CrXPRT provides an objective evaluation tool that anyone interested in Chromebooks can easily use. To learn more about CrXPRT, check out the links below.

Watch CrXPRT in action:

[Video: CrXPRT in action]

To test your Chromebook’s performance or battery life:

Simply download CrXPRT from the Chrome Web Store. Installation is quick and easy, and the CrXPRT 2015 user manual provides step-by-step instructions. A typical performance test takes about 15 minutes, and a battery life test will take 3.5 hours once the system is charged and configured for testing. If you’d like to see how your score compares to other Chromebooks, visit the CrXPRT results page.

If you’d like to dig into the details:

Read the Exploring CrXPRT 2015 white paper. In it, we discuss the concepts behind CrXPRT 2015, its development process, and the application’s structure. We also describe the component tests and explain the statistical processes used to calculate expected battery life.
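
As a rough sketch of the core idea (the white paper covers the real statistical model), a fixed-length battery test can extrapolate full-battery life from how much charge the run consumed. The function below is our illustration, not CrXPRT code:

```typescript
// Illustrative only: our sketch of the extrapolation idea, not CrXPRT's
// actual statistical model (see the Exploring CrXPRT 2015 white paper).

function estimatedBatteryLifeHours(
  testHours: number,          // length of the test run, e.g., 3.5
  startChargePercent: number, // battery level when the test began
  endChargePercent: number,   // battery level when the test ended
): number {
  const drained = startChargePercent - endChargePercent;
  if (drained <= 0) {
    throw new Error('Battery level did not drop during the test.');
  }
  // If the run drained `drained` percent in `testHours`, a full 100%
  // charge should last proportionally longer.
  return testHours * (100 / drained);
}

// A 3.5-hour run that takes the battery from 100% to 65% suggests
// roughly 10 hours of battery life.
console.log(estimatedBatteryLifeHours(3.5, 100, 65).toFixed(1)); // "10.0"
```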

BenchmarkXPRT Development Community members also have access to the CrXPRT source code, so if you’re interested, join today! There’s no obligation, and membership is free for anyone from a company or organization with an interest in benchmarks.

If you have a Chromebook you’d like to evaluate, give CrXPRT a try and let us know what you think!

Justin

Running Android-oriented XPRTs on Chrome OS

Since last summer, we’ve been following Google’s progress in bringing Android apps and the Google Play store to Chromebooks, along with its plan to gradually phase out support for Chrome apps over the next few years. Because we currently offer apps that assess battery life and performance for Android devices (BatteryXPRT and MobileXPRT) and Chromebooks (CrXPRT), the way this situation unfolds could affect the makeup of the XPRT portfolio in the years to come.

For now, we’re experimenting to see how well the Android app/Chrome OS merger is working with the devices in our lab. One test case is the Samsung Chromebook Plus, which we featured in the XPRT Weekly Tech Spotlight a few weeks ago. Normally, we would publish only CrXPRT and WebXPRT results for a Chromebook, but installing and running MobileXPRT 2015 from the Google Play store was such a smooth and error-free process that we decided to publish the first MobileXPRT score for a device running Chrome OS.

We also tried running BatteryXPRT on the Chromebook Plus, but even though the installation was quick and easy and the test kicked off without a hitch, we could not generate a valid result. Typically, the test would complete several iterations successfully, but terminate before producing a result. We’re investigating the problem, and will keep the community up to date on what we find.

In the meantime, we continue to recommend that Chromebook testers use CrXPRT for performance and battery life assessment. While we haven’t encountered any issues running MobileXPRT 2015 on Chromebooks, CrXPRT has a proven track record.

If you have any questions about running Android-oriented XPRTs on Chrome OS, or insights that you’d like to share, please let us know.

Justin

Digging deeper

From time to time, we like to revisit the fundamentals of the XPRT approach to benchmark development. Today, we’re discussing the need for testers and benchmark developers to consider the multiple factors that influence benchmark results. For every device we test, all of its hardware and software components have the potential to affect performance, and changing the configuration of those components can significantly change results.

For example, we frequently see significant performance differences between different browsers on the same system. In our recent recap of the XPRT Weekly Tech Spotlight’s first year, we highlighted an example of how testing the same device with the same benchmark can produce different results, depending on the software stack under test. In that instance, the Alienware Steam Machine entry included a WebXPRT 2015 score for each of the two browsers that consumers were likely to use. The first score (356) represented the SteamOS browser app in the SteamOS environment, and the second (441) represented the Iceweasel browser (a Firefox variant) in the Linux-based desktop environment. Including only the first score would have given readers an incomplete picture of the Steam Machine’s web-browsing capabilities, so we thought it was important to include both.
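
For a quick sense of how large that gap is, the arithmetic below (a small TypeScript snippet using the two published scores) shows Iceweasel outscoring the SteamOS browser app by roughly 24 percent:

```typescript
// The two WebXPRT 2015 scores published for the Alienware Steam Machine
// (higher is better).
const steamOsBrowserScore = 356; // SteamOS browser app
const iceweaselScore = 441;      // Iceweasel in the Linux-based desktop mode

const pctFaster =
  ((iceweaselScore - steamOsBrowserScore) / steamOsBrowserScore) * 100;
console.log(`Iceweasel scored ${pctFaster.toFixed(0)}% higher.`); // ~24%
```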

We also see performance differences between different versions of the same browser, a fact especially relevant to those who use frequently updated browsers, such as Chrome. Even benchmarks that measure the same general area of performance (web browsing, for example) usually test very different things.

OS updates can also affect performance. Consumers might base a purchase on performance or battery life scores and end up with a device that behaves very differently after updating to a new version of Android or iOS, for example.

Other important factors in the software stack include pre-installed software, commonly referred to as bloatware, and the proliferation of apps that sap performance and battery life.

This is a much larger topic than we can cover in the blog. Let the examples we’ve mentioned remind you to think critically about, and dig deeper into, benchmark results. If we see published XPRT scores that differ significantly from our own results, our first question is always “What’s different between the two devices?” Most of the time, the answer becomes clear as we compare hardware and software from top to bottom.

Justin

Celebrating one year of the XPRT Weekly Tech Spotlight

It’s been just over a year since we launched the XPRT Weekly Tech Spotlight by featuring our first device, the Google Pixel C. Spotlight has since become one of the most popular items at BenchmarkXPRT.com, and we thought now would be a good time to recap the past year, offer more insight into the choices we make behind the scenes, and look at what’s ahead for Spotlight.

The goal of Spotlight is to provide PT-verified specs and test results that can help consumers make smart buying decisions. We try to include a wide variety of device types, vendors, software platforms, and price points in our inventory. The devices also tend to fall into one of two main groups: popular new devices generating a lot of interest and devices that have unique form factors or unusual features.

To date, we’ve featured 56 devices: 16 phones, 11 laptops, 10 two-in-ones, 9 tablets, 4 consoles, 3 all-in-ones, and 3 small-form-factor PCs. The operating systems these devices run include Android, Chrome OS, iOS, macOS, OS X, Windows, and an array of vendor-specific OS variants and skins.

As much as possible, we test using out-of-the-box (OOB) configurations. We want to present test results that reflect what everyday users will experience on day one. Depending on the vendor, the OOB approach can mean that some devices arrive bogged down with bloatware while others are relatively clean. We don’t attempt to “fix” anything in those situations; we simply test each device “as is” when it arrives.

If devices arrive with outdated OS versions (as is often the case with Chromebooks), we update to current versions before testing, because that’s the best reflection of what everyday users will experience. In the past, that approach would’ve been more complicated with Windows systems, but the Microsoft shift to “Windows as a service” ensures that most users receive significant OS updates automatically by default.

The OOB approach also means that the WebXPRT scores we publish reflect the performance of each device’s default browser, even if it’s possible to install a faster browser. Our goal isn’t to perform a browser shootout on each device, but to give an accurate snapshot of OOB performance. For instance, last week’s Alienware Steam Machine entry included two WebXPRT scores: a 356 on the SteamOS browser app and a 441 on Iceweasel 38.8.0 (a Firefox variant used in the device’s Linux-based desktop mode). That’s a significant difference, but the main question for us was which browser was more likely to be used in an OOB scenario. With the Steam Machine, the answer was truly “either one.” Many users will use the browser app in the SteamOS environment, and many will take the few steps needed to access the desktop environment. In that case, even though one browser was significantly faster than the other, choosing to omit one score in favor of the other would have excluded results from an equally likely OOB environment.

We’re always looking for ways to improve Spotlight. We recently began including more photos for each device, including ones that highlight important form-factor elements and unusual features. Moving forward, we plan to expand Spotlight’s offerings to include automatic score comparisons, additional system information, and improved graphical elements. Most importantly, we’d like to hear your thoughts about Spotlight. What devices and device types would you like to see? Are there specs that would be helpful to you? What can we do to improve Spotlight? Let us know!

Justin
