Category: Google Chrome

CrXPRT helps to navigate the changing Chromebook market

Some people envision Chromebooks as low-end, plastic-shelled laptops that large organizations buy in bulk because they’re inexpensive and easy to manage. While many sub-$200 Chromebooks are still available, the platform is no longer limited to budget chipsets and minimal memory. Consumers can now choose systems that offer up to 16 GB of RAM and 8th-generation Intel Core CPUs, including Core i7 configurations for those willing to pay around $1,600. In addition, some Chromebooks can now run Android apps, Microsoft Office mobile apps, Linux apps, and even Windows apps. While Chromebooks still depend heavily on connectivity and cloud storage, an increasing number of Chrome apps let you perform substantial productivity tasks offline. The Chrome OS landscape has changed so much that, for certain use cases, the practical hardware gap between Chromebooks and traditional laptops is narrowing.

More consumers might be interested in Chromebooks than was the case a few years ago, but how do they make sense of all the devices on the market? CrXPRT can help by providing objective data on Chromebook performance and battery life. Steven J. Vaughan-Nichols offered a great example of the value CrXPRT can provide in his recent ZDNet article on the new Core i7-based Google Pixelbook. The Pixelbook’s CrXPRT score of 226 showed that it performs everyday tasks faster than any of the Chromebooks in our results database. When trying to decide whether it’s worth spending a few hundred or even a thousand dollars more on a new Chromebook, having the right data in hand can transform guesses into well-informed decisions.

You don’t have to be a tech journalist or even a techie to use CrXPRT. If you’d like to learn more, we encourage you to read the CrXPRT feature here in the blog or visit CrXPRT.com.

Justin

WebXPRT and user-agent strings

After running WebXPRT in Microsoft Edge, a tester recently asked why the browser information field on the results page displayed “Chrome 52 – Edge 15.15063.” It’s a good question: why would the benchmark report Chrome 52 when Microsoft Edge is the browser under test? The answer lies in understanding user-agent strings and the way that WebXPRT gathers specific bits of information.

When browsers request a web page from a hosting server, they send an array of basic header information that allows the server to determine the client’s capabilities and the best way to provide the requested content. One of these headers, the user-agent, presents a string of tokens that provide information about the application making the request, the operating system and version, rendering engine compatibility, and browser platform details. In effect, the user-agent string is a way for a browser to tell the hosting server the full extent of its capabilities.
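As a concrete illustration, here’s a minimal Node.js sketch (a hypothetical server of our own, not WebXPRT code) that logs the User-Agent header each incoming request carries:

```javascript
// Minimal Node.js server (hypothetical example) that logs the User-Agent
// header a browser sends with every request.
const http = require('http');

http.createServer((req, res) => {
  // Node lowercases incoming header names, so 'user-agent' holds the string.
  console.log('User-Agent:', req.headers['user-agent']);
  res.end('OK');
}).listen(8080);
```

Point any browser at http://localhost:8080 and the server prints the full user-agent string that browser presents.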

When WebXPRT attempts to identify a browser, it references the browser token in the user-agent string.

The process is generally straightforward, but in some cases, browsers spoof information from other browsers in their user-agent strings, which makes accurate browser detection difficult. The reasons for this are complex, but they involve web development practices and the fact that some web pages are not designed to recognize and work well with new or less-popular browsers. When we released WebXPRT 2015, Microsoft Edge was new. The Edge team wanted to make sure that as much advanced web content as possible would be available to Edge users, so they created a user-agent string that identifies Edge as several different browsers at once.

I can see this in action if I check Edge’s user-agent string on my system. Currently, it reports “Mozilla/5.0 (Windows NT 10.0; Win64; x64; ServiceUI 9) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36 Edge/15.15063.” So, because of the way Edge’s user-agent string is constructed, and the way WebXPRT parses that information, the browser information field on WebXPRT’s results page will report “Chrome 52 – Edge 15.15063” on my system.
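To see how that plays out, here’s a simplified detection sketch (our illustration, not WebXPRT’s actual parsing code):

```javascript
// Simplified browser-token matching (illustrative only, not WebXPRT code).
// Edge's user-agent string contains both Chrome and Edge tokens.
const ua = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64; ServiceUI 9) ' +
  'AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 ' +
  'Safari/537.36 Edge/15.15063';

const tokens = ua.match(/(Chrome|Edge)\/[\d.]+/g);
console.log(tokens); // [ 'Chrome/52.0.2743.116', 'Edge/15.15063' ]
// A detector that keys on the Chrome token reports "Chrome 52" even though
// the browser under test is Edge 15.
```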

To check the user-agent string on your own system, in Edge, select the ellipsis (…) icon in the top right-hand corner, then select F12 Developer Tools. Next, select the Console tab, and run “javascript:alert(navigator.userAgent)”. A popup window will display the user-agent string.
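If you’d rather avoid the popup, logging the value in the console works just as well:

```javascript
// In any browser's developer console, this prints the user-agent string.
console.log(navigator.userAgent);
```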

You can find instructions for finding the user-agent string in other browsers here: http://techdows.com/2016/07/edge-ie-chrome-firefox-user-agent-strings.html.

In the next version of WebXPRT, we’ll work to refine the way that the test parses user-agent strings, and provide more accurate system information for testers. If you have any questions or suggestions regarding this topic, let us know!

Justin

Apples and pears vs. oranges and bananas

When people talk about comparing disparate things, they often say that you’re comparing apples and oranges. However, sometimes that expression doesn’t begin to describe the situation.

Recently, Justin wrote about using CrXPRT on systems running Neverware CloudReady OS. In that post, he noted that we couldn’t guarantee that using CrXPRT on CloudReady and Chrome OS systems would be a fair comparison. Not surprisingly, that prompted the question “Why not?”

Here’s the thing: It’s a fair comparison of those software stacks running on those hardware configurations. If everyone accepted that and stopped there, all would be good. However, almost inevitably, people will read more into the scores than is appropriate.

In such a comparison, we’re changing multiple variables at once. We’ve written before about the effect of the software stack on performance. CloudReady and Chrome OS are two different implementations of the Chromium OS, and it’s possible that one is more efficient than the other. If so, that would affect CrXPRT scores. At the same time, the raw performance of the two hardware configurations under test could also differ, which would likewise affect CrXPRT scores.

Here’s a metaphor: If you measure the effective force at the end of two levers and find a difference, to what do you attribute that difference? If you know the levers are the same length, you can attribute the difference to the amount of applied force. If you know the applied force is identical, you can attribute the difference to the length of the levers. If you lack both of those data points, you can’t know whether the difference is due to the length, the force, or a combination of the two.
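To put the metaphor in equation form (a generic ideal-lever relation, nothing specific to benchmarking):

```latex
% Ideal lever: the output force depends on both the applied force and the
% ratio of the effort arm to the load arm.
F_{\text{out}} = F_{\text{in}} \cdot \frac{L_{\text{effort}}}{L_{\text{load}}}
```

A difference in the measured output force could come from the applied force, from the arm ratio, or from both; unless you hold one factor constant, you can’t attribute the change to the other.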

With a benchmark, you can run multiple experiments designed to isolate variables and use the results from those experiments to look for trends. For example, we could install both CloudReady OS and Chrome OS on the same Intel-based Chromebook and compare the CrXPRT results. Because that removes hardware differences as a variable, such an experiment would offer some insight into how the two implementations compare. However, because differences in hardware can affect the performance of a given piece of software, this single data point would be of limited value. We could repeat the experiment on a variety of other Intel-based Chromebooks, and other patterns might emerge. If one of the implementations consistently scored higher, that would suggest that it was more efficient than the other, but it still would not be conclusive.
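As a toy illustration of that paired-experiment logic, here’s a sketch with made-up numbers (ours, for illustration only, not real measurements):

```javascript
// Hypothetical paired results: the same Chromebook model tested under both
// OS implementations. The scores below are invented for illustration.
const pairedScores = [
  { device: 'Chromebook A', chromeOS: 100, cloudReady: 104 },
  { device: 'Chromebook B', chromeOS: 80,  cloudReady: 84 },
  { device: 'Chromebook C', chromeOS: 120, cloudReady: 125 },
];

// Because each pair shares identical hardware, the per-device ratio isolates
// the software stack. A consistently greater-than-1 ratio would suggest (but
// not prove) that one implementation is more efficient than the other.
for (const r of pairedScores) {
  console.log(r.device, (r.cloudReady / r.chromeOS).toFixed(2));
}
```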

I hope this gives you some idea about why we are cautious about drawing conclusions when comparing results from different sets of hardware running different software stacks.

Eric

Another pronunciation lesson

Knowing how to say the terms we read online can be a bit of a mystery. For example, it’s been 30 years since CompuServe created the GIF format, and people are still arguing about how to say it.

A couple of months ago, we talked about how to pronounce WebXPRT. In the video we pointed to, the narrator openly says he’s confused about how to say “XPRT.” For the record, it’s pronounced “expert.”

Recently, we came across another video that referred to CrXPRT. The narrator pronounced it “Chrome expert.” The “expert” part is correct, but the “Chrome” part is not. It’s an understandable mistake, because Cr is the chemical symbol for chromium. That’s why we chose it! However, we pronounce the C and R individually, so the name is said “C-R expert.”

All that being said, it was great to see CrXPRT in the classroom! When we created CrXPRT, the education market was a big consideration, as you can tell from this CrXPRT video. We love seeing the XPRTs in the real world!

Eric

CrXPRT: a valuable tool for evaluating Chromebooks

Last week, we reintroduced readers to TouchXPRT 2016. This week, we invite you to get to know CrXPRT, an app for Chrome OS devices.

When you buy a Chromebook, it’s important to know how long the battery will last on a typical day and how well it can handle everyday tasks. We developed CrXPRT 2015 to help answer those questions by measuring both performance and battery life. The performance test measures how fast a Chromebook handles everyday tasks such as playing video games, watching movies, editing pictures, and doing homework, and it gives you individual workload scores and an overall score. The battery life test produces an estimated battery life, a separate performance score, and a frames-per-second (FPS) rate for a built-in HTML5 gaming component.

CrXPRT completes the battery life evaluation in half a workday and delivers results you can understand. Before CrXPRT, you had to rely on the manufacturer’s performance claims and estimated battery life. Now, CrXPRT provides an objective evaluation tool that’s easy to use for anyone interested in Chromebooks. To learn more about CrXPRT, check out the links below.

Watch CrXPRT in action:

[Video: CrXPRT in action]

To test your Chromebook’s performance or battery life:

Simply download CrXPRT from the Chrome Web Store. Installation is quick and easy, and the CrXPRT 2015 user manual provides step-by-step instructions. A typical performance test takes about 15 minutes, and a battery life test takes about 3.5 hours once the system is charged and configured for testing. If you’d like to see how your score compares to those of other Chromebooks, visit the CrXPRT results page.

If you’d like to dig into the details:

Read the Exploring CrXPRT 2015 white paper. In it, we discuss the concepts behind CrXPRT 2015, its development process, and the application’s structure. We also describe the component tests and explain the statistical processes used to calculate expected battery life.

BenchmarkXPRT Development Community members also have access to the CrXPRT source code, so if you’re interested, join today! There’s no obligation, and membership is free for anyone at a company or organization with an interest in benchmarks.

If you have a Chromebook you’d like to evaluate, give CrXPRT a try and let us know what you think!

Justin

Running Android-oriented XPRTs on Chrome OS

Since last summer, we’ve been following Google’s progress in bringing Android apps and the Google Play store to Chromebooks, along with its plan to gradually phase out support for Chrome apps over the next few years. Because we currently offer apps that assess battery life and performance for Android devices (BatteryXPRT and MobileXPRT) and Chromebooks (CrXPRT), the way this situation unfolds could affect the makeup of the XPRT portfolio in the years to come.

For now, we’re experimenting to see how well the Android app/Chrome OS merger is working with the devices in our lab. One test case is the Samsung Chromebook Plus, which we featured in the XPRT Weekly Tech Spotlight a few weeks ago. Normally, we would publish only CrXPRT and WebXPRT results for a Chromebook, but installing and running MobileXPRT 2015 from the Google Play store was such a smooth and error-free process that we decided to publish the first MobileXPRT score for a device running Chrome OS.

We also tried running BatteryXPRT on the Chromebook Plus, but even though the installation was quick and easy and the test kicked off without a hitch, we could not generate a valid result. Typically, the test would complete several iterations successfully, but terminate before producing a result. We’re investigating the problem, and will keep the community up to date on what we find.

In the meantime, we continue to recommend that Chromebook testers use CrXPRT for performance and battery life assessment. While we haven’t encountered any issues running MobileXPRT 2015 on Chromebooks, CrXPRT has a proven track record.

If you have any questions about running Android-oriented XPRTs on Chrome OS, or insights that you’d like to share, please let us know.

Justin
