
Exploring virtual reality

We’ve talked a lot in recent weeks about new technologies we are evaluating for the XPRTs. You may remember that back in June, we also wrote about sponsoring a second senior project with North Carolina State University. Last week, the project ended with the traditional Posters and Pies event. The team gave a very well thought‑out presentation.

[Photo: NCSU VR blog pic 1]

As you can tell from the photo below, the team designed and implemented a nifty virtual reality app. It’s a room escape puzzle, and it looks great!

[Photo: NCSU VR blog pic 2]

The app is a playable game that can record gameplay for repeatable testing. It also includes a prerecorded session that lets you test a device without playing the game yourself. Finally, the app lets you launch directly into the prerecorded game without using a viewer, which will be handy for testing multiple devices.
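To give a sense of why recorded gameplay matters for benchmarking, here is a conceptual sketch in Python of a record-and-replay loop: capture time-stamped inputs once, then feed the identical sequence back on every device under test so each run exercises the same work. The team's actual app is built with Unity, so the function names and the 60 Hz sampling rate below are illustrative assumptions, not their implementation.

```python
# Conceptual record-and-replay sketch (not the team's Unity code).
# Recording once and replaying the same inputs on every device is what
# makes test runs repeatable and comparable.
import json
import time

def record_session(get_input, duration_s, path="session.json"):
    """Capture (timestamp, input) pairs for duration_s seconds."""
    start = time.perf_counter()
    events = []
    while (now := time.perf_counter() - start) < duration_s:
        events.append({"t": now, "input": get_input()})  # input must be JSON-serializable
        time.sleep(1 / 60)  # sample at roughly 60 Hz
    with open(path, "w") as f:
        json.dump(events, f)

def replay_session(apply_input, path="session.json"):
    """Feed the recorded inputs back at their original timestamps."""
    with open(path) as f:
        events = json.load(f)
    start = time.perf_counter()
    for event in events:
        while time.perf_counter() - start < event["t"]:
            time.sleep(0.001)  # wait until this event's timestamp comes up
        apply_input(event["input"])
```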

The team built the app using the Google Cardboard API and the Unity game engine, which allowed them to create Android and iOS versions. We’re looking forward to seeing what that may tell us!

After Posters and Pies, the team came to PT to present their work and answer questions. We were all very impressed with their knowledge and with how well thought out the application was.

[Photo: NCSU VR blog pic 3]

Many thanks to team members Christian McCurdy, Gregory Manning, Grayson Jones, and Shon Ferguson (not shown).

[Photo: NCSU VR blog pic 4]

Thanks also to Dr. Lina Battestilli, the team’s technical advisor, and Margaret Heil, Director of the Senior Design Center.

We are currently evaluating the app and expect to make it available to the community in early 2017!

Eric

 

WebXPRT in 2017

Over the last few weeks, we’ve discussed the future of HDXPRT and BatteryXPRT. This week, we’re discussing what’s in store for WebXPRT in 2017.

WebXPRT is our most popular tool. Manufacturers, developers, consumers, and media outlets in more than 350 cities and 57 countries have run WebXPRT over 113,000 times to date. The benchmark runs quickly and simply in most browsers and produces easy-to-understand results that are useful for comparing web browsing performance across a wide variety of devices and browsers. People love the fact that WebXPRT runs on almost any platform that has a web browser, from PCs to phones to game consoles.

More people are using WebXPRT in more places and in more ways than ever before. It’s an unquestioned success, but we think this is a good time to make it even better by beginning work on WebXPRT 2017. Any time change comes to a popular product, there’s a risk that faithful fans will lose the features and functionality they’ve grown to love. Relevant workloads, ease of use, and extensive compatibility have always been the core components of WebXPRT’s success, so we want to reassure users that we’re committed to maintaining all of those in future versions.

Some steps in the WebXPRT 2017 process are straightforward, such as the need to reassess the existing workload lineup and update content to reflect advances in commonly used technologies. Other steps, such as introducing new workloads to test emerging browser technologies, may be tricky to implement, but could offer tremendous value in the months and years ahead.

Are there test scenarios or browser technologies you would like to see in WebXPRT 2017, or tests you think we should get rid of? Please let us know. We want to hear from you and make sure that we’re crafting a performance tool that continues to meet your needs.

Bill

BatteryXPRT’s future

A few weeks ago, we discussed the future of HDXPRT. This week, we’re focusing on the current state of BatteryXPRT 2014 for Android, and how the benchmark may evolve in 2017.

BatteryXPRT continues to provide users with reliable evaluations of their Android device’s performance and battery life under real-world conditions. Originally designed to be compatible with Android 4.2 (Jelly Bean) and above, the benchmark continues to work well on subsequent versions of Android, up to and including Android 6.0 (Marshmallow).

Since Android 7 (Nougat) began to roll out on select devices in the past few months, our internal testing has shown that we’ll need to adjust the BatteryXPRT source code to maintain compatibility with devices running Android 7 and above. We developed the existing source when Eclipse was the officially supported SDK environment, and now we need to bring the code in line with the current Android Studio SDK.

Practically speaking, BatteryXPRT does run on Nougat, and to the best of our knowledge, battery life results are still accurate and reliable. However, the test will not produce a performance score. As more Nougat devices are released in the coming months, it’s possible that other aspects of the benchmark may encounter issues. If this happens during your testing, we encourage you to let us know.

Because the MobileXPRT 2015 and BatteryXPRT 2014 performance workloads are so closely related, the next obvious question is whether MobileXPRT 2015 runs on Nougat devices. As of now, MobileXPRT 2015 does run successfully and reliably on Android 7, because the most recent build was compiled with a newer SDK.

We think the best course of action for MobileXPRT 2015 and BatteryXPRT will be to eventually combine them into a single, easy-to-use Android benchmark for performance and battery life. We’ll talk more about that plan in the coming months, and we look forward to hearing your input. Until that transition is complete, we’ll continue to support both BatteryXPRT and MobileXPRT 2015.

As always, we welcome your feedback on these developments, as well as any ideas you may have for future XPRTs.

Justin

Machine learning

A couple of months ago, I wrote about taking an inventory of our XPRT tools. Part of that effort is taking a close look at the six existing XPRTs, and the first result was the recent post about HDXPRT’s future. We’re also looking at emerging technology areas where the BenchmarkXPRT Community has expertise that can guide us.

One of the most exciting of these areas is machine learning. It has rapidly gone from interesting theoretical research (they called them “neural nets” back when I was getting my computer science degree) to something we all use whether we realize it or not. Machine learning (or deep learning) is in everything from intelligent home assistants to autonomous automobiles to industrial device monitoring to personalized shopping in retail environments.

The challenge with developing a benchmark for machine learning is that these are still the early days of the technology. In the past, XPRTs have targeted technologies later in the product cycle. We’re wondering how the XPRT model and the members of its community can play a role here.

One possible use of a machine-learning XPRT is with drones, a market that includes many vendors. Consumers, hobbyists, builders, and the companies creating off-the-shelf models could all benefit from tools and techniques that fairly compare drone performance.

The best approach we’ve come up with for defining a machine-learning XPRT starts with identifying common areas such as computer vision, natural language processing, and data analytics, and then, within each of those areas, identifying commonly used models such as AlexNet, GoogLeNet, and VGG. We would also look at popular frameworks such as Caffe, Theano, TensorFlow, and CNTK.

The result might differ from existing XPRTs, where you simply run a tool and get a result. Instead, it might take the form of sample code and workloads, or perhaps even one or two executables that could run in the most common environments.
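As a rough illustration of what sample code and workloads could look like, here is a minimal sketch that times image-classification inference with a pretrained VGG16 model in TensorFlow/Keras. The choice of model, framework, batch size, and the use of random stand-in images are all illustrative assumptions, not a proposed design for the benchmark.

```python
# Minimal sketch of a machine-learning workload: time how long a pretrained
# image-classification network takes to label a batch of images.
import time
import numpy as np
from tensorflow.keras.applications import VGG16
from tensorflow.keras.applications.vgg16 import preprocess_input

model = VGG16(weights="imagenet")  # downloads pretrained weights on first run

batch_size = 8
# Stand-in for real workload content: random data shaped like 224x224 RGB photos.
images = preprocess_input(np.random.rand(batch_size, 224, 224, 3) * 255.0)

model.predict(images)  # warm-up pass, excluded from timing

start = time.perf_counter()
model.predict(images)
elapsed = time.perf_counter() - start

print(f"Classified {batch_size} images in {elapsed:.3f} s "
      f"({batch_size / elapsed:.1f} images/s)")
```

A real workload would use a fixed set of test images and report a calibrated score rather than a raw rate, but the basic shape (load a model, run inference, measure time) would be similar.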

At this point, our biggest question is, What do you think? Is this an area you’re interested in? If so, what would you like to see a machine-learning XPRT do?

We’re actively engaging with people in these emerging markets to gauge their interest as well. Regardless of the feedback, we’re excited about the possibilities!

Bill

HDXPRT’s future

While industry pundits have written many words about the death of the PC, Windows PCs are going through a renaissance. No longer do you just choose between a desktop and a laptop in beige or black. There has been an explosion of choices.

Whether you want a super-thin notebook, a tablet, or a two-in-one device, the market has something to offer. Desktop systems can be small devices on your desk, all-in-ones with the PC built into the monitor, or old-style boxes that sit on the floor. You can go with something inexpensive that will be sufficient for many tasks or invest in a super-powerful PC capable of driving today’s latest VR devices. Or you can get a new Microsoft Surface Studio, an example of the new types of devices entering the PC scene.

The current proliferation of PC choices means that tools that help buyers understand the performance differences between systems are more important than they have been in years. Because HDXPRT is one such tool, we expect demand for it to increase.

We have many tasks ahead of us as we prepare for this increased demand. The first is to release a version of HDXPRT 2014 that doesn’t require a patch. We are working on that and should have something ready later this month.

For the other tasks, we need your input. We believe we need to update HDXPRT to reflect the world of high-definition content. It’s tempting to simply change the name to UHDXPRT, but this was our first XPRT and I’m partial to the original name. How about you?

As for tests, what should a 2017 version of HDXPRT include? We think 4K-related workloads are a must, but we aren’t sure whether 4K playback tests are the way to go. What do you think? We also need to update other content, such as photo and video resolutions, and replace outdated applications with current versions. Would a VR test be worthwhile?

Please share your thoughts with us over the coming weeks as we put together a plan for the next version of HDXPRT!

Bill

Tracking device evolution with WebXPRT ’15, part 2

Last week, we used the Apple iPhone as a test case to show how hardware advances are often reflected in benchmark scores over time. When we compared WebXPRT 2015 scores for various iPhone models, we saw a clear trend of progressively higher scores as we moved from phones with an A7 chip to phones with A8, A9, and A10 Fusion chips. Performance increases over time are not surprising, but WebXPRT ’15 scores also showed us that upgrading from an iPhone 6 to an iPhone 6s is likely to have a much greater impact on web-browsing performance than upgrading from an iPhone 6s to an iPhone 7.

This week, we’re revisiting our iPhone test case to see how software updates can boost device performance without any changes in hardware. The original WebXPRT ’15 tests for the iPhone 5s ran on iOS 8.3, and the original tests for the iPhone 6s, 6s Plus, and SE ran on variants of iOS 9. We updated each phone to iOS 10.0.2 and ran several iterations of WebXPRT ’15.

Upgrading from iOS 8.3 to iOS 10 on the iPhone 5s caused a 17% increase in web-browsing performance, as measured by WebXPRT. Upgrading from iOS 9 to iOS 10 on the iPhone 6s, 6s Plus, and SE produced web-browsing performance gains of 2.6%, 3.6%, and 3.1%, respectively.

The chart below shows the WebXPRT ’15 scores for a range of iPhones, with each iPhone’s iOS version upgrade noted in parentheses. The dark blue columns on the left represent the original scores, and the light blue columns on the right represent the upgrade scores.

[Chart: WebXPRT ’15 scores for iPhones before and after the iOS upgrades]

As with our hardware comparison last week, these scores are the median of a range of scores for each device in our database. The scores come from both our own testing and device reviews published by popular tech media outlets.
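For anyone who wants to run the same kind of comparison on their own results, here is a minimal sketch of the arithmetic: take the median of a device’s runs before and after the upgrade, then compute the percentage change between the two medians. The score values below are invented purely for illustration; they are not entries from our database.

```python
# Hypothetical example of deriving a device's score and its upgrade gain.
from statistics import median

ios9_runs = [148, 151, 149, 153, 150]    # made-up pre-upgrade WebXPRT '15 scores
ios10_runs = [153, 155, 154, 156, 152]   # made-up post-upgrade scores

before = median(ios9_runs)
after = median(ios10_runs)
gain_pct = (after - before) / before * 100

print(f"Median before: {before}, after: {after}, gain: {gain_pct:.1f}%")
```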

These results reinforce a message we repeat often: many factors other than hardware influence performance. Designing benchmarks that deliver relevant and reliable scores requires taking all of those factors into account.

What insights have you gained recently from WebXPRT ’15 testing? Let us know!

Justin
