BenchmarkXPRT Blog


The XPRT Women Code-a-Thon

As Justin explained last week, we’ve resolved the issue we found with the TouchXPRT CP. I’m happy to say that the testing went well and that we released CP3 this week.

It’s been only three weeks since we announced the XPRT Weekly Tech Spotlight, and we already have another big announcement! Principled Technologies has joined with ChickTech Seattle to host the first ever XPRT Women Code-a-Thon! In this two-day event, participants will compete to create the best new candidate workload for WebXPRT or MobileXPRT. The workloads can’t duplicate existing workloads, so we are looking forward to seeing the new ideas.

Judges will study all the workloads and award prizes to the top three: $2,500 for first place, $1,500 for second place, and $1,000 for third place. Anyone interested can register here.

PT and the BenchmarkXPRT Development Community are committed to promoting the advancement of women in STEM, but we also win by doing good. As with the NCSU senior project, the BenchmarkXPRT Development Community will get some fresh perspectives and some new experimental test tools. Everyone wins!

So much has happened in 2016 and January isn’t even over yet. The year is off to a great start!

Eric

Auf Deutsch

Early next week, we will update WebXPRT by adding a German UI. This brings the number of available languages to three. WebXPRT has had a Simplified Chinese UI for a while, but you had to click a link on the WebXPRT page to get it. The new version removes that limitation, and lets you select Simplified Chinese, English, or German from the UI.

WebXPRT '15 German

We’re working on getting WebXPRT to automatically detect the language of your device, but for now, the UI defaults to English.

We would like to expand the range of languages the XPRTs support over time. This is an area where you can help. If you’d like to see your language represented and are willing to help with translation, please let us know.

I know it’s the holiday season, but remember that CES will be here before we know it. I’m really looking forward to seeing the show, and I may have some big news to talk about while I’m there! If you’re planning to be at CES, send a message and let’s find a time to meet!

We will not have a blog post next week. Happy holidays!

Eric

Last week in the XPRTs
We published the December 2015 BenchmarkXPRT Development Community newsletter.
We added one new BatteryXPRT ’14 result.
We added nine new MobileXPRT ’13 results.
We added one new MobileXPRT ’15 result.
We added four new WebXPRT ’15 results.

Please let us know

Todd Reifsteck from the Web Platform Team at Microsoft was kind enough to let me share a conversation we had last week:

Todd reported he was having problems running WebXPRT on the Edge browser. This was a surprise to us, as we’d already released a WebXPRT update to resolve Edge browser issues.

We were not seeing this problem, and as we talked with Todd we verified there was no issue in WebXPRT itself. The fix we released was working; however, we found a path through the web site that launched the previous version of WebXPRT. Once we fixed that URL to point to the latest version of WebXPRT, Todd reported that WebXPRT was working with Edge, just as we expected.

This problem would not have affected results on other browsers. The results from the previous version of WebXPRT are comparable to the current version. Compatibility with the Edge browser is the only difference between the versions.

Thanks to Todd for his help. As always, we encourage you to contact us if you have any issues or questions. We’ll do our best to resolve them as quickly as possible.

Eric

Five years later…

Five years ago this month, we started what we then called the HDXPRT Development Community. The first benchmark, HDXPRT 2011, appeared six months later. A LOT has happened since then.

We’ve grown to six benchmarks (HDXPRT, TouchXPRT, WebXPRT, MobileXPRT, BatteryXPRT, and CrXPRT) with plenty of updates. As a result, we had to change the name to the BenchmarkXPRT Development Community, though we’ve come to refer to the benchmarks themselves as the XPRTs.

To tell the world about the XPRTs and the BenchmarkXPRT Development Community, we’ve written over 200 blog entries. We’ve created videos, a training course, infographics, and white papers. We’ve met members of the community at their companies, via webinars, and at trade shows. We’ve quite literally traveled around the world to shows in Las Vegas, Barcelona, Taipei, and Shenzhen.

As a result, the community has grown to about 150 individuals at over 60 companies and organizations. People have downloaded or run the benchmarks over 100,000 times in over 44 countries. The XPRTs have been cited over 3,800 times in a wide variety of websites around the world.

Yes, a lot has happened over these five years.

On behalf of the BenchmarkXPRT team, I want to convey my sincere thanks to all of you for your involvement over these years. I’m really looking forward to what the next five years will look like. We’re just getting started!

More than the sum of its parts

There was a recent article in Bloomberg about phone maker ZTE’s increasing market share in the US. The article singled out one phone, the ZTE Maven, which costs about $60 (US).

This phrase jumped out at me: “a processor with capabilities somewhere between the iPhone 5 and 6.” The iPhone 5s could also fit that description. The ZTE Maven uses the 64-bit ARM Cortex-A53 running at 1.2 GHz. The Apple iPhone 5s uses the Apple A7, with Apple’s 64-bit Cyclone cores, running at 1.3 GHz.

We decided to put that statement to the test. We ran WebXPRT 2015 on the ZTE Maven and its score was 47. The iPhone 5s scored 100. The Maven was not even close.

As we’ve said before, the performance of a device depends on more than the GHz of its processor. For example, the ZTE Maven uses the Snapdragon 410 SoC, which was aimed at mid-level devices. The iPhone 5s uses the Apple A7, which was intended for higher-end devices. You can find side-by-side specs here.
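As a back-of-the-envelope sketch using the WebXPRT ’15 scores and rated clock speeds mentioned above, normalizing each score by clock frequency makes the gap obvious: the A7 delivers roughly twice the work per GHz. (The numbers come from this post; the "points per GHz" metric is just an informal illustration, not an XPRT measurement.)

```python
# Informal illustration: clock speed alone doesn't predict performance.
# Scores are the WebXPRT 2015 results quoted above; frequencies are the
# rated clock speeds of each device's SoC.
devices = {
    "ZTE Maven (Snapdragon 410)": {"score": 47, "ghz": 1.2},
    "Apple iPhone 5s (Apple A7)": {"score": 100, "ghz": 1.3},
}

for name, d in devices.items():
    per_ghz = d["score"] / d["ghz"]
    print(f"{name}: {d['score']} points at {d['ghz']} GHz "
          f"-> {per_ghz:.1f} points per GHz")
```

Despite nearly identical clock speeds (1.2 vs. 1.3 GHz), the per-GHz numbers come out around 39 and 77, which is why the raw frequency tells you so little on its own.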

Be wary when you see unsupported performance claims. As this example shows, specs can appear comparable even when the actual performance of the devices differs considerably. A good benchmark can provide insights into performance that specs alone can’t.

Eric

Question we get a lot

“How come your benchmark ranks devices differently than [insert other benchmark here]?” It’s a fair question, and the answer is that each benchmark has its own emphasis and tests different things. When you think about it, it would be surprising if all benchmarks agreed.

To illustrate the phenomenon, consider this excerpt from a recent browser shootout in VentureBeat:

While this looks very confusing, the simple explanation is that the different benchmarks are testing different things. To begin with, SunSpider, Octane, JetStream, Peacekeeper, and Kraken all measure JavaScript performance. Oort Online measures WebGL performance. WebXPRT measures both JavaScript and HTML5 performance. HTML5Test measures HTML5 compliance.

Even with benchmarks that test the same aspect of browser performance, the tests differ. Kraken and SunSpider both test the speed of JavaScript math, string, and graphics operations in isolation, but run different sets of tests to do so. Peacekeeper profiles the JavaScript from sites such as YouTube and Facebook.

WebXPRT, like the other XPRTs, uses scenarios that model the types of work people do with their devices.

It’s no surprise that the order changes depending on which aspect of the Web experience you emphasize, in much the same way that the most fuel-efficient cars might not be the ones with the best acceleration.
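To see why the order changes, here is a toy sketch with made-up subscores: two hypothetical browsers with different strengths, scored by two hypothetical benchmarks that simply weight JavaScript and WebGL differently. (The browsers, subscores, and weights are all invented for illustration; they don't correspond to any real benchmark's methodology.)

```python
# Made-up subscores for two hypothetical browsers: one is stronger at
# JavaScript, the other at WebGL.
subscores = {
    "Browser A": {"javascript": 90, "webgl": 40},
    "Browser B": {"javascript": 60, "webgl": 95},
}

def rank(weights):
    """Rank browsers by a weighted sum of their subscores."""
    overall = {
        name: sum(scores[area] * w for area, w in weights.items())
        for name, scores in subscores.items()
    }
    return sorted(overall, key=overall.get, reverse=True)

# A JavaScript-heavy benchmark puts Browser A on top...
print(rank({"javascript": 0.9, "webgl": 0.1}))  # ['Browser A', 'Browser B']

# ...while a WebGL-heavy benchmark reverses the order.
print(rank({"javascript": 0.1, "webgl": 0.9}))  # ['Browser B', 'Browser A']
```

Same devices, same subscores, opposite rankings: the only thing that changed was which aspect of performance the benchmark emphasizes.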

This is a bigger topic than we can deal with in a single blog post, and we’ll examine it more in the future.

Eric
