

The value of speed

I was reading an interesting article on how high-end smartphones like the iPhone X, Pixel 2 XL, and Galaxy S8 generate more money from in-game revenue than cheaper phones do.

One line stood out to me: “With smartphones becoming faster, larger and more capable of delivering an engaging gaming experience, these monetization key performance indicators (KPIs) have begun to increase significantly.”

It turns out the game companies totally agree with the rest of us that faster devices are better!

Regardless of who is seeking better performance—consumers or game companies—the obvious question is how you determine which models are fastest. Many folks rely on device vendors’ claims about how much faster the new model is. Unfortunately, the vendors’ claims don’t always specify on what they base the claims. Even when they do, it’s hard to know whether the numbers are accurate and applicable to how you use your device.

The key part of any answer is performance tools that are representative, dependable, and open.

  • Representative – Performance tools need to have realistic workloads that do things that you care about.
  • Dependable – Good performance tools run reliably and produce repeatable results, both of which require that significant work go into their development and testing.
  • Open – Performance tools that allow people to access the source code, and even contribute to it, keep things above board and reassure you that you can rely on the results.

Our goal with the XPRTs is to provide performance tools that meet all these criteria. WebXPRT 3 and all our other XPRTs exist to help accurately reveal how devices perform. You can run them yourself or rely on the wealth of results that we and others have collected on a wide array of devices.

The best thing about good performance tools is that everyone, even vendors, can use them. I sincerely hope that you find the XPRTs helpful when you make your next technology purchase.

Bill

How the XPRTs handle your data

Data privacy is a hot topic in the news these days. In an ideal world, all applications and websites that have access to users’ sensitive personal information would treat that information with respect. Users could be confident that their data would not be abused, secretly transmitted to third parties, or used as a launchpad for extensive violations of their privacy.

In the real world, the situation is often quite different, but not with the XPRTs. Just as we strive for transparency during the benchmark development process, we try to be completely upfront regarding how we handle your personal data. We’re committed to the principle that your personal information belongs to you and no one else. We don’t gather, store, or disseminate any of your data without your knowledge and consent, and we never try to trick you with misleading terms or pages of legal jargon that few will ever read. We take that commitment very seriously.

To join the BenchmarkXPRT Development Community, you need to provide only your first and last name, corporate affiliation, and valid email address. These pieces of information form your profile, which other members of the community can view only if you choose to participate in the BenchmarkXPRT forum. We will not share your profile information, or even the fact that you are a member, with anyone outside the community. If you do not participate in the forum, even other community members will not know you are a member.

For the XPRT apps, we gather data only for the purposes of ensuring quality and improving the benchmark. Two apps—WebXPRT and CrXPRT—collect test results and some data about the browser and device that produced those results. While we may refer to high and low scores or averages in the materials we publish, we will never make an individual WebXPRT result public unless the tester requests we do so. These apps report no identifying personal or corporate data, or any other potentially confidential information.

None of the remaining XPRT apps—BatteryXPRT, HDXPRT, MobileXPRT, and TouchXPRT—collect any data. When an individual runs one of these apps, we see no information or results unless they submit a result for publication or send us a direct message mentioning the result. This makes the XPRTs ideal for pre-production OEM testing because there’s no risk of model or performance information leaking to the press after automatically appearing on a benchmark’s website.

You can read more about how we handle your data on our privacy policy page. If you have any questions, please feel free to ask!

Justin

The XPRTs in action

In the near future, we’ll update our “XPRTs around the world” infographic, which provides a snapshot of how people are using the XPRTs worldwide. Among other stats, we include the number of XPRT web mentions, articles, and reviews that have appeared during a given period. Recently, we learned how one of those statistics—a single web site mention of WebXPRT—found its way to consumers in more places than we would have imagined.

Late last month, AnandTech published a performance comparison by Andrei Frumusanu examining the Samsung Galaxy S9’s Snapdragon 845 and Exynos 9810 variants and a number of other high-end phones. WebXPRT was one of the benchmarking tools used. The article stated that both versions of the brand-new S9 were slower than the iPhone X and, in some tests, were slower than even the iPhone 7.

A CNET video discussed the article and the role of WebXPRT in the performance comparison, and the article has been reposted to hundreds of tech media sites around the world. A quick survey shows reposts in Albania, Bulgaria, Denmark, Chile, the Czech Republic, France, Germany, Greece, Indonesia, Iran, Italy, Japan, Korea, Poland, Russia, Spain, Slovakia, Turkey, and many other countries.

The popularity of the article is not surprising, for it positions the newest flagship phones from the industry’s two largest phone makers in a head-to-head comparison with a somewhat unexpected outcome. AnandTech did nothing to stir controversy or sensationalize the test results, but simply provided readers with an objective, balanced assessment of how these devices compare so that they could draw their own conclusions. The XPRTs share this approach.

We’re grateful to Andrei and others at AnandTech who’ve used the XPRTs over the years to produce content that helps consumers make informed decisions. WebXPRT is just part of AnandTech’s toolkit, but it’s one that’s accessible to anybody free of charge. With the help of BenchmarkXPRT Development Community members, we’ll continue to publish XPRT tools that help users everywhere gain valuable insight into device performance.

Justin

Just before showtime

In case you missed the announcement, WebXPRT 3 is now live! Please try it out, submit your test results, and feel free to send us your questions or comments.

During the final push toward launch day, it occurred to us that not all of our readers are aware of the steps involved in preparing a benchmark for general availability (GA). Here’s a quick overview of what we did over the last several weeks to prepare for the WebXPRT 3 release, a process that follows the same approach we use for all new XPRTs.

After releasing the community preview (CP), we started on the final build. During this time, we incorporated features that we were not able to include in the CP and fixed a few outstanding issues. Because we always try to make sure that CP results are comparable to eventual GA results, these issues rarely involve the workloads themselves or anything that affects scoring. In the case of WebXPRT 3, the end-of-test results submission form was not fully functional in the CP, so we finished making it ready for prime time.

The period between CP and GA releases is also a time to incorporate any feedback we get from the community during initial testing. One of the benefits of membership in the BenchmarkXPRT Development Community is access to pre-release versions of new benchmarks, along with an opportunity to make your voice heard during the development process.

When the GA candidate build is ready, we begin two types of extensive testing. First, our quality assurance (QA) team performs a thorough review, running the build on numerous devices. In the case of WebXPRT, it also involves testing with multiple browsers. The QA team also keeps a sharp eye out for formatting problems and bugs.

The second type of testing involves comparing the current version of the benchmark with prior versions. We tested WebXPRT 3 on almost 100 devices. While WebXPRT 2015 and WebXPRT 3 scores are not directly comparable, we normalize scores for both sets of results and check that device performance is scaling in the same way. If it isn’t, we need to determine why not.
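To illustrate the idea behind this cross-version check, here is a minimal sketch in Python. The actual QA process and data are internal to the XPRT team; the device names, scores, and 10% threshold below are invented for illustration only.

```python
# Hypothetical example: compare how devices scale across two benchmark
# versions by normalizing each result set to a shared reference device.
# All device names, scores, and the 10% threshold are invented.

def normalize(scores, reference):
    """Divide every device's score by the reference device's score."""
    ref = scores[reference]
    return {device: score / ref for device, score in scores.items()}

# Invented raw scores for three devices on two benchmark versions.
webxprt_2015 = {"Device A": 400, "Device B": 200, "Device C": 100}
webxprt_3 = {"Device A": 220, "Device B": 112, "Device C": 55}

norm_2015 = normalize(webxprt_2015, "Device C")
norm_3 = normalize(webxprt_3, "Device C")

# Flag any device whose relative performance shifts by more than 10%
# between versions -- a cue to investigate why scaling changed.
for device in norm_2015:
    ratio = norm_3[device] / norm_2015[device]
    if abs(ratio - 1) > 0.10:
        print(f"{device}: relative scaling changed by {ratio - 1:+.0%}")
```

The point of normalizing is that absolute scores from the two versions need not match; what should stay roughly constant is each device's performance relative to the others.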

Finally, after testing is complete and the new build is ready, we finalize all related documentation and tie the various pieces together on the web site. This involves updating the main benchmark page and graphics, the FAQ page, the results tables, and the members’ area.

That’s just a brief summary of what we’ve been up to with WebXPRT in the last few weeks. If you have any questions about the XPRTs or the development community, feel free to ask!

Justin

A smooth transition

We want to thank Andrei Frumusanu of AnandTech for mentioning WebXPRT 3 in the System Performance section of their Snapdragon 845 review. For testing labs and tech media, incorporating a new benchmark into a test suite can be daunting, and they don’t make the decision to do so lightly. Once a new benchmark is in play, the score database used for comparisons is suddenly empty, and a lot of testing needs to happen before anyone can compare devices on a large scale.

In the BenchmarkXPRT Development Community, we’ve designed our development and release system to minimize the stress involved in adopting new benchmark tools. A key part of that strategy is releasing community previews to members several weeks before the general release. When we release a community preview, we include no publication restrictions, and we work to make sure that preview results will be comparable to results from the general release. Between a community preview and a general release, we may still tweak the UI or fix issues with non-workload-related features, but you can be sure that preview results will remain valid after the general release.

The community preview system not only allows us to solicit feedback from an expanded base of pre-release testers, but also lets labs backfill results for legacy devices and get a head start on incorporating the new benchmark into their testing suites.

Speaking of previews, WebXPRT 3 community preview testing is going well and we’re excited about the upcoming release. If you’d like to learn more about our development community and how you can join, send us your questions and we’ll be happy to help.

Justin

A look back at 2017

At the beginning of each new year, we like to look back on the previous 12 months and review everything that we accomplished. Here’s a quick recap of an eventful 2017 for the XPRTs:

We continued our tradition of travelling to the world’s biggest tech expos. Bill went to CES in Las Vegas and MWC Shanghai, and Mark attended MWC in Barcelona. Travelling to these shows provides us with great opportunities to monitor industry trends and meet with other BenchmarkXPRT Development Community members.

We also continued work on our suite of benchmark tools and related resources. We updated BatteryXPRT in response to changes in the Android development environment, released two new HDXPRT builds to make installation and test configuration easier on new Windows 10 builds, and released the much-anticipated WebXPRT 3 Community Preview.

We released new media, including a video about our sponsorship of a team of North Carolina State University students who created an experimental VR demo workload, and new interactive tools, such as the XPRT Selector tool and the WebXPRT Processor Comparison Chart.

We continued to improve the XPRT Weekly Tech Spotlight by adding more devices, photos, and site features. We put 51 devices in the spotlight throughout the year and published updated back-to-school, Black Friday, and holiday showcases to help buyers compare devices.

At the end of the year, our most popular benchmark, WebXPRT, passed the 200,000-run milestone. WebXPRT use continues to grow around the world, and it has truly become an industry-standard performance benchmark for OEM labs, vendors, and leading tech press outlets.

We also continued work on our most challenging project to date, a benchmark for machine learning. We look forward to sharing more information on that effort with the community over the next few months.

We’re thankful for everyone who used the XPRTs, joined the community, and sent in questions and suggestions throughout 2017. Each one of you helped to make the BenchmarkXPRT Development Community a strong, vibrant, and relevant resource for people all around the world. Here’s to a great 2018!

Justin
