
Category: Cross-platform benchmarks

An update on AIXPRT development

It’s been almost two months since the AIXPRT Community Preview went live, and we want to provide folks with a quick update. Community Preview periods for the XPRTs generally last about a month. Because of the complexity of AIXPRT and some of the feedback we’ve received, we plan to release a second AIXPRT Community Preview (CP2) later this month.

One of the biggest additions in CP2 will be the ability to run AIXPRT on Windows. AIXPRT currently requires test systems to run Ubuntu 16.04 LTS. This is fine for testers accustomed to Linux environments, but presents obstacles for those who want to test in a traditional Windows environment. We will not be changing the tests themselves, so this update will not influence existing results from Ubuntu. We plan to make CP2 available for download from the BenchmarkXPRT website for people who don’t wish to deal with GitHub.

Also, after speaking with testers and learning more about the kinds of data points people are looking for in AIXPRT results, we’ve decided to make significant adjustments to the AIXPRT results viewer. To make it easier for visitors to find what they’re looking for, we’ll add filters for key categories such as batch size, toolkit, and latency percentile (e.g., 50th, 90th, 99th), among others. We’ll also allow users to set desired ranges for metrics such as throughput and latency.
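To illustrate the kind of filtering described above, here is a minimal Python sketch. The record fields, field names, and values are hypothetical stand-ins, not the actual AIXPRT result schema; the real results viewer is a web interface, and this is only a conceptual model of combining category filters with metric ranges.

```python
# Hypothetical sketch of results-viewer filtering. The schema below
# (toolkit, batch_size, percentile, latency_ms, throughput) is
# illustrative only, not the actual AIXPRT data format.

results = [
    {"toolkit": "OpenVINO",   "batch_size": 1,  "percentile": 90,
     "latency_ms": 12.4, "throughput": 80.5},
    {"toolkit": "OpenVINO",   "batch_size": 8,  "percentile": 90,
     "latency_ms": 45.1, "throughput": 177.0},
    {"toolkit": "TensorRT",   "batch_size": 1,  "percentile": 99,
     "latency_ms": 15.9, "throughput": 62.8},
    {"toolkit": "TensorFlow", "batch_size": 16, "percentile": 50,
     "latency_ms": 88.3, "throughput": 181.2},
]

def filter_results(records, toolkit=None, batch_size=None, percentile=None,
                   max_latency_ms=None, min_throughput=None):
    """Return records matching the category filters and metric ranges.

    Category filters (toolkit, batch_size, percentile) require exact
    matches; metric filters bound latency from above and throughput
    from below. A filter set to None is ignored.
    """
    out = []
    for r in records:
        if toolkit is not None and r["toolkit"] != toolkit:
            continue
        if batch_size is not None and r["batch_size"] != batch_size:
            continue
        if percentile is not None and r["percentile"] != percentile:
            continue
        if max_latency_ms is not None and r["latency_ms"] > max_latency_ms:
            continue
        if min_throughput is not None and r["throughput"] < min_throughput:
            continue
        out.append(r)
    return out

# Example: OpenVINO results at the 90th latency percentile, under 20 ms.
matches = filter_results(results, toolkit="OpenVINO", percentile=90,
                         max_latency_ms=20)
```

In this toy data set, the query matches only the batch-size-1 OpenVINO record, since the batch-size-8 run exceeds the 20 ms latency bound.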

Finally, we’re adding a demo mode that displays some images and other information on the screen while a test is running to give users a better idea of what is happening. While we haven’t seen results change while running in demo mode, users should not publish demo results or use them for comparison.

We hope to release CP2 in the second half of May and a GA version in mid-June. However, this project has more uncertainties than we usually encounter with the XPRTs, so that timeline could easily change.

We’ll continue to keep everyone up to date with AIXPRT news here in the blog. As always, we appreciate your suggestions. If you have any questions or comments about AIXPRT, please let us know.

Bill

A new playing field for WebXPRT

WebXPRT is one of the go-to benchmarks for evaluating browser performance, so we’re always interested in browser development news. Recently, Microsoft created a development channel where anyone can download early versions of an all-new Microsoft Edge browser. Unlike previous versions of Edge, the new browser is built on the Chromium open-source project, the same foundation underlying the Google Chrome browser and Chrome OS.

One interesting aspect of the new Edge development strategy is the set of changes Microsoft is making to more than 50 services included in Chromium. If you use Chrome daily, you’ve likely become accustomed to certain built-in services such as ad block, spellcheck, translate, maps integration, and form fill, among many others. While each of these is useful, a large number of background services running simultaneously can slow browsing and sap battery life. In the new Edge, Microsoft is either reworking each service or removing it altogether, with the hope of winning users by providing a cleaner, faster, and more power-efficient experience. You can read more about Microsoft’s goals for the new project on the Microsoft Edge Insider site.

As we’ve discussed before, many factors contribute to the speed of a browsing experience and its WebXPRT score. It’s too early to know how the new Microsoft Edge will stack up against other browsers, but when the full version comes out of development, you can be sure that we’ll be publishing some comparison scores. I’ve installed the Dev Channel version of Edge on my personal machine and run WebXPRT 3. While I can’t publish the scores from this early version, I can tell you that the results were interesting. Have you run WebXPRT 3 on the new Microsoft Edge? How do you think it compares to competitors? We’d love to hear your thoughts.

Justin

An update on the AIXPRT Request for Comments preview

As we approach the end of the original feedback window for the AIXPRT Request for Comments preview build, we want to update folks on the status of the project and what to expect in the coming weeks.

First, thanks to those who’ve downloaded the AIXPRT OpenVINO package and sent in their questions and comments. We value your feedback, and it’s instrumental in making AIXPRT a better tool. We’re currently working through some issues with the TensorFlow and TensorRT packages, and hope to add support for those to the RFC preview build repository very soon.

We’re also hoping to have a full-fledged community preview (CP) ready in mid to late February. Like our other community previews, the AIXPRT CP would be solid enough to allow folks to start quoting numbers. We typically make our benchmarks available to the general public four to six weeks after the community preview period begins, so if that schedule holds, it would place the public AIXPRT release around the end of March.

In light of the schedule described above, you still have time to gain access to the AIXPRT RFC preview build and give your feedback, so let us know if you’d like to check it out. The installation and testing process can take less than an hour, but getting everything properly set up can take a few tries. We are hard at work trying to make that process more straightforward. We welcome your input on all aspects of the benchmark, including workloads, ease of use, metrics, scores, and reporting.

Thanks for your help!

Justin

WebXPRT in action

Just this past summer, WebXPRT passed the 250,000-run milestone, and since then, the run total has already passed 330,000. September was our biggest month ever, with over 28,000 WebXPRT runs! We sometimes like to show the community how broad a reach the XPRTs have around the world by reporting the latest stats on the number of articles and reviews that mention the XPRTs, and the fact is that most of those mentions involve WebXPRT. Today, I thought it would be interesting to bring the numbers to life and provide a glimpse of how the tech press uses WebXPRT. Here’s a sample of WebXPRT in action during the past couple of weeks.



While WebXPRT continues to be a useful tool for tech enthusiasts around the world, you don’t have to be a tech expert to benefit from it. If you’d like to know more about WebXPRT, check out our recent video, What is WebXPRT and why should I care?

Justin

Check out our new WebXPRT video!

At over 305,000 runs and counting, WebXPRT is our most popular benchmark app. Device manufacturers, tech journalists, and developers around the world use WebXPRT because test runs are quick and easy, it runs on almost anything with a web browser, and it provides reliable data about how well devices perform when completing real-world tasks.

WebXPRT is not just for “techies,” however. To help explain what WebXPRT does and why it matters to everyday consumers, we’ve published a new video, What is WebXPRT and why should I care? The video explains the concepts behind some of WebXPRT’s workloads and how even small delays in common online tasks can add up to big headaches and a significant amount of wasted time. We all want to avoid those problems, and WebXPRT can help anyone who wants to see how their device, or a new device they’re thinking about buying, stacks up against the alternatives. We encourage you to check out the video below, which you can also find on YouTube and WebXPRT.com. If you have any questions about WebXPRT, please let us know!

Justin

Which browser is the fastest? It’s complicated.

PCWorld recently published the results of a head-to-head browser performance comparison between Google Chrome, Microsoft Edge, Mozilla Firefox, and Opera. As we’ve noted about similar comparisons, no single browser was the fastest in every test. Browser speed sounds like a straightforward metric, but the reality is complex.

For the comparison, PCWorld used three JavaScript-centric test suites (JetStream, SunSpider, and Octane), one benchmark that simulates user actions (Speedometer), a few in-house tests of their own design, and one benchmark that simulates real-world web applications (WebXPRT). Edge came out on top in JetStream and SunSpider, Opera won in Octane and WebXPRT, and Chrome had the best results in Speedometer and PCWorld’s custom workloads.

The reason the benchmarks rank the browsers so differently is that each one has a unique emphasis and tests a specific set of workloads and technologies. Some focus on very low-level JavaScript tasks, some test additional technologies such as HTML5, and some are designed to identify strengths or weaknesses by stressing devices in unusual ways. These approaches are all valid, and it’s important to understand exactly what a given score represents. Some scores reflect a very broad set of metrics, while others assess a very narrow set of tasks. Some scores help you to understand the performance you can expect from a device in your everyday life, and others measure performance in scenarios that you’re unlikely to encounter. For example, when Eric discussed a similar topic in the past, he said the tests in JetStream 1.1 provided information that “can be very useful for engineers and developers, but may not be as meaningful to the typical user.”

As we do with all the XPRTs, we designed WebXPRT to test how devices handle the types of real-world tasks consumers perform every day. While lab techs, manufacturers, and tech journalists can all glean detailed data from WebXPRT, the test’s real-world focus means that the overall score is relevant to the average consumer. Simply put, a device with a higher WebXPRT score is probably going to feel faster to you during daily use than one with a lower score. In today’s crowded tech marketplace, that piece of information provides a great deal of value to many people.

What are your thoughts on browser testing? We’d love to hear from you.

Justin
