
Category: Collaborative benchmark development

XPRTs in the datacenter

The XPRTs have been very successful on desktops, notebooks, tablets, and phones. People have run WebXPRT alone more than 295,000 times, and it joins benchmarks such as MobileXPRT, HDXPRT, and CrXPRT as an important tool worldwide for evaluating device performance across consumer and business client platforms.

We’ve begun branching out with tests for edge devices with AIXPRT, our new artificial intelligence benchmark. While typical consumers won’t be able to run AIXPRT on their devices initially, we feel that it is important for the XPRTs to play an active role in a critical emerging market. (We’ll have some updates on the AIXPRT front in the next few weeks.)

Recently, both community members and others have asked about the possibility of the XPRTs moving into the datacenter. Folks face challenges in evaluating the performance and suitability to task of such datacenter mainstays as servers, storage, networking infrastructure, clusters, and converged solutions. These challenges include the lack of easy-to-run benchmarks, the complexity and cost of the equipment (multi-tier servers, large amounts of storage, and fast networks) necessary to run tests, and confusion about best testing practices.

PT has a lot of expertise in measuring datacenter performance, as you can tell from the hundreds of datacenter-focused test reports on our website. We see great potential in working with the BenchmarkXPRT Development Community in this area. It is very possible that, as with AIXPRT, our approach to datacenter benchmarks would differ from the approach we’ve taken with previous benchmarks. While we have ideas for useful benchmarks we might develop down the road, more immediate steps could be drafting white papers, developing testing guidelines, or working with vendors to set up a lab.

Right now, we’re trying to gauge the level of interest in having such tools and in helping us carry out these initiatives. What are the biggest challenges you face in datacenter-focused performance and suitability-to-task evaluations? Would you be willing to work with us in this area? We’d love to hear from you and will be reaching out to members of the community over the coming weeks.

As always, thanks for your help!

Bill

HDXPRT 4: A little lighter and a lot faster

This week, we’re sharing a little more about the upcoming HDXPRT 4 Community Preview. Just like previous versions of HDXPRT, HDXPRT 4 will use trial versions of commercial applications to complete workload tasks. We will include installers for some of those programs, such as Audacity and HandBrake, in the HDXPRT installation package. For other programs, such as Adobe Photoshop Elements 2018 and CyberLink Media Espresso 7.5, users will need to download the necessary installers prior to testing, using links and instructions that we will provide. The HDXPRT 4 installation package is just over 4.7 GB, slightly smaller than previous versions.

I can also report that the new version requires fewer pre-test configuration steps and a full test run takes much less time than before. Some systems that took over an hour to complete an HDXPRT 2014 run are completing HDXPRT 4 runs in about 25 minutes.

We’ll continue to provide more information as we get closer to releasing the community preview. If you’re interested in testing with HDXPRT 4 before the general release but have not yet joined the community, we invite you to join now. If you have any questions or comments about HDXPRT or the community, please contact us.

Justin

Sneak a peek at HDXPRT 4

A few months ago, we shared some details about HDXPRT 4 development progress. Now that we’re closer to releasing a community preview build, we wanted to offer a sneak peek at the new benchmark. We may still tweak a few things during pre-release testing, but we’re close to the final look.

Below, you can see the benchmark’s new start page. After installation and completing a few brief pre-test configuration steps, running HDXPRT 4 is as easy as entering a test name and clicking the start button.

HDXPRT 4 start page

During the test, you’ll see HDXPRT’s real-world trial applications such as Adobe Photoshop Elements and CyberLink Media Espresso open and close during each iteration, though you won’t see workload graphics within the HDXPRT UI harness. When the test finishes, the results screen pops up. As you can see below, the results screen displays the overall and individual workload category scores in a straightforward and easy-to-understand manner. Below the workload scores, a button provides access to additional test and system information.

HDXPRT 4 results page

We’re not yet ready to share a date for the community preview, but we’ll provide more information in the coming weeks. As always, XPRT community previews are only available to BenchmarkXPRT Development Community members. If you’re interested in testing the HDXPRT 4 Community Preview, we invite you to join the community now. If you have any questions or comments about HDXPRT or the community, please contact us.

Justin

AI and the next MobileXPRT

As we mentioned a few weeks ago, we’re in the early planning stages for the next version of MobileXPRT—MobileXPRT 3. We’re always looking for ways to make XPRT benchmark workloads more relevant to everyday users, and a new version of MobileXPRT provides a great opportunity to incorporate emerging tech such as AI into our apps. AI is everywhere and is beginning to play a huge role in our everyday lives through smarter-than-ever phones, virtual assistants, and smart homes. The challenge for us is to identify representative mobile AI workloads that have the necessary characteristics to work well in a benchmark setting. For MobileXPRT, we’re researching AI workloads that have the following characteristics:

  • They work offline, not in the cloud.
  • They don’t require additional training prior to use.
  • They support common use cases such as image processing, optical character recognition (OCR), etc.

We’re researching the possibility of using Google’s Mobile Vision library, but there may be other options or concerns that we’re not aware of. If you have tips for places we should look, or ideas for workloads or APIs we haven’t mentioned, please let us know. We’ll keep the community informed as we narrow down our options.

Justin

Worldwide influence

On a weekly basis, we track several XPRT-related metrics such as completed test runs and downloads, content views, and social media engagement. We also search for instances where one or more of the XPRTs has been mentioned in an advertisement, article, or review. Compiling all this information gives us some insight into how, where, and how much people are using the XPRT tools.

From time to time, we share some of that data with readers and community members so they can see the impact that the XPRTs are having around the world. One way we do this is through our “XPRTs around the world” infographic, which provides a great picture of the XPRTs’ reach.

Here are some key numbers from the latest update:

  • The XPRTs have been mentioned more than 11,400 times on over 3,600 unique sites.
  • Those mentions include more than 9,200 articles and reviews.
  • Those mentions originated in over 590 cities located in 65 countries on six continents. New cities of note include Phnom Penh, Cambodia; Osijek, Croatia; Kano, Nigeria; and Lahore, Pakistan.
  • The BenchmarkXPRT Development Community now includes 210 members from 74 companies and organizations around the world.

In addition to the growth in web mentions and community members, the XPRTs have now delivered more than 380,000 real-world results!

We’re grateful to everyone who’s helped us get this far! Every contribution to the community, no matter how small, moves us toward our goal of providing reliable, relevant, and easy-to-use benchmark tools.

Justin

Which browser is the fastest? It’s complicated.

PCWorld recently published the results of a head-to-head browser performance comparison between Google Chrome, Microsoft Edge, Mozilla Firefox, and Opera. As we’ve noted about similar comparisons, no single browser was the fastest in every test. Browser speed sounds like a straightforward metric, but the reality is complex.

For the comparison, PCWorld used three JavaScript-centric test suites (JetStream, SunSpider, and Octane), one benchmark that simulates user actions (Speedometer), a few in-house tests of their own design, and one benchmark that simulates real-world web applications (WebXPRT). Edge came out on top in JetStream and SunSpider, Opera won in Octane and WebXPRT, and Chrome had the best results in Speedometer and PCWorld’s custom workloads.

The reason that the benchmarks rank the browsers so differently is that each one has a unique emphasis and tests a specific set of workloads and technologies. Some focus on very low-level JavaScript tasks, some test additional technologies such as HTML5, and some are designed to identify strengths or weaknesses by stressing devices in unusual ways. These approaches are all valid, and it’s important to understand exactly what a given score represents. Some scores reflect a very broad set of metrics, while others assess a very narrow set of tasks. Some scores help you to understand the performance you can expect from a device in your everyday life, and others measure performance in scenarios that you’re unlikely to encounter. For example, when Eric discussed a similar topic in the past, he said the tests in JetStream 1.1 provided information that “can be very useful for engineers and developers, but may not be as meaningful to the typical user.”
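To make the difference between scores more concrete: one common way benchmark suites condense many workload results into a single overall number is a geometric mean of baseline-normalized workload times. The sketch below illustrates that general technique only; the baseline values and scale factor are hypothetical, not any XPRT’s actual calibration.

```python
from math import prod

def overall_score(times_s, baseline_s, scale=100.0):
    """Combine per-workload completion times into one score (higher = faster).

    Each workload time is normalized against a baseline device's time,
    and the ratios are combined with a geometric mean. The baseline
    times and the scale factor are illustrative assumptions.
    """
    ratios = [b / t for t, b in zip(times_s, baseline_s)]
    return scale * prod(ratios) ** (1 / len(ratios))

# A device matching the baseline on every workload scores exactly `scale`.
print(round(overall_score([2.0, 4.0, 1.0], [2.0, 4.0, 1.0]), 1))  # 100.0
# A device twice as fast on every workload doubles the score.
print(round(overall_score([1.0, 2.0, 0.5], [2.0, 4.0, 1.0]), 1))  # 200.0
```

The geometric mean keeps a single outlier workload from dominating the overall number, which is one reason two suites that emphasize different workloads can rank the same set of browsers differently.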

As we do with all the XPRTs, we designed WebXPRT to test how devices handle the types of real-world tasks consumers perform every day. While lab techs, manufacturers, and tech journalists can all glean detailed data from WebXPRT, the test’s real-world focus means that the overall score is relevant to the average consumer. Simply put, a device with a higher WebXPRT score is probably going to feel faster to you during daily use than one with a lower score. In today’s crowded tech marketplace, that piece of information provides a great deal of value to many people.

What are your thoughts on browser testing? We’d love to hear from you.

Justin
