
Category: Collaborative benchmark development

HDXPRT 4: A little lighter and a lot faster

This week, we’re sharing a little more about the upcoming HDXPRT 4 Community Preview. Just like previous versions of HDXPRT, HDXPRT 4 will use trial versions of commercial applications to complete workload tasks. We will include installers for some of those programs, such as Audacity and HandBrake, in the HDXPRT installation package. For other programs, such as Adobe Photoshop Elements 2018 and CyberLink Media Espresso 7.5, users will need to download the necessary installers prior to testing, using links and instructions that we will provide. The HDXPRT 4 installation package is just over 4.7 GB, slightly smaller than previous versions.

I can also report that the new version requires fewer pre-test configuration steps and a full test run takes much less time than before. Some systems that took over an hour to complete an HDXPRT 2014 run are completing HDXPRT 4 runs in about 25 minutes.

We’ll continue to provide more information as we get closer to releasing the community preview. If you’re interested in testing with HDXPRT 4 before the general release but have not yet joined the community, we invite you to join now. If you have any questions or comments about HDXPRT or the community, please contact us.

Justin

Sneak a peek at HDXPRT 4

A few months ago, we shared some details about HDXPRT 4 development progress. Now that we’re closer to releasing a community preview build, we wanted to offer a sneak peek at the new benchmark. We may still tweak a few things during pre-release testing, but we’re close to the final look.

Below, you can see the benchmark’s new start page. After installation and completing a few brief pre-test configuration steps, running HDXPRT 4 is as easy as entering a test name and clicking the start button.

HDXPRT 4 start page

During each iteration of the test, you’ll see HDXPRT’s real-world trial applications, such as Adobe Photoshop Elements and CyberLink Media Espresso, open and close, though you won’t see workload graphics within the HDXPRT UI harness. When the test finishes, the results screen pops up. As you can see below, the results screen displays the overall and individual workload category scores in a straightforward and easy-to-understand manner. Below the workload scores, a button provides access to additional test and system information.

HDXPRT 4 results page

We’re not yet ready to share a date for the community preview, but we’ll provide more information in the coming weeks. As always, XPRT community previews are only available to BenchmarkXPRT Development Community members. If you’re interested in testing the HDXPRT 4 Community Preview, we invite you to join the community now. If you have any questions or comments about HDXPRT or the community, please contact us.

Justin

AI and the next MobileXPRT

As we mentioned a few weeks ago, we’re in the early planning stages for the next version of MobileXPRT—MobileXPRT 3. We’re always looking for ways to make XPRT benchmark workloads more relevant to everyday users, and a new version of MobileXPRT provides a great opportunity to incorporate emerging tech such as AI into our apps. AI is everywhere and is beginning to play a huge role in our everyday lives through smarter-than-ever phones, virtual assistants, and smart homes. The challenge for us is to identify representative mobile AI workloads that have the necessary characteristics to work well in a benchmark setting. For MobileXPRT, we’re researching AI workloads that have the following characteristics:

  • They work offline, not in the cloud.
  • They don’t require additional training prior to use.
  • They support common use cases such as image processing and optical character recognition (OCR).


We’re researching the possibility of using Google’s Mobile Vision library, but there may be other options or concerns that we’re not aware of. If you have tips for places we should look, or ideas for workloads or APIs we haven’t mentioned, please let us know. We’ll keep the community informed as we narrow down our options.
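To make the first two criteria concrete, here’s a minimal, purely illustrative sketch of what an offline benchmark workload looks like: the task runs entirely on local data with no network access and no additional training, and a harness simply times it. This is written in Python for readability only (MobileXPRT itself is an Android app), and every name here, including the toy `grayscale_edges` workload, is hypothetical.

```python
import time

def grayscale_edges(pixels):
    """Toy offline image-processing workload: simple horizontal
    edge detection on a 2-D list of grayscale values (0-255)."""
    return [
        [abs(row[x + 1] - row[x]) for x in range(len(row) - 1)]
        for row in pixels
    ]

def run_workload(workload, data, iterations=5):
    """Time an offline workload over several iterations and
    return the median elapsed time in seconds."""
    times = []
    for _ in range(iterations):
        start = time.perf_counter()
        workload(data)
        times.append(time.perf_counter() - start)
    return sorted(times)[len(times) // 2]

# Synthetic 64x64 grayscale "image" generated locally, so the
# workload never touches the network.
image = [[(x * y) % 256 for x in range(64)] for y in range(64)]
median_seconds = run_workload(grayscale_edges, image)
```

Because the input is generated on the device and the processing needs no pre-trained model downloads, a workload shaped like this satisfies the offline and no-additional-training criteria by construction.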

Justin

Worldwide influence

On a weekly basis, we track several XPRT-related metrics such as completed test runs and downloads, content views, and social media engagement. We also search for instances where one or more of the XPRTs has been mentioned in an advertisement, article, or review. Compiling all this information gives us some insight into how, where, and how much people are using the XPRT tools.

From time to time, we share some of that data with readers and community members so they can see the impact that the XPRTs are having around the world. One way we do this is through our “XPRTs around the world” infographic, which provides a great picture of the XPRTs’ reach.

Here are some key numbers from the latest update:

  • The XPRTs have been mentioned more than 11,400 times on over 3,600 unique sites.
  • Those mentions include more than 9,200 articles and reviews.
  • Those mentions originated in over 590 cities located in 65 countries on six continents. New cities of note include Phnom Penh, Cambodia; Osijek, Croatia; Kano, Nigeria; and Lahore, Pakistan.
  • The BenchmarkXPRT Development Community now includes 210 members from 74 companies and organizations around the world.


In addition to the growth in web mentions and community members, the XPRTs have now delivered more than 380,000 real-world results!

We’re grateful to everyone who’s helped us get this far! Every contribution to the community, no matter how small, helps us toward our goal of providing reliable, relevant, and easy-to-use benchmark tools.

Justin

Which browser is the fastest? It’s complicated.

PCWorld recently published the results of a head-to-head browser performance comparison between Google Chrome, Microsoft Edge, Mozilla Firefox, and Opera. As we’ve noted about similar comparisons, no single browser was the fastest in every test. Browser speed sounds like a straightforward metric, but the reality is complex.

For the comparison, PCWorld used three JavaScript-centric test suites (JetStream, SunSpider, and Octane), one benchmark that simulates user actions (Speedometer), a few in-house tests of their own design, and one benchmark that simulates real-world web applications (WebXPRT). Edge came out on top in JetStream and SunSpider, Opera won in Octane and WebXPRT, and Chrome had the best results in Speedometer and PCWorld’s custom workloads.

The reason that the benchmarks rank the browsers so differently is that each one has a unique emphasis and tests a specific set of workloads and technologies. Some focus on very low-level JavaScript tasks, some test additional technologies such as HTML5, and some are designed to identify strengths or weaknesses by stressing devices in unusual ways. These approaches are all valid, and it’s important to understand exactly what a given score represents. Some scores reflect a very broad set of metrics, while others assess a very narrow set of tasks. Some scores help you to understand the performance you can expect from a device in your everyday life, and others measure performance in scenarios that you’re unlikely to encounter. For example, when Eric discussed a similar topic in the past, he said the tests in JetStream 1.1 provided information that “can be very useful for engineers and developers, but may not be as meaningful to the typical user.”
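To see how emphasis alone can flip a ranking, consider a small sketch: two suites score the same two browsers from the same underlying subtest timings, but each suite includes a different mix of subtests. All browser names, subtest names, and numbers below are invented for illustration; they are not results from any of the benchmarks discussed here.

```python
from statistics import geometric_mean

# Hypothetical subtest times in milliseconds (lower is better).
times = {
    "Browser A": {"low_level_js": 80, "html5_media": 150, "web_app": 120},
    "Browser B": {"low_level_js": 110, "html5_media": 100, "web_app": 105},
}

def score(browser, subtests):
    """Suite score = geometric mean of the selected subtest times."""
    return geometric_mean(times[browser][s] for s in subtests)

# A JavaScript-centric suite emphasizes the low-level subtest...
js_suite = ["low_level_js", "web_app"]
# ...while a real-world-style suite emphasizes media and web-app tasks.
real_world_suite = ["html5_media", "web_app"]

# Lower time wins, so the winner has the minimum score.
js_winner = min(times, key=lambda b: score(b, js_suite))
real_world_winner = min(times, key=lambda b: score(b, real_world_suite))
```

With these numbers, Browser A wins the JavaScript-centric suite while Browser B wins the real-world-style suite, even though both suites drew from the exact same measurements. Neither result is wrong; each simply answers a different question.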

As we do with all the XPRTs, we designed WebXPRT to test how devices handle the types of real-world tasks consumers perform every day. While lab techs, manufacturers, and tech journalists can all glean detailed data from WebXPRT, the test’s real-world focus means that the overall score is relevant to the average consumer. Simply put, a device with a higher WebXPRT score is probably going to feel faster to you during daily use than one with a lower score. In today’s crowded tech marketplace, that piece of information provides a great deal of value to many people.

What are your thoughts on browser testing? We’d love to hear from you.

Justin

An update on HDXPRT development

It’s been a while since we updated the community on HDXPRT development, and we’ve made a lot of progress since then. Here’s a quick summary of where we are and what to expect in the coming months.

The benchmark’s official name will be HDXPRT 4, and we’re sticking with the basic plan we outlined in the blog, which includes updating the benchmark’s real-world trial applications and workload content and improving the UI.

We’ve updated Adobe Photoshop Elements, Audacity, CyberLink Media Espresso, and HandBrake to more contemporary versions, but decided the benchmark will no longer use Apple iTunes. We sometimes encountered problems with iTunes during testing, and because we can complete the audio-related workloads using Audacity, we decided that it was OK to remove iTunes from the test. Please contact us if you have any concerns about this decision.

In addition to the photo-editing, music-editing, and video-conversion workloads from prior versions of the benchmark, HDXPRT 4 includes two new Photoshop Elements scenarios. The first uses an AI tool that corrects closed eyes in photos, and the second creates a single panoramic photo from seven separate photos. For the photo and video workloads, we produced new high-resolution photo content and new 4K GoPro video footage, respectively.

For the UI, our goal is to implement a clean and functional design and align it more closely with the themes, colors, and font styles we’ll be implementing in the XPRTs moving forward. The WebXPRT 3 UI will give you a feel for the direction the HDXPRT UI is headed.

Some of these details may change as we test preliminary builds, but we wanted to give you a better sense of where HDXPRT is headed. We’re not ready to share a date for the community preview, but will provide more details as the day approaches.

If you have any questions or comments about HDXPRT, please let us know. It’s not too late for us to consider your input for HDXPRT 4.

Justin
