
Author Archives: Justin Greene

Sneak a peek at HDXPRT 4

A few months ago, we shared some details about HDXPRT 4 development progress. Now that we’re closer to releasing a community preview build, we wanted to offer a sneak peek at the new benchmark. We may still tweak a few things during pre-release testing, but we’re close to the final look.

Below, you can see the benchmark’s new start page. After installing the benchmark and completing a few brief pre-test configuration steps, running HDXPRT 4 is as easy as entering a test name and clicking the start button.

HDXPRT 4 start page

During the test, you’ll see HDXPRT’s real-world trial applications, such as Adobe Photoshop Elements and CyberLink MediaEspresso, open and close during each iteration, though you won’t see workload graphics within the HDXPRT UI harness. When the test finishes, the results screen pops up. As you can see below, it displays the overall score and individual workload category scores in a straightforward, easy-to-understand format. Below the workload scores, a button provides access to additional test and system information.

HDXPRT 4 results page

We’re not yet ready to share a date for the community preview, but we’ll provide more information in the coming weeks. As always, XPRT community previews are only available to BenchmarkXPRT Development Community members. If you’re interested in testing the HDXPRT 4 Community Preview, we invite you to join the community now. If you have any questions or comments about HDXPRT or the community, please contact us.

Justin

AI and the next MobileXPRT

As we mentioned a few weeks ago, we’re in the early planning stages for the next version of MobileXPRT—MobileXPRT 3. We’re always looking for ways to make XPRT benchmark workloads more relevant to everyday users, and a new version of MobileXPRT provides a great opportunity to incorporate emerging tech such as AI into our apps. AI is everywhere and is beginning to play a huge role in our everyday lives through smarter-than-ever phones, virtual assistants, and smart homes. The challenge for us is to identify representative mobile AI workloads that have the necessary characteristics to work well in a benchmark setting. For MobileXPRT, we’re researching AI workloads that have the following characteristics:

  • They work offline, not in the cloud.
  • They don’t require additional training prior to use.
  • They support common use cases such as image processing, optical character recognition (OCR), etc.


We’re researching the possibility of using Google’s Mobile Vision library, but there may be other options or concerns that we’re not aware of. If you have tips for places we should look, or ideas for workloads or APIs we haven’t mentioned, please let us know. We’ll keep the community informed as we narrow down our options.

Justin

The 2018 XPRT Spotlight Back-to-School Roundup

With the new school year approaching, we’re pleased to announce that we’ve just published our third annual XPRT Spotlight Back-to-School Roundup! The Roundup allows shoppers to view side-by-side comparisons of XPRT test scores and hardware specs from some of this school year’s most popular Chromebooks, laptops, tablets, and convertibles. We tested the devices in our lab using the XPRT benchmarks, and the Roundup provides not only performance scores but also photo galleries, PT-verified device specs, and prices. Parents, teachers, students, and administrators who are considering purchases for their education environments have many options, and the Roundup can help make their decisions easier by gathering product and performance facts in one convenient place. We’ll continue to add devices to the Roundup, including the new Microsoft Surface Go, as they arrive over the next few weeks.

The Back-to-School Roundup is just one of the features we offer through the XPRT Weekly Tech Spotlight. Every week, the Spotlight highlights a new device, making it easier for consumers to select a new laptop, phone, tablet, or PC. Recent devices in the Spotlight include the Acer Chromebook Tab 10, the 2018 Samsung Chromebook Plus, the Intel Hades Canyon NUC mini PC, and the OnePlus 6 phone. The Spotlight device comparison page lets you view side-by-side comparisons of all of the devices we’ve tested.

If you’re interested in having your devices featured in the XPRT Weekly Tech Spotlight or in this year’s Black Friday and Holiday Showcases, which we publish in late November, visit the website for more details.

If you have any ideas for the Spotlight page or suggestions for devices you’d like to see, let us know!

Justin

Worldwide influence

On a weekly basis, we track several XPRT-related metrics such as completed test runs and downloads, content views, and social media engagement. We also search for instances where one or more of the XPRTs has been mentioned in an advertisement, article, or review. Compiling all this information gives us some insight into how, where, and how much people are using the XPRT tools.

From time to time, we share some of that data with readers and community members so they can see the impact that the XPRTs are having around the world. One way we do this is through our “XPRTs around the world” infographic, which provides a great picture of the XPRTs’ reach.

Here are some key numbers from the latest update:

  • The XPRTs have been mentioned more than 11,400 times on over 3,600 unique sites.
  • Those mentions include more than 9,200 articles and reviews.
  • Those mentions originated in over 590 cities located in 65 countries on six continents. New cities of note include Phnom Penh, Cambodia; Osijek, Croatia; Kano, Nigeria; and Lahore, Pakistan.
  • The BenchmarkXPRT Development Community now includes 210 members from 74 companies and organizations around the world.


In addition to the growth in web mentions and community members, the XPRTs have now delivered more than 380,000 real-world results!

We’re grateful to everyone who’s helped us get this far! Every contribution to the community, no matter how small, helps us toward our goal of providing reliable, relevant, and easy-to-use benchmark tools.

Justin

Planning the next version of MobileXPRT

We’re in the early planning stages for the next version of MobileXPRT, and invite you to send us any suggestions you may have. What do you like or not like about MobileXPRT? What features would you like to see in a new version?

When we begin work on a new version of any XPRT, one of the first steps we take is to assess the benchmark’s workloads to determine whether they will provide value during the years ahead. This step almost always involves updating test content such as photos and videos to more contemporary file resolutions and sizes, and it can also involve removing workloads or adding completely new scenarios. MobileXPRT currently includes five performance scenarios (Apply Photo Effects, Create Photo Collages, Create Slideshow, Encrypt Personal Content, and Detect Faces to Organize Photos). Should we stick with these five or investigate other use cases? What do you think?

As we did with WebXPRT 3 and the upcoming HDXPRT 4, we’re also planning to update the MobileXPRT UI to improve the look of the benchmark and make it easier to use.

Crucially, we’ll also build the app using the most current Android Studio SDK. Android development has changed significantly since we released MobileXPRT 2015, and apps must now conform to stricter standards that require explicit user permission for many tasks. Navigating these changes shouldn’t be too difficult, but it’s always possible that we’ll encounter unforeseen challenges during the process.

Do you have suggestions for test scenarios that we should consider for MobileXPRT? Are there existing features we should remove? Are there elements of the UI that you find especially useful, or do you have ideas for improving them? Please let us know. We want to hear from you and make sure that MobileXPRT continues to meet your needs.

Justin

The Exploring WebXPRT 3 white paper is now available

Today, we published the Exploring WebXPRT 3 white paper. The paper describes the differences between WebXPRT 3 and WebXPRT 2015, including changes we made to the harness and the structure of the six performance test workloads. We also explain the benchmark’s scoring methodology, how to automate tests, and how to submit results for publication. In addition, readers will find detail about the third-party functions and libraries that WebXPRT uses during the HTML5 capability checks and performance workloads.
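To give a feel for what a scoring methodology involves, here is a minimal sketch of one common approach benchmarks use to combine workload results into an overall score: normalize each workload’s completion time against a calibration baseline, then take the geometric mean of the normalized results. The baseline times and the scale factor of 100 below are hypothetical, and this is not WebXPRT 3’s actual formula; the white paper and the results calculation spreadsheet document the real method.

```java
// Illustrative sketch only: a geometric-mean score over workload times
// normalized against hypothetical calibration-system baselines.
public class ScoreSketch {
    // Hypothetical baseline times (in ms) for three workloads,
    // measured on an imaginary calibration system.
    static final double[] BASELINE_MS = {900.0, 1200.0, 750.0};

    static double overallScore(double[] measuredMs) {
        // Sum logs instead of multiplying, to avoid overflow/underflow.
        double logSum = 0.0;
        for (int i = 0; i < measuredMs.length; i++) {
            // Finishing faster than the baseline yields a ratio above 1.0.
            double normalized = BASELINE_MS[i] / measuredMs[i];
            logSum += Math.log(normalized);
        }
        // Geometric mean, scaled so the calibration system itself scores 100.
        return 100.0 * Math.exp(logSum / measuredMs.length);
    }

    public static void main(String[] args) {
        // A device that runs every workload twice as fast as the baseline
        // scores about 200.
        double[] measured = {450.0, 600.0, 375.0};
        System.out.println(overallScore(measured));
    }
}
```

One nice property of the geometric mean is that no single workload can dominate the overall score, which is one reason many benchmarks favor it over a simple average.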

Because data collection and privacy concerns are more relevant than ever, we also discuss the WebXPRT data collection mechanisms and our commitment to respecting testers’ privacy. Finally, for readers who may be unfamiliar with the XPRTs, we describe the other benchmark tools in the XPRT family, the role of the BenchmarkXPRT Development Community, and how you can contribute to the XPRTs.

Along with the WebXPRT 3 results calculation white paper and spreadsheet, the Exploring WebXPRT 3 white paper is designed to promote the high level of transparency and disclosure that is a core value of the BenchmarkXPRT Development Community. Both WebXPRT white papers and the results calculation spreadsheet are available on WebXPRT.com and on our XPRT white papers page. If you have any questions about WebXPRT, please let us know, and be sure to check out our other XPRT white papers.

Justin
