
Category: Principled Technologies

WebXPRT in PT reports

We don’t just make WebXPRT—we use it, too. If you normally come straight to BenchmarkXPRT.com or WebXPRT.com, you may not even realize that Principled Technologies (PT) does far more than manage and administer the BenchmarkXPRT Development Community. We’re also the tech world’s leading provider of hands-on testing and related fact-based marketing services. As part of that work, we’re frequent WebXPRT users.

We use the benchmark when we test devices such as Chromebooks, desktops, mobile workstations, and consumer laptops for our clients. (You can see a lot of that work and many of our clients on our public marketing portfolio page.) We run the benchmark for the same reasons that others do—it’s a reliable and easy-to-use tool for measuring how well devices handle web browsing and other web work.

We also sometimes use WebXPRT simply because our clients request it. They request it for the same reason the rest of us like and use it: it’s a great tool. Regardless of job title or description, most laptop and tablet users surf the web and access web-based applications every day. Because WebXPRT is a browser benchmark, higher scores indicate that a device is likely to deliver a better online experience.

Here are just a few of the recent PT reports that used WebXPRT:

  • In a project for Dell, we compared the performance of a Dell Latitude 7340 Ultralight to that of a 13-inch Apple MacBook Air (2022).
  • In a study for HP, we compared the performance of an HP ZBook Firefly G10, an HP ZBook Power G10, and an HP ZBook Fury G10.
  • Finally, in a set of comparisons for Lenovo, we evaluated the system performance and end-user experience of eight Lenovo ThinkBook, ThinkCentre, and ThinkPad systems along with their Apple counterparts.

All these projects, and many more, show how a variety of companies rely on PT—and on WebXPRT—to help buyers make informed decisions.

P.S. If we publish scores from a client-commissioned study in the WebXPRT 4 results viewer, we will list the source as “PT”, because we did the testing.

By Mark L. Van Name and Justin Greene

XPRTs in the datacenter

The XPRTs have been very successful on desktops, notebooks, tablets, and phones. People have run WebXPRT over 295,000 times. Along with benchmarks such as MobileXPRT, HDXPRT, and CrXPRT, it has become an important tool worldwide for evaluating device performance on a variety of consumer and business client platforms.

We’ve begun branching out with tests for edge devices with AIXPRT, our new artificial intelligence benchmark. While typical consumers won’t be able to run AIXPRT on their devices initially, we feel that it is important for the XPRTs to play an active role in a critical emerging market. (We’ll have some updates on the AIXPRT front in the next few weeks.)

Recently, both community members and others have asked about the possibility of the XPRTs moving into the datacenter. Folks face challenges in evaluating the performance and suitability to task of such datacenter mainstays as servers, storage, networking infrastructure, clusters, and converged solutions. These challenges include the lack of easy-to-run benchmarks, the complexity and cost of the equipment (multi-tier servers, large amounts of storage, and fast networks) necessary to run tests, and confusion about best testing practices.

PT has a lot of expertise in measuring datacenter performance, as the hundreds of datacenter-focused test reports on our website show. We see great potential in working with the BenchmarkXPRT Development Community to help in this area. It is very possible that, as with AIXPRT, our approach to datacenter benchmarks would differ from the approach we’ve taken with previous benchmarks. While we have ideas for useful benchmarks we might develop down the road, more immediate steps could include drafting white papers, developing testing guidelines, or working with vendors to set up a lab.

Right now, we’re trying to gauge the level of interest in having such tools and in helping us carry out these initiatives. What are the biggest challenges you face in datacenter-focused performance and suitability-to-task evaluations? Would you be willing to work with us in this area? We’d love to hear from you, and we will be reaching out to members of the community over the coming weeks.

As always, thanks for your help!

Bill

Celebrating one year of the XPRT Weekly Tech Spotlight

It’s been just over a year since we launched the XPRT Weekly Tech Spotlight by featuring our first device, the Google Pixel C. Spotlight has since become one of the most popular items at BenchmarkXPRT.com, and we thought now would be a good time to recap the past year, offer more insight into the choices we make behind the scenes, and look at what’s ahead for Spotlight.

The goal of Spotlight is to provide PT-verified specs and test results that can help consumers make smart buying decisions. We try to include a wide variety of device types, vendors, software platforms, and price points in our inventory. The devices also tend to fall into one of two main groups: popular new devices generating a lot of interest and devices that have unique form factors or unusual features.

To date, we’ve featured 56 devices: 16 phones, 11 laptops, 10 two-in-ones, 9 tablets, 4 consoles, 3 all-in-ones, and 3 small-form-factor PCs. The operating systems these devices run include Android, ChromeOS, iOS, macOS, OS X, Windows, and an array of vendor-specific OS variants and skins.

As much as possible, we test using out-of-the-box (OOB) configurations. We want to present test results that reflect what everyday users will experience on day one. Depending on the vendor, the OOB approach can mean that some devices arrive bogged down with bloatware while others are relatively clean. We don’t attempt to “fix” anything in those situations; we simply test each device “as is” when it arrives.

If devices arrive with outdated OS versions (as is often the case with Chromebooks), we update to current versions before testing, because that’s the best reflection of what everyday users will experience. In the past, that approach would have been more complicated with Windows systems, but Microsoft’s shift to “Windows as a service” means that most users now receive significant OS updates automatically by default.

The OOB approach also means that the WebXPRT scores we publish reflect the performance of each device’s default browser, even if it’s possible to install a faster browser. Our goal isn’t to perform a browser shootout on each device, but to give an accurate snapshot of OOB performance. For instance, last week’s Alienware Steam Machine entry included two WebXPRT scores, a 356 on the SteamOS browser app and a 441 on Iceweasel 38.8.0 (a Firefox variant used in the device’s Linux-based desktop mode). That’s a significant difference, but the main question for us was which browser was more likely to be used in an OOB scenario. With the Steam Machine, the answer was truly “either one.” Many users will use the browser app in the SteamOS environment and many will take the few steps needed to access the desktop environment. In that case, even though one browser was significantly faster than the other, choosing to omit one score in favor of the other would have excluded results from an equally likely OOB environment.

We’re always looking for ways to improve Spotlight. We recently began including more photos for each device, including ones that highlight important form-factor elements and unusual features. Moving forward, we plan to expand Spotlight’s offerings to include automatic score comparisons, additional system information, and improved graphical elements. Most importantly, we’d like to hear your thoughts about Spotlight. What devices and device types would you like to see? Are there specs that would be helpful to you? What can we do to improve Spotlight? Let us know!

Justin

XPRT Women Code-a-Thon: Make your voice heard and win a cash prize

DURHAM, NC – (Marketwired – March 01, 2016) – The BenchmarkXPRT Development Community and ChickTech are co-hosting the XPRT Women Code-a-Thon on March 12-13 in Seattle. The code-a-thon encourages Seattle software programmers to create small apps, or “workloads,” that mimic actions they take on their devices every day.

The top three participants or teams will receive cash prizes of up to $2,500, and all participants’ workloads will be considered for inclusion in future versions of the BenchmarkXPRT tools, or XPRTs. Any programmer familiar with Web development or Android development is encouraged to participate.

The XPRTs are apps that empower people all over the world to test how well devices handle everyday activities. They do this by running workloads that simulate common tasks – just like the workloads code-a-thon participants will be building.

“We want the XPRTs to reflect how people actually use their technology every day,” said Jennie Faries. Faries is one of the code-a-thon’s judges and a developer at Principled Technologies, which administers the BenchmarkXPRT Development Community. “By gaining the perspectives of this group of women, we’re making the tools stronger and more realistic. And when the tools we use to measure technology get better, the technology itself gets better too.”

All participants will receive a t-shirt and locally sourced breakfast and lunch on both days of the code-a-thon. The event will include time for networking and conclude with a talk from a special keynote speaker.

Add your voice to the tools that measure today’s hottest tech. Register today at facts.pt/XPRTcodeathon2016_registration, learn more at facts.pt/XPRTcodeathon2016, and get all the details at facts.pt/XPRTcodeathon2016_FAQ.

About ChickTech

ChickTech envisions a safe, inclusive, and innovative technology future that includes equal pay, participation, and treatment of women. It is dedicated to retaining women in the technology workforce and increasing the number of women and girls pursuing technology-based careers. For more information, please visit http://chicktech.org

About the BenchmarkXPRT Development Community

The BenchmarkXPRT Development Community is a forum where registered members can contribute to the process of creating and improving the XPRTs. For more information, please visit http://www.principledtechnologies.com/benchmarkxprt

About Principled Technologies, Inc.

Principled Technologies, Inc. is a leading provider of technology marketing and learning & development services. It administers the BenchmarkXPRT Development Community.

Principled Technologies, Inc. is located in Durham, North Carolina, in NC’s Research Triangle Park region. For more information, please visit www.PrincipledTechnologies.com.

Company Contact
Jennie Faries
Principled Technologies, Inc.
1007 Slater Road, Suite #300
Durham, NC 27703

BenchmarkXPRT Development Community releases TouchXPRT 2016, a benchmark for web-enabled devices

DURHAM, NC – (Marketwired – February 16, 2016) – The BenchmarkXPRT Development Community, administered by Principled Technologies (PT), is pleased to announce the release of the TouchXPRT 2016 benchmark. TouchXPRT 2016 is a free benchmark tool for evaluating the performance of Windows 10 and Windows 10 Mobile devices. TouchXPRT 2016 runs tests based on five everyday scenarios (Beautify Photos, Blend Photos, Convert Videos for Sharing, Create Music Podcast, and Create Slideshow from Photos) and produces a result for each scenario plus an overall score.
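The release doesn’t spell out how TouchXPRT combines the five scenario results into its overall score. Benchmark suites often use a geometric mean for this, so that no single fast or slow scenario dominates the aggregate. The sketch below illustrates that general approach with made-up numbers; the scenario values and the formula are assumptions for illustration, not TouchXPRT’s actual scoring method.

```python
from statistics import geometric_mean

# Hypothetical per-scenario results; TouchXPRT's real output units differ.
scenario_scores = {
    "Beautify Photos": 95.0,
    "Blend Photos": 110.0,
    "Convert Videos for Sharing": 102.0,
    "Create Music Podcast": 98.0,
    "Create Slideshow from Photos": 105.0,
}

def overall_score(scores):
    # A geometric mean keeps one outlier scenario from skewing the
    # overall number the way an arithmetic mean would.
    return geometric_mean(scores.values())

print(round(overall_score(scenario_scores), 1))  # → 101.9
```

With these sample numbers the geometric mean (101.9) lands slightly below the arithmetic mean (102.0); the gap widens as the per-scenario results spread further apart.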

To learn more about TouchXPRT 2016 and download the benchmark, please visit TouchXPRT.com. TouchXPRT 2016 is also available as a free Windows app in the Microsoft Store.

To learn more about and join the BenchmarkXPRT Development Community, go to www.BenchmarkXPRT.com.

About Principled Technologies, Inc.
Principled Technologies, Inc. is a leading provider of technology marketing and learning & development services.

Principled Technologies, Inc. is located in Durham, North Carolina, in NC’s Research Triangle Park region. For more information, please visit www.PrincipledTechnologies.com.

Company Contact
Eric Hale
Principled Technologies, Inc.
1007 Slater Road, Suite #300
Durham, NC 27703

We haven’t mentioned this in a while

I had a conversation with a community member yesterday who wanted to know whether we would test his device with one of the XPRTs. The short answer is “Absolutely!” The somewhat longer answer follows.

If you send us a device you want us to test, we will do so, with the appropriate set of XPRTs, free of charge. You will know that an impartial, third-party lab has tested your device using the best benchmarking practices. After we share the results with you, you will have three options: (1) to keep the results private, (2) to have us make the results public immediately in the appropriate XPRT results databases, or (3) to delay releasing the results until a future date. Regardless of your choice, we will keep the device so that we can use it as part of our testbed for developing and testing future versions of the XPRTs.

When we add the results to our online databases, we will cite Principled Technologies as the source, indicating that we stand behind the results.

The free testing includes no collateral beyond publishing the results. If you would like to publicize them through a report, an infographic, or any of the other materials PT can provide, just let us know, and the appropriate person will contact you to discuss how much those services would cost.

If you’re interested in getting your device tested for free, contact us at BenchmarkXPRTSupport@principledtechnologies.com.

Eric
