
Tag Archives: benchmark

Adobe PSE 2020 and HDXPRT 4

HDXPRT 4, our benchmark for assessing Windows performance on real-world media tasks, runs tests that use real commercial applications such as Adobe Photoshop Elements (PSE) 2020. Last fall, we informed HDXPRT testers that Adobe had started requiring a user ID to download the free Adobe Photoshop Elements 2020 trial package. Previously, testers could download the trial without setting up an account.

Recently, Adobe made additional changes to the access path for the PSE 2020 installation package. The package is no longer available on the PSE downloads page, but users who previously purchased their copy or registered it with Adobe can access the package on another page. However, this approach does not work for users who want to temporarily use the trial version for HDXPRT 4 testing.

We have found a third-party location, ProDesignTools, that currently offers a free, straightforward PSE 2020 installation package download with no requirements for registration or transmission of personal information. In our testing so far, the installation package (PhotoshopElements_2020_LS30_win64_ESD.zip) has been functioning as expected, and HDXPRT 4 is running the PSE-based workloads without any issues.

Unfortunately, we cannot guarantee that ProDesignTools will continue to offer a free PSE 2020 installation package download, and we’re not aware of an alternative Adobe download path at this time. We apologize for the inconvenience!

Justin

XPRTs in the press

Each month, we send a newsletter to members of the BenchmarkXPRT Development Community. In the newsletter, we recap the latest updates from the XPRT world and provide a summary of the previous month’s XPRT-related activity, including uses or mentions of the XPRTs in the tech press. More people read the weekly XPRT blog than receive the monthly newsletter, so we realized that some blog readers may be unaware of the wide variety of tech outlets that regularly use or mention the XPRTs.

So for today’s blog, we want to give readers a sampling of the XPRT press usage we see on a weekly basis. Recent mentions include:

  • Tom’s Guide used HDXPRT 4 to compare the performance of the Geekom Mini IT8 and Dell OptiPlex 7090 Ultra small-form-factor PCs.
  • Intel used WebXPRT 4 test data in promotional material for their line of 12th Gen Intel Core (Alder Lake) processors. Hundreds of press outlets then republished the presentation.
  • AnandTech used WebXPRT 4 to evaluate the Cincoze DS-1300 Industrial PC.
  • ZDNet used CrXPRT 2 in a review titled “The best Chromebooks for students: Student-proof laptops.”
  • PCWorld used CrXPRT 2 to provide data for an article listing their top Chromebook recommendations.
  • TechPowerUp used WebXPRT 3 to compare the browser performance of Intel Core i9-12900KS processor-based systems with that of other Intel and AMD processor-based systems.
  • Other outlets that have published articles, ads, or reviews mentioning the XPRTs in the last few months include: Android Authority, ASUS, BenchLife, Gadgets 360, Good Gear Guide, Hardware.info, Hot Hardware, ITHardware (Poland), ITMedia (Japan), Itndaily (Russia), Mobile01.com (China), Notebookcheck, PCMag, ProClockers, Sohu.com (China), Tom’s Hardware, and Tweakers.

If you don’t currently receive the monthly BenchmarkXPRT newsletter, but would like to join the mailing list, please let us know! We will not publish or sell any of the contact information you provide, and will only send the monthly newsletter and occasional benchmark-related announcements such as patch notifications or new benchmark releases.

Justin

Chrome OS support for CrXPRT apps ends in June 2022

Last March, we discussed the Chrome OS team’s original announcement that they would be phasing out support for Chrome Apps altogether in June 2021, and would shift their focus to Chrome extensions and Progressive Web Apps. The Chrome OS team eventually extended support for existing Chrome Apps through June 2022, but as of this week, we see no indication that they will further extend support for Chrome Apps published with general developer accounts. If the end-of-life schedule for Chrome Apps does not change in the next few months, both CrXPRT 2 and CrXPRT 2015 will stop working on new versions of Chrome OS at some point in June.

To maintain CrXPRT functionality past June, we would need to rebuild the app completely—either as a Progressive Web App or in some other form. For this reason, we want to reassess our approach to Chrome OS testing, and investigate which features and technologies to include in a new Chrome OS benchmark. Our current goal is to gather feedback and conduct exploratory research over the next few months, and begin developing an all-new Chrome OS benchmark for publication by the end of the year.
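For readers wondering what a Progressive Web App rebuild involves at a technical level, the core requirements are a web app manifest and a registered service worker. The snippet below is a minimal, generic sketch of the service-worker half; the file name sw.js is a placeholder, and none of this reflects an actual CrXPRT design.

  // Minimal sketch: register a service worker so the app can be installed
  // and can cache assets for offline use. "sw.js" is a placeholder name.
  if ('serviceWorker' in navigator) {
    navigator.serviceWorker
      .register('/sw.js')
      .then((registration) => {
        console.log('Service worker registered with scope:', registration.scope);
      })
      .catch((error) => {
        console.error('Service worker registration failed:', error);
      });
  }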

While we will discuss ideas for this new Chrome OS benchmark in future blog posts, we welcome ideas from CrXPRT users now. What features or workloads would you like the new benchmark to retain? Would you like us to remove any components from the existing benchmark? Does the battery life test in its current form suit your needs? If you have any thoughts about these questions or any other aspects of Chrome OS benchmarking, please let us know!

Justin

WebXPRT 4 is live!

We’re excited to announce that WebXPRT 4 is now available! Testers can access the benchmark at WebXPRT.com. If you’ve already been using the WebXPRT 4 Preview, your Preview test results will be comparable with results from the current official build.

Longtime WebXPRT users will notice that WebXPRT 4 has a new, but familiar, UI. The general process for kicking off both manual and automated tests is the same as with WebXPRT 3, so the transition from WebXPRT 3 to WebXPRT 4 testing should be straightforward. We will continue to make WebXPRT 3 available for legacy testing.
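As an example of what automated testing can look like in practice, the sketch below drives a WebXPRT run with Selenium. It is a generic illustration only: the element selectors and timeouts are placeholder assumptions, not WebXPRT’s documented automation interface, so please consult the WebXPRT user manual for the supported automation options.

  // run-webxprt.ts — generic Selenium sketch for starting an automated run.
  // The CSS selectors (#start-test, #overall-score) are placeholders.
  import { Builder, By, until } from 'selenium-webdriver';

  async function runWebXPRT(): Promise<void> {
    const driver = await new Builder().forBrowser('chrome').build();
    try {
      await driver.get('https://www.webxprt.com/');
      // Click the start control (placeholder selector).
      const start = await driver.wait(until.elementLocated(By.css('#start-test')), 30000);
      await start.click();
      // Wait up to 30 minutes for a results element to appear, then log it.
      const score = await driver.wait(until.elementLocated(By.css('#overall-score')), 1800000);
      console.log('Overall score:', await score.getText());
    } finally {
      await driver.quit();
    }
  }

  runWebXPRT().catch(console.error);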

If you missed earlier XPRT blog posts about WebXPRT 4, here is a quick overview of the differences between WebXPRT 3 and WebXPRT 4:

General changes

  • We’ve updated the aesthetics of the WebXPRT UI to make WebXPRT 4 visually distinct from older versions. We did not significantly change the flow of the UI.
  • We’ve updated content in some of the workloads to reflect changes in everyday technology, such as upgrading most of the photos in the photo processing workloads to higher resolutions.
  • We’ve updated the base calibration system for score calculations, and adjusted the scoring scale. WebXPRT 4 scores should not be compared to scores from previous versions of WebXPRT.
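As background for readers unfamiliar with calibration-based scoring, the general approach is to normalize each workload’s completion time on the test device against the time the calibration system produces, then combine the normalized subscores into an overall score. The formula below is only a generic sketch of that approach, with an assumed scaling constant and geometric mean; it is not the official WebXPRT 4 score definition.

  % Generic calibration-relative scoring sketch (not the official WebXPRT 4 formula)
  \mathrm{subscore}_i = K \cdot \frac{t_i^{\,\mathrm{calibration}}}{t_i^{\,\mathrm{device}}},
  \qquad
  \mathrm{overall\ score} = \left(\prod_{i=1}^{n} \mathrm{subscore}_i\right)^{1/n}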

Workload changes

  • Photo Enhancement. We increased the efficiency of the workload’s Canvas object creation function, and replaced the existing photos with new, higher-resolution photos.
  • Organize Album Using AI. We replaced ConvNetJS with WebAssembly (WASM)-based OpenCV.js for both the face detection and image classification tasks, and the image classification tasks now use images from the ImageNet dataset.
  • Stock Option Pricing. We updated the dygraph.js library.
  • Sales Graphs. We made no changes to this workload.
  • Encrypt Notes and OCR Scan. We replaced ASM.js with WASM for the Notes task and updated the WASM-based Tesseract version for the OCR task.
  • Online Homework. In addition to the existing scenario, which uses four Web Workers, we have added a scenario that uses two Web Workers. The workload now covers a wider range of Web Worker performance, and we calculate the score by using the combined run time of both scenarios. We also updated the typo.js library.
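To give a sense of what a Web Worker scenario involves, here is a generic sketch that splits a task batch across a configurable number of workers and times the run. The worker script name and message format are placeholders for illustration, not the workload’s actual code.

  // worker-timing.ts — generic sketch: split a batch across N Web Workers
  // and measure the elapsed time. "spell-worker.js" is a placeholder script.
  function runWithWorkers(words: string[], workerCount: number): Promise<number> {
    return new Promise((resolve) => {
      const start = performance.now();
      const chunkSize = Math.ceil(words.length / workerCount);
      let finished = 0;
      for (let i = 0; i < workerCount; i++) {
        const worker = new Worker('spell-worker.js');
        worker.onmessage = () => {
          worker.terminate();
          if (++finished === workerCount) {
            resolve(performance.now() - start);  // elapsed milliseconds
          }
        };
        worker.postMessage(words.slice(i * chunkSize, (i + 1) * chunkSize));
      }
    });
  }

  // Time the same batch with four workers and then with two, and combine the
  // run times, mirroring the way the post describes scoring the updated workload.
  async function measureBothScenarios(words: string[]): Promise<number> {
    return (await runWithWorkers(words, 4)) + (await runWithWorkers(words, 2));
  }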

We’re thankful for all of the feedback we received during the WebXPRT 4 development process, and we look forward to seeing your WebXPRT 4 results!

Justin

Updated system configuration recommendations for CrXPRT 2 battery life tests

Recently, we heard from a BenchmarkXPRT Development Community member who was testing Chromebooks in their lab. On a few of the Chromebooks, they saw sporadic CrXPRT 2 battery life test failures where CrXPRT 2 would successfully complete a battery life test and produce a result for the initial run, but then fail at the end of later runs.

After a considerable amount of troubleshooting, they determined that the issue seemed to be related to the way some systems automatically shut down before the battery is completely exhausted, and the way some systems will automatically boot up once the tester plugs in the power adapter for charging. This member found that when they added a few system configuration steps before battery life tests and made slight changes to their post-test routine, the systems that had previously experienced consistent failures would successfully complete battery life tests and produce results.

The added steps are quick and straightforward, and we decided to add them to the Configuring the test device and Running the tests sections of the CrXPRT 2 user manual. We hope this updated guidance will help to prevent future frustration for CrXPRT 2 testers.

If you have any questions or comments about the CrXPRT 2 battery life test, please feel free to contact us!

Justin

Why we don’t control screen brightness during CrXPRT 2 battery life tests

Recently, we had a discussion with a community member about why we no longer recommend specific screen brightness settings during CrXPRT 2 battery life tests. In the CrXPRT 2015 user manual, we recommended setting the test system’s screen brightness to 200 nits. Because the amount of power that a system directs to screen brightness can have a significant impact on battery life, we believed that pegging screen brightness to a common standard for all test systems would yield apples-to-apples comparisons.

After extensive experience with CrXPRT 2015 testing, we decided not to recommend a standard screen brightness for CrXPRT 2, for the following reasons:

  • A significant number of Chromebooks cannot produce a screen brightness of 200 nits. A few higher-end models can do so, but they are not representative of most Chromebooks. Some Chromebooks, especially those that many school districts and corporations purchase in bulk, cannot produce a brightness of even 100 nits.
  • Because of the point above, adjusting screen brightness would not represent real-life conditions for most Chromebooks, and the battery life results could mislead consumers who want to know the battery life they can expect with default out-of-box settings.
  • Most testers, and even some labs, do not have light meters, and the simple brightness percentages that the operating system reports produce different degrees of brightness on different systems. For testers without light meters, a standardized screen brightness recommendation could discourage them from running the test.
  • The brightness controls for some low-end Chromebooks lack the fine-tuning capability that is necessary to standardize brightness between systems. In those cases, an increase or decrease of one notch can swing brightness by 20 to 30 nits in either direction. This could also discourage testing by leading people to believe that they lack the capability to correctly run the test.

In situations where testers want to compare battery life using standardized screen brightness, we recommend using light meters to set the brightness levels as closely as possible. If the brightness levels between systems vary by more than a few nits, and if the levels differ significantly from out-of-box settings, any published battery life results should include a full disclosure and explanation of the test conditions.

For the majority of testers without light meters, running the CrXPRT 2 battery life test with default screen brightness settings on each system provides a reliable, accurate estimate of the real-world, out-of-box battery life consumers can expect.

If you have any questions or comments about the CrXPRT 2 battery life test, please feel free to contact us!

Justin
