This week, we published an updated MobileXPRT 3 build, version 3.116.0.4, on MobileXPRT.com and in the Google Play Store. The new build addresses an issue we recently discovered in which MobileXPRT 3 was crashing after installation on some Android 11 phones.
Permissions requirements and a new storage strategy called scoped storage were causing the problem. By default, scoped storage restricts an app’s storage access to app-specific directories and media, and prohibits general access to external or public directories. It also prevents third-party apps, such as email clients or file managers, from accessing MobileXPRT 3 results files. Overriding this default behavior requires an opt-in permissions prompt that MobileXPRT 3 did not include prior to this week’s release.
MobileXPRT 3.116.0.4 points all of the benchmark’s file references to its private directory and allows users to zip results files and attach them to results submission emails. Neither change affects the testing process or test scores. If you have any questions or comments about the new MobileXPRT 3 build, please let us know!
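For readers curious about what this change looks like at the code level, here’s a minimal Kotlin sketch of the general pattern: write results into the app-specific external directory that scoped storage allows, zip them, and hand the zip to an email app through a share intent. The file names, FileProvider authority, and helper function are our own illustrative assumptions, not MobileXPRT 3’s actual implementation.

```kotlin
import android.content.Context
import android.content.Intent
import androidx.core.content.FileProvider
import java.io.File
import java.io.FileInputStream
import java.io.FileOutputStream
import java.util.zip.ZipEntry
import java.util.zip.ZipOutputStream

// Hypothetical helper: zip a results file stored in the app-specific external
// directory (allowed under scoped storage) and hand it to an email app.
// Call from an Activity context.
fun shareResults(context: Context) {
    // App-specific external storage; no runtime storage permission needed on Android 11.
    val resultsDir = context.getExternalFilesDir(null) ?: context.filesDir
    val resultsFile = File(resultsDir, "results.json")   // illustrative file name
    val zipFile = File(resultsDir, "results.zip")

    // Compress the results file.
    ZipOutputStream(FileOutputStream(zipFile)).use { zip ->
        zip.putNextEntry(ZipEntry(resultsFile.name))
        FileInputStream(resultsFile).use { it.copyTo(zip) }
        zip.closeEntry()
    }

    // Other apps can't read the private directory directly, so expose the zip
    // through a FileProvider URI and grant temporary read access.
    val uri = FileProvider.getUriForFile(
        context, "${context.packageName}.fileprovider", zipFile)  // authority is an assumption
    val intent = Intent(Intent.ACTION_SEND).apply {
        type = "application/zip"
        putExtra(Intent.EXTRA_STREAM, uri)
        putExtra(Intent.EXTRA_SUBJECT, "MobileXPRT 3 results")
        addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)
    }
    context.startActivity(Intent.createChooser(intent, "Send results"))
}
```

This approach also requires a matching provider entry and file-paths resource in the app manifest, which we’ve omitted here for brevity.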
The biggest shopping days of the year are fast approaching, and if you’re researching phones, tablets, Chromebooks, or laptops in preparation for Black Friday and Cyber Monday sales, the XPRTs can help! One of the core functions of the XPRTs is to help cut through all the marketing noise by providing objective, reliable measures of a device’s performance. For example, instead of trying to guess whether a new Chromebook is fast enough to handle the demands of remote learning, you can use its CrXPRT and WebXPRT performance scores to see how it stacks up against the competition when handling everyday tasks.
A good place to start your search for scores is our XPRT results browser. The browser is the most efficient way to access the XPRT results database, which currently holds more than 2,600 test results from over 100 sources, including major tech review publications around the world, OEMs, and independent testers. It offers a wealth of current and historical performance data across all the XPRT benchmarks and hundreds of devices. You can read more about how to use the results browser here.
Also, if you’re considering a popular device, chances are good that someone has already published an XPRT score for that device in a recent tech review. The quickest way to find these reviews is by searching for “XPRT” within your favorite tech review site, or by entering the device name and XPRT name (e.g., “Apple iPad” and “WebXPRT”) in a search engine. Here are a few recent tech reviews that use one or more of the XPRTs to evaluate a popular device:
LaptopMag used WebXPRT in its Best student Chromebooks for back to school 2020 review. This article can still be helpful if you’ve discovered that your child’s existing Chromebook isn’t handling the demands of remote learning well.
The XPRTs can help consumers make better-informed and more confident tech purchases this holiday season, and we hope you’ll find the data you need on our site or in an XPRT-related tech review. If you have any questions about the XPRTs, XPRT scores, or the results database, please feel free to ask!
We’re currently formulating our 2021 development roadmap for the XPRTs. In addition to planning CloudXPRT and WebXPRT updates, we’re discussing the possibility of releasing HDXPRT 5 in 2021. It’s hard for me to believe, but it’s been about two and a half years since we started work on HDXPRT 4, and February 2021 will mark two years since the first HDXPRT 4 release. Windows PCs are more powerful than ever, so it’s a good time to talk about how we can enhance the benchmark’s ability to measure how well the latest systems handle real-world media technologies and applications.
When we plan a new version of an XPRT benchmark, one of our first steps is updating the benchmark’s workloads so that they will remain relevant in years to come. We almost always update application content, such as photos and videos, to contemporary file resolutions and sizes. For example, we added both higher-resolution photos and a 4K video conversion task in HDXPRT 4. Are there specific types of media files that you think would be especially relevant to high-performance media tasks over the next few years?
Next, we will assess the suitability of the real-world trial applications that the photo editing, music editing, and video conversion test scenarios use. Currently, these are Adobe Photoshop Elements, Audacity, CyberLink MediaEspresso, and HandBrake. Can you think of other applications that belong in a high-performance media processing benchmark?
In HDXPRT 4, we gave testers the option to target a system’s discrete graphics card during the video conversion workload. Has this proven useful in your testing? Do you have suggestions for new graphics-oriented workloads?
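To make the discrete-graphics option more concrete, here’s a hedged Kotlin sketch of the kind of switch a video conversion harness can make: run HandBrakeCLI with either a software x264 encoder or a hardware NVENC encoder. The paths, encoder choice, and quality setting are placeholders for illustration; this is not HDXPRT 4’s actual implementation.

```kotlin
import java.io.File

// Illustrative sketch only: convert a clip with HandBrakeCLI, optionally
// targeting a discrete NVIDIA GPU via the NVENC hardware encoder.
// Paths, encoder names, and quality value are assumptions, not HDXPRT settings.
fun convertVideo(input: File, output: File, useDiscreteGpu: Boolean): Int {
    val encoder = if (useDiscreteGpu) "nvenc_h264" else "x264"
    val process = ProcessBuilder(
        "HandBrakeCLI",
        "-i", input.absolutePath,
        "-o", output.absolutePath,
        "-e", encoder,        // software or hardware encoder
        "-q", "22"            // constant-quality setting
    ).inheritIO().start()
    return process.waitFor()  // exit code 0 indicates success
}
```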
We’ll also strive to make the UI more intuitive, to simplify installation, and to reduce the size of the installation package. What elements of the current UI do you find especially useful, and which do you think we could improve?
We welcome your answers to these questions and any additional suggestions or comments on HDXPRT 5. Send them our way!
It’s been nine months since we published a WebXPRT 3 browser performance comparison, so we decided to put the newest versions of popular browsers through their paces to see if the performance rankings have changed since our last round of tests.
We used the same laptop as last time: a Dell XPS 13 7390 with an Intel Core i3-10110U processor and 4 GB of RAM, running Windows 10 Home updated to version 1909 (18363.1139). We installed all current Windows updates and tested on a clean system image. After the update process completed, we turned off updates to prevent them from interfering with test runs. We ran WebXPRT 3 three times on each of five browsers: Brave, Google Chrome, Microsoft Edge, Mozilla Firefox, and Opera. The posted score for each browser is the median of the three test runs.
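For anyone who wants to reproduce that last step, the short Kotlin sketch below shows the kind of median calculation we mean: take the three WebXPRT scores for each browser and report the middle value. The browser names and scores in the example are placeholders, not our test results.

```kotlin
// Report the median of three benchmark runs per browser.
// The scores below are placeholders for illustration, not actual results.
fun medianOfThree(scores: List<Double>): Double = scores.sorted()[1]

fun main() {
    val runs = mapOf(
        "Browser A" to listOf(150.0, 153.0, 151.0),
        "Browser B" to listOf(148.0, 147.0, 149.0)
    )
    for ((browser, scores) in runs) {
        println("$browser median: ${medianOfThree(scores)}")
    }
}
```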
In our last round of tests, the four Chromium-based browsers (Brave, Chrome, Edge, and Opera) produced scores that were nearly identical. Only Mozilla Firefox produced a significantly different (and better) score. The parity of the Chromium-based browsers was not surprising, considering they have the same underlying foundation.
In this round of testing, the Chromium-based browsers again produced very close scores, although Brave’s performance lagged by about 4 percent. Firefox again separated itself from the pack with a higher score. With the exception of Chrome, which matched its score from last time, every browser’s score was slightly lower than before. There are many possible reasons for this, including increased overhead in the browsers or changes in Windows, and the respective slowdowns will probably be unnoticeable to most users during everyday tasks.
Do these results mean that Mozilla Firefox will provide you with a speedier web experience? As we noted in the last comparison, a device with a higher WebXPRT score will probably feel faster during daily use than one with a lower score. For comparisons on the same system, however, the answer depends in part on the types of things you do on the web, how the extensions you’ve installed affect performance, how frequently the browsers issue updates and incorporate new web technologies, and how accurately each browser’s default installation settings reflect how you would set up that browser for your daily workflow.
In addition, browser speed can increase or decrease significantly after an update, only to swing back in the other direction shortly thereafter. OS-specific optimizations can also affect performance, such as with Edge on Windows 10 and Chrome on Chrome OS. All of these variables are important to keep in mind when considering how browser performance comparison results translate to your everyday experience.
What are your thoughts on browser performance? Let us know!
Last month, we announced that we’re working on a new AIXPRT learning tool. Because we want tech journalists, OEM lab engineers, and everyone who is interested in AIXPRT to be able to find the answers they need in as little time as possible, we’re designing this tool to serve as an information hub for common AIXPRT topics and questions.
We’re still finalizing aspects of the tool’s content and design, so some details may change, but we can now share a sneak peek of the main landing page. In the screenshot below, you can see that the tool will feature four primary areas of content:
The FAQ section will provide quick answers to the questions we receive most from testers and the tech press.
The AIXPRT basics section will describe specific topics such as the benchmark’s toolkits, networks, workloads, and hardware and software requirements.
The testing and results section will cover the testing process, the metrics the benchmark produces, and how to publish results.
The AI/ML primer will provide brief, easy-to-understand definitions of key AI and ML terms and concepts for those who want to learn more about the subject.
We’re excited about the new AIXPRT learning tool, and we’ll share more information here in the blog as we get closer to a release date. If you have any questions about the tool, please let us know!
We recently received a tech support inquiry about problems with new MobileXPRT 3 installations on some Android 11 phones. The tester installed MobileXPRT 3 on a selection of phones running Android 11, and the app crashed immediately upon opening. We were able to reproduce the issue on multiple phones in our lab, and we currently know that the issue may occur on the Google Pixel 3, Google Pixel 4a 5G, Google Pixel 4 XL, Google Pixel 5, and the OnePlus 8T (running Android 11 with an OxygenOS skin).
MobileXPRT 3 continues to run without issues on Android 9 and 10 phones. When we updated an Android 10 phone with an existing MobileXPRT 3 installation to Android 11, we found that the benchmark ran successfully. This suggests a lack of fundamental incompatibilities between MobileXPRT 3 and current versions of Android 11. Because some of our lab techs experienced crashes immediately after the app asked for permissions, we think it’s possible that new permissions-setting requirements in Android 11 are causing the problem.
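If the new permissions behavior is indeed the culprit, one way to gather clues is to log the storage and permission state at launch. The Kotlin sketch below is a diagnostic idea only, not MobileXPRT 3 code; it checks whether the app is subject to scoped storage and whether the legacy external-storage write permission is granted.

```kotlin
import android.Manifest
import android.content.Context
import android.content.pm.PackageManager
import android.os.Build
import android.os.Environment
import android.util.Log
import androidx.core.content.ContextCompat

// Diagnostic sketch (not MobileXPRT 3 code): log storage-related state that
// differs between Android 10 and Android 11 and could explain a startup crash.
fun logStorageState(context: Context) {
    Log.i("StorageCheck", "SDK_INT=${Build.VERSION.SDK_INT}")
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
        // False means the app is subject to scoped storage.
        Log.i("StorageCheck", "legacyExternalStorage=${Environment.isExternalStorageLegacy()}")
    }
    val granted = ContextCompat.checkSelfPermission(
        context, Manifest.permission.WRITE_EXTERNAL_STORAGE
    ) == PackageManager.PERMISSION_GRANTED
    Log.i("StorageCheck", "WRITE_EXTERNAL_STORAGE granted=$granted")
}
```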
We’re currently working to isolate the problem and identify a course of action. We’ll share more information here in the blog as soon as possible. If you’ve encountered this problem in your testing, we apologize for the inconvenience, and we’re thankful for your patience as we work towards a solution.
If you have any information you’d like to share about running MobileXPRT 3 on Android 11, please let us know!