AnTuTu releases a benchmark to test the AI performance of smartphones

The past few years have seen more and more companies focus on AI (Artificial Intelligence) as a means to distinguish their products from the competition. Uses of AI range from understanding voice commands to recognizing scenes to executing direct commands, all with the aim of reducing the friction between a customer and a service. Because of its exploding popularity, AI is now widely misused as a buzzword, and it is about time some standard way of measuring this functionality was established.

AnTuTu, known for its popular benchmarking app, has taken it upon itself to provide a quantifiable standard for judging the AI performance of different platforms. To lay the foundation for this effort, AnTuTu has worked with chip manufacturers to release a benchmarking app called “AI Review” that focuses on measuring the AI performance of smartphones.

Download AnTuTu’s AI Review Benchmark

AnTuTu’s blog post for AI Review begins by pointing out the difficulties in measuring something as vast as Artificial Intelligence. Currently in the smartphone segment, there is no unified standard for AI, which has led to a situation wherein every chip manufacturer has their own understanding and implementation of AI. Qualcomm handles some AI operations through the Hexagon DSP; Huawei’s HiSilicon handles them through an independent NPU; Samsung and MediaTek also handle AI operations through dedicated chips, referred to as the NPU and APU respectively. The situation is further complicated by the synergy between hardware and software, which is crucial for effective AI performance. Each vendor provides their own SDK for AI — Qualcomm has SNPE, MediaTek has NeuroPilot, HiSilicon has HiAI, and so on.

AnTuTu’s AI Review benchmark is divided into two sub-categories: Image Classification and Object Recognition. The Image Classification test reviews test data comprising 200 images and is based on the Inception v3 neural network, while the Object Recognition test reviews a 600-frame video and is based on the MobileNet SSD neural network. These reference networks are then converted, through the SDK provided by each vendor, into the format supported by that manufacturer's hardware. If the chip does not support AI-related algorithms, the benchmark app falls back to TFLite, with results that AnTuTu itself warns may be unsatisfactory and unreliable.
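Conceptually, each sub-test is a timed inference loop over a fixed data set, tracking both throughput and accuracy. A minimal sketch of that loop might look like the following, where `classify` is a stand-in for the real inference call made through the vendor's SDK or TFLite (the placeholder data and classifier below are illustrative, not AnTuTu's actual test set):

```python
import time

def run_classification_test(classify, images, labels):
    """Time a model over a fixed image set and record its accuracy.

    classify: stand-in for the vendor-SDK (or TFLite) inference call;
        takes an image and returns a predicted label.
    """
    correct = 0
    start = time.perf_counter()
    for image, label in zip(images, labels):
        if classify(image) == label:
            correct += 1
    elapsed = time.perf_counter() - start
    return {
        "images_per_second": len(images) / elapsed,
        "accuracy": correct / len(images),
    }

# Illustrative run with a trivial stand-in classifier.
images = list(range(200))           # placeholder for the 200 test images
labels = [i % 10 for i in images]   # placeholder ground-truth labels
result = run_classification_test(lambda img: img % 10, images, labels)
```

Measuring both quantities in the same pass is what allows a benchmark to reward speed only when accuracy is maintained.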

The benchmark scoring is directly related to both speed and accuracy. If accuracy is traded off for speed, AnTuTu assigns penalties to the score. This discourages benchmark cheating that relies on returning quick but inaccurate results.
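AnTuTu has not published its exact formula, but a speed score with an accuracy penalty could be sketched roughly as follows. All weights and thresholds here are illustrative assumptions, not AnTuTu's real values:

```python
def ai_score(images_per_second, accuracy, reference_accuracy=0.78):
    """Hypothetical speed score with an accuracy penalty.

    images_per_second: measured inference throughput.
    accuracy: top-1 accuracy achieved on the test set (0..1).
    reference_accuracy: accuracy of the full-precision reference
        model (0.78 is a placeholder, not AnTuTu's value).
    """
    speed_score = images_per_second * 100  # arbitrary scaling factor
    # Penalize accuracy lost relative to the reference model,
    # so "fast but wrong" results cannot win.
    accuracy_shortfall = max(0.0, reference_accuracy - accuracy)
    # An 10-point accuracy loss zeroes the score entirely.
    penalty = 1.0 - min(1.0, accuracy_shortfall * 10)
    return speed_score * penalty
```

Under a scheme like this, doubling throughput while giving up several points of accuracy can still produce a lower score than a slower but accurate run, which is exactly the trade-off AnTuTu says it penalizes.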

AnTuTu has also included a few caveats about the use of its app. Platforms that use the same AI processor are unlikely to show large score gaps, as the benchmark does not test general performance but focuses specifically on AI performance. Samsung has not yet released its AI SDK, and HiSilicon is utilizing TFLite for certain functions, which means their scores will remain low until those situations improve. The base Android version of the device will also affect the score, as Google itself has been optimizing support for AI at the system level.

Even from AnTuTu’s own blog post, it is clear that measuring AI-based performance may not be possible by simply boiling it down to a number. There are a lot of variables involved in AI-based computation, which adds another layer of complexity to the already-complex interaction between different hardware and software solutions. A single numerical score from a benchmarking run cannot do justice to the nuances involved in the world of AI. So while you may look at your score and momentarily feel a measure of pride, know that we are still in the relatively early stages of AI, and even more so, of AI benchmarking.

If you’re looking to read more about AI, AI benchmarking, and the challenges involved, check out our interview with Qualcomm’s Travis Lanier, Gary Brotman, and Ziad Asghar.

Source: AnTuTu
