One in Three Smartphones Will Be AI-Capable in 2020

According to the latest research from Counterpoint’s Components Tracker Service, one in three smartphones to be shipped in 2020 will natively embed machine learning and artificial intelligence (AI) capabilities at the chipset level.

Apple, with its Bionic system on chip (SoC) proliferating across its complete portfolio over the next couple of years, will drive native AI adoption in smartphones. Its universal adoption of AI-capable SoCs will likely enable Apple to lead the AI-capable chip market through 2020.

Huawei, whose HiSilicon Kirin 970 SoC launched in September and powers the Huawei Mate 10 series unveiled today in Munich, is second to market after Apple with AI-capable smartphones. The Mate 10 can handle diverse computational tasks efficiently thanks to the neural processing unit at the heart of the Kirin 970 SoC.

However, Qualcomm will unlock AI capabilities in its high- to mid-tier SoCs within the next few months. It should be able to catch up and is expected to be second in the market in terms of volume by 2020, followed by Samsung and Huawei.

Machine learning and AI made little headway in smartphone applications until the second half of 2017, because the limited processing power of smartphone CPUs would have hindered the user experience. AI applications require huge amounts of data processing, even for small tasks.

Sending that information to, and receiving results from, cloud-based data centers is potentially difficult and time-consuming, and requires a solid connection, which is not always available. The answer is to put AI capability on board the device itself.

Counterpoint Research director Jeff Fieldhack noted: "The initial driver for the rapid adoption of AI in smartphones is the use of facial recognition technology by Apple in its recently launched iPhone X. Face recognition is computationally intensive and if other vendors are to follow Apple’s lead, they will need to have similar on-board AI capabilities to enable a smooth user experience."

Research director Peter Richardson said: "With advanced SoC-level AI capabilities, smartphones will be able to perform a variety of tasks such as natural language processing, including real-time translation, and helping users take better photos by intelligently identifying objects and adjusting camera settings accordingly. But this is just the start. Machine learning will make smartphones understand user behaviour in an unprecedented manner.

"Analysing user behaviour patterns, devices will be able to make decisions and perform tasks that will reduce physical interaction time between the user and the device. Virtual assistants will become smarter by analysing and learning user behaviour, thereby uniquely serving each user according to their needs. This could potentially help virtual assistants take the leap and become a main-stream medium of interaction between the user and device."
