Qualcomm Cloud AI 100 Aims To Upend Digital Intelligence

Qualcomm is looking to distill some of the magic of its Snapdragon processors into a cloud AI accelerator, and Microsoft and Facebook are already on board with the new Cloud AI 100. A custom chip designed to accelerate AI inference in the data center, the new accelerator promises to power applications from mixed reality, to gaming, to personalization.


While we've seen AI accelerators based on mobile chips before, the Qualcomm Cloud AI 100 doesn't tap into existing Snapdragon mobile chipsets. Instead it's an entirely new, purpose-built chip, Qualcomm says.

The result, Qualcomm claims, is peak AI performance in excess of 50x what you'd get from a Snapdragon 820. The chip is built on a 7nm process, and should be capable of more than 10x the performance of other AI inference solutions currently on the market, whether those are based on CPUs, GPUs, or FPGAs. Form factor is decided by the customer, with different cards designed to suit different data centers.

The Cloud AI 100 supports the Glow, ONNX, and XLA runtimes, along with frameworks like PyTorch, Caffe2, Keras, and TensorFlow. It'll come with a whole suite of compilers, debuggers, profilers, and more, making getting up to speed more straightforward. And, while it may not use Snapdragon chips explicitly, Qualcomm says it has drawn heavily on the power consumption, scale, and signal processing expertise it developed for its mobile chipsets to make the Cloud AI 100 more efficient.
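Qualcomm hasn't detailed its software stack beyond those names, but ONNX support implies the familiar workflow: train a model in your framework of choice, then hand a portable graph to the vendor's compiler. Here's a minimal sketch of that first step, assuming PyTorch and torchvision are installed; the model choice and file name are just illustrative examples, not anything specific to Qualcomm's tools:

```python
import torch
import torchvision

# Grab a stock image-classification model and put it in inference mode.
model = torchvision.models.resnet18(pretrained=True).eval()

# Trace the model with a dummy batch and write out a portable ONNX graph.
# A framework-agnostic accelerator toolchain, like the one Qualcomm
# describes, would then compile this file for its own silicon.
dummy_input = torch.randn(1, 3, 224, 224)
torch.onnx.export(model, dummy_input, "resnet18.onnx", opset_version=11)
```

From that point on, the heavy lifting moves to whatever compiler and profiler suite the accelerator vendor ships, which is exactly the part of the stack Qualcomm is promising to provide.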


Why is all that important? Mainly because the demands we'll be placing on artificial intelligence over the coming years will put serious pressure on the cloud-based systems currently prevalent. Numerous companies – all the big chip-makers among them – are looking to so-called edge computing to help address that.

Edge computing offloads some of the cloud's workload by doing at least some of the processing locally first. The obvious advantage is that the cloud then has less heavy lifting to do. However, there are also big potential advantages in latency and bandwidth.

Because you're not transferring huge quantities of raw data, throughput can be improved and networks left less stressed. IDC predicts that, by 2025, the collective sum of the world's data will reach 175 zettabytes, and a sizable portion of that will rely on the cloud. The latency involved in sending data to the cloud, having it processed there, and then receiving something meaningful in return can also be considerable. Doing the work at the edge instead can cut that delay significantly.
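To put the bandwidth point in perspective, here's a back-of-the-envelope sketch. The frame size, frame rate, and result size are all assumed numbers for illustration, not figures Qualcomm has published:

```python
# Rough upstream-bandwidth comparison: raw camera frames versus compact
# inference results. Every number here is an illustrative assumption.

FRAME_BYTES = 1920 * 1080 * 3   # one uncompressed 1080p RGB frame
FPS = 30                        # assumed camera frame rate
RESULT_BYTES = 1_000            # assumed size of labels/bounding boxes

raw_mbps = FRAME_BYTES * FPS * 8 / 1e6     # bits per second -> Mbps
edge_mbps = RESULT_BYTES * FPS * 8 / 1e6

print(f"raw video upstream: {raw_mbps:,.0f} Mbps")   # ~1,493 Mbps
print(f"edge results only:  {edge_mbps:.2f} Mbps")   # ~0.24 Mbps
```

Even granting that real video would be compressed before upload, running the model near the camera and shipping only the results shrinks what has to cross the network by orders of magnitude.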

That could pay dividends when it comes to things like personal assistants capable of natural language processing and translation, as well as advanced image search. It'll also be able to better scale with personalized content and recommendations, Qualcomm promises. Other commonly discussed edge computing applications include things like autonomous car processing, with lower-latency systems used to deliver faster image recognition and more. It won't just be extremes like cars that drive themselves, though: Qualcomm is talking about how edge AI will help power driver awareness monitoring, along with features like personalized driver settings that learn from individual user preferences.
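For the latency side of that argument, a quick worked example shows why shaving milliseconds matters to a moving car. The speed and round-trip times below are assumed figures for illustration:

```python
# How far a car travels while waiting on an answer, at an assumed 70 mph.
# Both latency figures below are illustrative assumptions.

speed_m_per_s = 70 * 1609.34 / 3600   # 70 mph is roughly 31.3 m/s

for label, latency_s in [("cloud round trip", 0.100),
                         ("local edge inference", 0.010)]:
    distance = speed_m_per_s * latency_s
    print(f"{label}: {distance:.1f} m traveled before the result arrives")
# Roughly 3.1 m at 100 ms versus 0.3 m at 10 ms.
```

Three meters is the difference between braking before an obstacle and braking into it, which is why latency-sensitive workloads keep coming up in every edge computing pitch.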


For consumers, obviously, the Cloud AI 100 isn't something they'll be looking to put inside their next PC or phone. However, the benefits that trickle down from it could be notable, depending on the services they use and subscribe to. We'll know more when Qualcomm begins sampling the new chip to customers, which is expected to take place in the second half of 2019.
