Qualcomm today announced the Cloud AI 100 accelerator, built from the ground up to help data center providers accelerate AI experiences. It provides a turn-key solution addressing the most important aspects of cloud AI inference, including low power consumption, scale, process node leadership, and signal processing expertise. In addition, Qualcomm is supporting developers with a full stack of tools and frameworks for each of its cloud-to-edge AI solutions. Growing the ecosystem around this distributed AI model will help enhance a myriad of potential experiences for the end user, including personal assistants for natural language processing and translation, advanced image search, and personalized content and recommendations.

The Qualcomm Cloud AI 100 offers:

- More than 10x performance per watt over the industry's most advanced AI inference solutions deployed today
- An all-new, highly efficient chip designed specifically for AI inference workloads
- A 7nm process node, bringing further performance and power advantages
- Support for industry-leading software stacks, including PyTorch, Glow, TensorFlow, Keras, and ONNX
- Power-efficient signal processing expertise across major areas: artificial intelligence, extended reality, camera, audio, video, and gestures

The Qualcomm Cloud AI 100 is expected to begin sampling to customers in the second half of 2019.




 