Silicon Valley-based Synaptics (NASDAQ: SYNA) has announced a collaboration with Google on Edge AI for the Internet of Things (IoT), aimed at defining the optimal implementation of multimodal processing for context-aware computing.
Under the deal, Google’s MLIR-compliant machine learning (ML) core will be integrated into the Synaptics Astra hardware suite, together with open-source software and tools.
A game changer for Synaptics?
What does it mean for the company? It means Synaptics hardware will be faster, as Edge AI is the paradigm of bringing computation and data storage as close to the end user as possible - as opposed to Cloud AI, where data is stored and processed in a more centralised server or data centre.
Think of it as decentralisation for AI devices, supporting faster and more seamless processing of vision, image, voice, sound, and other modalities for the rollout of the next wave of augmented reality tech.
For end users, Edge AI augments interactivity in applications such as wearables, appliances, entertainment, embedded hubs, monitoring, and control across consumer, automotive, enterprise, and industrial systems.
“We are on the brink of a transformative era in Edge AI devices, where innovation in hardware and software is unlocking context-aware computing experiences that redefine user engagement,” Synaptics Chief Product Officer Vikram Gupta said.
“Our partnership with Google reflects a shared vision to leverage open frameworks as a catalyst for disruption in the Edge IoT space.
“This collaboration underscores our commitment to delivering exceptional experiences while validating Synaptics’ silicon strategy and roadmap for next-generation device deployment.”
The company’s Astra portfolio is a suite of AI-native processors for wireless connectivity, video, graphics, audio, and other end-user hardware.
Synaptics shares surged 6% on the news.
