Microsoft has unveiled Brainwave, a hardware platform for real-time AI
Microsoft has unveiled a new deep learning hardware acceleration system named Brainwave, designed to serve requests at very high throughput with ultra-low latency.
At the Hot Chips conference, Microsoft researchers showed the new platform delivering 39.5 teraflops of computing power at 1 ms of latency, without any batching of operations, running on an Intel Stratix 10 FPGA (field-programmable gate array).
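As a rough sanity check (not a figure from the announcement itself), those two numbers together imply how much compute a single unbatched request can draw on within its latency budget. A minimal sketch of that arithmetic:

```python
# Back-of-the-envelope arithmetic from the reported Hot Chips figures
# (illustrative only; the 39.5 TFLOPS and 1 ms values come from the article).

TFLOPS = 39.5        # sustained throughput reported for the Stratix 10 FPGA
LATENCY_S = 1e-3     # 1 ms latency per unbatched request

# Floating-point operations available to one request inside its latency window.
flops_per_request = TFLOPS * 1e12 * LATENCY_S

print(f"{flops_per_request:.2e} FLOPs available per 1 ms request")
```

In other words, roughly 3.95 x 10^10 operations fit inside a single request's window, which is why the system can serve large models one request at a time instead of batching them.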
“We call it real-time AI because the idea here is that you send in a request, you want the answer back,” said Doug Burger, an engineer at Microsoft Research.
“If it’s a video stream, if it’s a conversation, if it’s looking for intruders, anomaly detection, all the things where you care about interaction and quick results, you want those in real time,” he added.
The new hardware platform lets developers build machine learning systems to their own requirements. Brainwave supports Microsoft’s own Cognitive Toolkit and also works with other AI frameworks such as Google’s TensorFlow. Moreover, Brainwave lets developers deploy machine learning models directly onto the FPGA chips, yielding higher performance than regular CPUs and GPUs offer.
The company said it plans to release the Brainwave system on its Azure cloud service so that customers can benefit from it directly.
The company said in a post,
“We are working to bring this powerful, real-time AI system to users in Azure, so that our customers can benefit from Project Brainwave directly, complementing the indirect access through our services such as Bing. In the near future, we’ll detail when our Azure customers will be able to run their most complex deep learning models at a record-setting performance. With the Project Brainwave system incorporated at scale and available to our customers, Microsoft Azure will have industry-leading capabilities for real-time AI.”