New Chip Improves Efficiency of AI-powered Edge Computing

A study of an in-memory computing chip that can run artificial intelligence applications was recently published in Nature. The new chip, developed in collaboration by engineers from Stanford University and the University of California, can significantly improve the computing efficiency of A.I.

Chips for artificial intelligence play an important role in people’s daily lives: devices such as drones, wearables, and internet-of-things devices are all equipped with A.I.-enabled chips. However, the A.I. capabilities of these devices are limited by battery power, so improving energy efficiency is essential. In current A.I. chips, data processing and data storage happen in two separate areas, the computing units and the memory units. Frequent data transfers between these units consume most of the energy used in processing, so reducing data movement is the key to cutting energy consumption.

Engineers at Stanford University have come up with a solution: a new type of resistive random-access memory (RRAM) chip. The chip performs artificial intelligence processing inside the memory itself, closing the gap between the computing unit and the memory unit. This “in-memory” chip, also called a neural resistive random-access memory chip, is about the size of a fingertip and can handle more tasks than current chips while running on limited battery power.
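To illustrate the idea, here is a minimal sketch (in Python/NumPy) of the general compute-in-memory principle: a layer’s weights are stored as cell conductances in a resistive crossbar, the input activations are applied as row voltages, and the currents that add up on each column are already the matrix-vector product, so the weights never have to travel to a separate processor. The differential weight encoding and the idealized linear device model below are illustrative assumptions, not the actual design of the chip described above.

```python
import numpy as np

# Illustrative sketch of compute-in-memory with a resistive crossbar.
# Each weight of a neural-network layer is stored as a cell conductance.
# Applying the input activations as row voltages makes every cell pass a
# current I = G * V (Ohm's law); the currents on each column add up
# (Kirchhoff's current law), so the column currents equal W @ x, computed
# where the weights live instead of after shuttling them to a processor.

rng = np.random.default_rng(0)

weights = rng.normal(size=(4, 8))   # layer weights: 4 outputs, 8 inputs
x = rng.normal(size=8)              # input activations (row voltages)

# Map signed weights onto non-negative conductances with a differential
# pair of cells per weight (a common scheme; an assumption in this sketch).
g_pos = np.clip(weights, 0, None)   # conductances for the positive part
g_neg = np.clip(-weights, 0, None)  # conductances for the negative part

# Column currents are analog sums of per-cell currents.
i_pos = g_pos @ x                   # current collected on "plus" columns
i_neg = g_neg @ x                   # current collected on "minus" columns
analog_result = i_pos - i_neg       # differential readout

digital_result = weights @ x        # conventional digital reference
assert np.allclose(analog_result, digital_result)
print(analog_result)
```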

While in-memory computing has been around for decades, this is the first time a wide range of A.I. applications has actually been demonstrated in hardware rather than in simulation alone. The new chip’s efficiency, accuracy, and ability to perform multiple tasks show that it can potentially address many problems, from agricultural irrigation, home health monitoring, and medical devices to issues related to climate change and food security. Nevertheless, it needs further development before it can be brought to market.

Aussie Brief News

Translator: NFSC News
Design & Editor: HBamboo (昆仑竹)
