Researchers at the University of California, Berkeley have created a new device that combines wearable biosensors with artificial intelligence (AI) software to recognise what hand gesture a person intends to make based on electrical signal patterns in the forearm. As per the research published on the UC Berkeley website, the device paves the way for "better prosthetic control and seamless interaction" with electronic devices. It essentially means that the technology could be used to carry out complex robotic medical procedures, or daily tasks such as typing or even gaming, without actually using your hands.
The researchers say the team succeeded in teaching the algorithm to recognise 21 individual hand gestures, including a thumbs-up, a fist, a flat hand, holding up individual fingers and counting numbers. The device is not ready for commercial use yet, though it could be available soon with a few tweaks, the researchers said. The research paper adds that there are other ways to improve human-computer interaction, such as cameras and computer vision, but the new device protects an individual's privacy because it stores all data locally. The engineers claim that not only does this speed up the computing time, but it also ensures that personal biological data remain private.
Imagine typing on a computer w/o a keyboard or driving a car w/o a wheel... Berkeley researchers have created a device that combines wearable biosensors w/AI to recognize what hand gesture a person intends to make based on electrical signal patterns. https://t.co/MspLZJdGdk — UC Berkeley (@UCBerkeley) December 21, 2020
"When Amazon or Apple creates their algorithms, they run a bunch of software in the cloud that creates the model, and then the model gets downloaded onto your device," said Jan Rabaey, Professor of Electrical Engineering at UC Berkeley and senior author of the paper. "In our approach, we implemented a process where the learning is done on the device itself. And it is extremely quick. You only have to do it one time, and it starts doing the job. But if you do it more times, it can get better. So, it is continuously learning, which is how humans do it," he noted.
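The on-device, incremental learning Rabaey describes can be illustrated with a simple nearest-prototype classifier. This is a hypothetical sketch, not the team's actual algorithm: one labelled example per gesture class is enough to start predicting, and every additional example refines the stored prototype directly on the device, with no cloud round-trip.

```python
# Hypothetical sketch: a classifier that learns on the device itself.
# One example per gesture is enough to start; more examples sharpen it.

class OnDevicePrototypeClassifier:
    def __init__(self, dim):
        self.dim = dim
        self.acc = {}    # per-gesture running sum of feature vectors
        self.count = {}  # number of examples seen per gesture

    def learn(self, label, features):
        # Incremental update: no retraining from scratch, no cloud.
        if label not in self.acc:
            self.acc[label] = [0.0] * self.dim
            self.count[label] = 0
        self.acc[label] = [a + f for a, f in zip(self.acc[label], features)]
        self.count[label] += 1

    def predict(self, features):
        # Score each gesture by similarity to its averaged prototype.
        def score(label):
            proto = [a / self.count[label] for a in self.acc[label]]
            return sum(p * f for p, f in zip(proto, features))
        return max(self.acc, key=score)

# One-shot learning from a single example per gesture
clf = OnDevicePrototypeClassifier(4)
clf.learn("fist", [1.0, 0.0, 0.0, 1.0])
clf.learn("flat_hand", [0.0, 1.0, 1.0, 0.0])
guess = clf.predict([0.9, 0.1, 0.0, 0.8])  # noisy reading nearest to "fist"
```

The feature vectors here are made-up placeholders; in the real device they would be derived from the 64-channel forearm readings.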
The team collaborated with Ana Arias, a Professor of Electrical Engineering at UC Berkeley, to create the hand gesture recognition system. Together, they designed a flexible armband that reads electrical signals at 64 different points on the forearm. These signals are then fed into an electronic chip programmed with an AI algorithm capable of associating the signal patterns in the forearm with specific hand gestures. Additionally, the device uses a type of advanced AI called a 'hyperdimensional computing algorithm' that is capable of updating itself with new information. "In gesture recognition, your signals are going to change over time, and that can affect the performance of your model. We were able to greatly improve the classification accuracy by updating the model on the device," said Ali Moin, who helped design the device as a doctoral student.
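In hyperdimensional computing, each gesture is represented by a very high-dimensional vector (a "hypervector"), and classification reduces to finding the stored class vector most similar to the query, which makes the model naturally robust to noisy signals. A minimal illustrative sketch of the idea, not the team's implementation:

```python
import random

D = 10000  # hypervector dimensionality, typical for HD computing

def rand_hv():
    # Random bipolar hypervector (+1/-1 entries) representing a gesture class
    return [random.choice((-1, 1)) for _ in range(D)]

def similarity(a, b):
    # Normalized dot product (cosine similarity for bipolar vectors)
    return sum(x * y for x, y in zip(a, b)) / D

# Store one prototype hypervector per gesture (toy stand-ins here)
prototypes = {g: rand_hv() for g in ("fist", "thumbs_up", "flat_hand")}

# Simulate a noisy sensor reading: flip 20% of the "fist" entries
query = [(-v if random.random() < 0.2 else v) for v in prototypes["fist"]]

# Classification = nearest prototype by similarity
best = max(prototypes, key=lambda g: similarity(prototypes[g], query))
```

Because random hypervectors of this size are nearly orthogonal, even a heavily corrupted query stays far closer to its own class prototype than to any other, which is why such models can tolerate the signal drift Moin describes.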
Lastly, the researchers claim that the device's uniqueness lies in integrating biosensing, signal processing and interpretation, and AI into a single system that is relatively small and flexible and has a low power budget.