Some folks prefer to get a grip on things to better understand concepts. Researchers have developed smart gloves for tactile learners that use haptic feedback and AI to teach users new skills, fast-track precision training and control robots remotely.
We’re all different, and that affects how we approach learning. Generally speaking, some people benefit most from observing or watching things, others take in more when information is reinforced by sound, and some absorb best by reading or by writing concepts out themselves. And then there are those who prefer a hands-on approach and learn by doing – or some combination of the above.
A team that includes researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has developed sensor-packed smart gloves to help kinesthetic – or tactile – learners better grasp new tasks or skills.
“Humans engage in a wide variety of tasks by constantly interacting with the world around them,” said Yiyue Luo, lead author of a paper on the project, a Ph.D. student in MIT’s Department of Electrical Engineering and Computer Science and a CSAIL affiliate.
“We don’t usually share these physical interactions with others. Instead, we often learn by observing their movements, like with piano-playing and dance routines. The main challenge in relaying tactile interactions is that everyone perceives haptic feedback differently. This roadblock inspired us to develop a machine-learning agent that learns to generate adaptive haptics for individuals’ gloves, introducing them to a more hands-on approach to learning optimal motion.”
The smart gloves are tailored to each user to ensure a perfect fit. A computer produces a cutout based on hand measurements, which acts as a guide for a digital embroidery machine to stitch sensors and haptic actuators (like those found in smartphones) directly into soft fabric. The gloves are then ready to use in 10 minutes.
The embedded technology in each glove captures the hand and finger actions of a wearer – such as a teacher, skilled surgeon or machine operator – and those actions are then reproduced as haptic prompts for the learner, thanks to an adaptive machine-learning agent trained on the haptic responses of 12 users. When fed new user data, the system can tailor its feedback for each individual in just 15 seconds.
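To make that adaptation step concrete, the sketch below shows one simple way recorded expert motion could be scaled into per-user haptic intensities. It is purely illustrative: the single-gain calibration, the function names and the data shapes are assumptions made for the example, not the team’s actual machine-learning agent.

```python
import numpy as np

# Hypothetical sketch: personalize haptic prompt intensity per user.
# Assumption: expert fingertip pressure is captured as a (timesteps x 5) array,
# one column per finger, roughly normalized to [0, 1].

def normalize_expert(pressures: np.ndarray) -> np.ndarray:
    """Clip and normalize the expert's captured fingertip pressures."""
    p = np.clip(pressures, 0.0, None)
    return p / (p.max() + 1e-8)

def calibrate_user_gain(perceived: np.ndarray, delivered: np.ndarray) -> float:
    """Least-squares gain so delivered vibration matches what the user reports feeling.
    A crude stand-in for the learned, per-user adaptation described in the article."""
    return float(np.dot(perceived, delivered) / (np.dot(delivered, delivered) + 1e-8))

def haptic_prompts(expert_pressures: np.ndarray, user_gain: float) -> np.ndarray:
    """Scale the expert's motion into actuator drive levels for this user's glove."""
    return np.clip(normalize_expert(expert_pressures) * user_gain, 0.0, 1.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    expert = rng.random((100, 5))        # fake capture: 100 timesteps, 5 fingers
    delivered = rng.random(20)           # calibration pulses sent to a new user
    perceived = 0.8 * delivered + 0.05   # what that user reported feeling
    gain = calibrate_user_gain(perceived, delivered)
    prompts = haptic_prompts(expert, gain)
    print(prompts.shape, round(gain, 2))
```

In the real system a learned model handles this personalization from about 15 seconds of new user data; a single gain is just the simplest possible stand-in.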
In one demonstration of the setup, an accomplished piano player was asked to play a simple tune while the smart gloves captured the sequence as the player’s fingers pressed down on a limited span of keys. This data was then loaded into a student’s own smart gloves, and the student followed the finger buzzes, pressing down on the appropriate keys to learn the melody. This ‘follow me’ method could help beginners quickly nail chords, scales, arpeggios and simple melodies, though full concertos might be a stretch.
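The ‘follow me’ replay itself is conceptually simple: timestamped key presses from the teacher become timed buzzes on the student’s fingers. Here is a minimal, hypothetical sketch of that scheduling step; the event fields, the lead time and the send_buzz() placeholder are assumptions for illustration, not the glove’s real interface.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical sketch of the 'follow me' piano replay: the teacher's captured
# key presses become timed buzz prompts on the student's glove.

@dataclass
class KeyPress:
    time_s: float   # when the teacher pressed the key
    finger: int     # 0 = thumb ... 4 = pinky
    key: int        # key index within the limited span used in the demo

def to_buzz_schedule(presses: List[KeyPress], lead_s: float = 0.2) -> List[Tuple[float, int]]:
    """Turn teacher key presses into (buzz_time, finger) prompts, buzzing
    slightly before each press so the student has time to react."""
    return [(max(p.time_s - lead_s, 0.0), p.finger) for p in presses]

def send_buzz(finger: int) -> None:
    # Placeholder for driving the vibrotactile actuator over the glove's actual interface.
    print(f"buzz finger {finger}")

if __name__ == "__main__":
    melody = [KeyPress(0.5, 1, 2), KeyPress(1.0, 2, 3), KeyPress(1.5, 3, 4)]
    for t, finger in to_buzz_schedule(melody):
        print(f"t={t:.2f}s ->", end=" ")
        send_buzz(finger)
```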
In further experiments, the team found that laptop gamers achieved their highest scores when following time-sensitive prompts delivered through the smart gloves, whether navigating a narrow, winding path in a rhythm game or collecting coins in a racing game while keeping their vehicle upright on the way to the finish line.
Cheating? Absolutely, but the technology could allow users to get a feel for VR worlds as well as offer “a more personalized and touch-based experience in virtual training courses used by surgeons, firefighters and pilots, where precision is paramount.” The smart gloves were also used to remotely train robotic systems to grasp like a human, such as teaching a robot arm how to pick up different kinds of bread without damaging them.
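As a rough illustration of the gentle-grasp idea, the snippet below maps averaged glove fingertip pressures to a capped gripper force command. The scaling factor, force ceiling and interface are invented for the example; the actual setup teaches the robot arm from the wearer’s demonstrations rather than from a hand-tuned rule like this.

```python
# Hypothetical sketch of glove-to-robot grasp transfer: the wearer's fingertip
# pressures are mapped to a gripper force command, capped so soft objects such
# as bread aren't crushed. All values and names here are assumptions.

MAX_SOFT_OBJECT_FORCE_N = 5.0   # illustrative ceiling for delicate items

def gripper_force_from_glove(fingertip_pressures, scale: float = 10.0) -> float:
    """Average the glove's fingertip pressure readings (0..1) and convert them
    to a clamped force command, in newtons, for the robot gripper."""
    avg = sum(fingertip_pressures) / len(fingertip_pressures)
    return min(avg * scale, MAX_SOFT_OBJECT_FORCE_N)

if __name__ == "__main__":
    readings = [0.3, 0.4, 0.35, 0.2, 0.25]   # fake sensor values, one per finger
    print(f"command {gripper_force_from_glove(readings):.2f} N to gripper")
```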
At the moment, the system can only handle fairly simple motions, but the team believes that an AI agent trained on more user data “could assist with more involved tasks, like manipulating clay or driving an airplane.” And with stronger haptic feedback, the technology could help guide less sensitive body parts like feet and hips.
A paper on the research has been published in the journal Nature Communications.
Source: MIT