If you’ve ever piloted a drone, you’ll know that using a joystick-style controller takes some getting used to. MIT scientists have developed what they claim is a more intuitive control system, one that reads the operator’s muscle signals.
Known as Conduct-A-Bot, the experimental setup uses multiple electromyography (EMG) sensors, which detect the electrical activity of muscles, along with motion sensors. These are worn on the biceps, triceps and forearm of the user’s right arm. Working together, the sensors detect muscle activity and arm movement, relaying that data to a hard-wired microprocessor.
Machine learning algorithms then identify the different arm actions, each of which the system has been preprogrammed to convert into a specific command. Those commands are in turn wirelessly transmitted to a Parrot Bebop 2 quadcopter, which responds accordingly.
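The paper doesn't spell out its signal processing, but as a rough illustration of the kind of muscle-activity detection involved, here is a minimal sketch. The window size, threshold, and function names are hypothetical, not taken from the MIT system:

```python
import math

def emg_rms(window):
    """Root-mean-square amplitude of a window of EMG samples;
    a higher RMS indicates a stronger muscle contraction."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def detect_stiffening(window, threshold=0.5):
    """Flag a 'stiffen arm' gesture when EMG amplitude exceeds a
    (hypothetical) per-user calibration threshold."""
    return emg_rms(window) > threshold
```

In practice such thresholds would be calibrated per user, since baseline EMG amplitude varies from person to person.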
In the current setup, stiffening the upper arm stops the drone; clenching the fist moves it forward; rotating the fist clockwise or counterclockwise causes it to turn; and waving the hand up, down, left or right moves it vertically or horizontally. In recent tests in which the drone was flown through hoops, the Bebop responded correctly to 82 percent of more than 1,500 commands – a figure that should improve as the system is developed further.
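The gesture-to-command mapping described above can be sketched as a simple lookup table. The gesture labels and command names here are illustrative placeholders, not identifiers from the actual system:

```python
# Hypothetical mapping from classified gestures to drone commands,
# mirroring the Conduct-A-Bot controls described in the article.
GESTURE_COMMANDS = {
    "stiffen_upper_arm": "stop",
    "clench_fist": "move_forward",
    "rotate_fist_cw": "turn_right",
    "rotate_fist_ccw": "turn_left",
    "wave_up": "move_up",
    "wave_down": "move_down",
    "wave_left": "move_left",
    "wave_right": "move_right",
}

def gesture_to_command(gesture):
    """Translate a classified gesture label into a drone command,
    defaulting to hovering in place if the gesture is unrecognized."""
    return GESTURE_COMMANDS.get(gesture, "hover")
```

Defaulting to a hover on unrecognized input is one plausible fail-safe; a real controller would need to handle misclassifications (the reported 18 percent of commands) gracefully.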
Any model of drone could be used, and it is envisioned that the technology may ultimately be applied to tasks such as the control of assistive robots by elderly or physically challenged users.
“This system moves one step closer to letting us work seamlessly with robots so they can become more effective and intelligent tools for everyday tasks,” says graduate student Joseph DelPreto, lead author of a paper on the research. “As such collaborations continue to become more accessible and pervasive, the possibilities for synergistic benefit continue to deepen.”
Conduct-A-Bot is demonstrated in the video below.
Source: MIT CSAIL
Controlling Drone with Gestures